How to Build LLM Agents with Magentic

Posted February 11, 2025 at 1:58 pm

Igor Radovanovic
AlgoTrading101

The post “How to Build LLM Agents with Magentic” first appeared on AlgoTrading101 blog.

The author of this article is not affiliated with Interactive Brokers. This software is in no way affiliated, endorsed, or approved by Interactive Brokers or any of its affiliates. It comes with absolutely no warranty and should not be used in actual trading unless the user can read and understand the source. The IBKR API team does not support this software.

Excerpt

What is Magentic?

Magentic is an open-source framework that allows for the seamless integration of Large Language Models (LLMs) into Python code.

Website: Magentic

GitHub Repository: jackmpcollins/magentic: Seamlessly integrate LLMs as Python functions

Why should I use Magentic?

  • Magentic is easy to use.
  • It is free and open source.
  • It is actively maintained.
  • It plays well with Pydantic to produce structured outputs.
  • It supports vision, streaming, parallel function calling, and more.

Why shouldn’t I use Magentic?

  • Magentic is mainly maintained by just one person.
  • It isn't the only LLM framework available.
  • It doesn't have a big community around it.

What are some Magentic alternatives?

Magentic alternatives are the following:

Which LLMs does Magentic support?

Magentic supports these LLMs:

Getting started

To get started with Magentic, ensure that you have installed Python and obtained an API key for one of the supported LLM providers mentioned above. I will personally go with OpenAI's GPT-4 as the LLM of choice.

The next step will be to install Magentic into a fresh environment with the following command:

pip install magentic

Now, we will create a .env file in which we will add the needed environment variables:

touch .env

Inside the .env file, set the backend, API key, and model:

MAGENTIC_BACKEND=openai
MAGENTIC_OPENAI_API_KEY=sk-...
MAGENTIC_OPENAI_MODEL=gpt-4
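
Magentic reads these settings from the process environment, so the .env file has to be loaded before the library is used. One common way to do that, assuming the python-dotenv package is installed (it is not part of the original article), is a quick sketch like this:

# Minimal sketch: load .env into the environment before using magentic.
# Assumes `pip install python-dotenv`; magentic itself only looks at
# environment variables such as MAGENTIC_OPENAI_API_KEY.
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current working directory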

In the following headers, we will go over the basic building blocks of Magentic and create a custom agentic RAG pipeline that will use financial data and function calling to provide outputs to the user query.

Let’s begin.

How to use the Magentic @prompt decorator?

To use the Magentic @prompt decorator, you need to define a template for an LLM prompt as a Python function. When this function is called, the arguments are inserted into the template, and the prompt is sent to an LLM, which generates the function output.

from magentic import prompt

@prompt("Explain like I'm five this financial concept: {concept}")
def explain(concept: str) -> str: ...

explain("Subprime mortgage crisis")

“The Subprime Mortgage Crisis is like a big, grown-up version of a game of hot potato. Here’s how it works:

Let’s say you want to buy an ice cream cone, but you don’t have any money. So, a nice person (in this case, a bank) is willing to lend you money to buy your ice cream. But, they know you don’t have any money to pay them back right away, which makes lending the money to you pretty risky. This would be a subprime mortgage, or loan.

Now, imagine that the person lending you money to buy your ice cream gets worried because they know you may not be able to pay them back. To get rid of this problem, they decide to sell your loan to another person (an investor). This means now you owe the money to this new person, not the one who lent you the money in the first place.

This makes the first person feel safe because they got their money back, but now the new person is the one who may lose money if you can’t pay them back.

Then, imagine this happening with not just one ice cream cone, but millions of them. Loads of people can’t pay back their ice cream loans, so the new owners of these loans lose a lot of money. This big problem is like the Subprime Mortgage Crisis.

Importantly, it’s not really about ice cream, but about bigger expensive things like houses. When too many people couldn’t pay back their loans, the banks and investors that owned those loans lost a lot of money, and this caused huge problems for the whole world’s money system.”

Notice how the @prompt decorator fills the curly-brace placeholder in the template with the function argument. Moreover, the function doesn't need a body, as everything is handled by the decorator.

The @prompt decorator will respect the return type annotation of the decorated function. This can be any type supported by Pydantic, including a Pydantic model.

from magentic import prompt
from pydantic import BaseModel


class Portfolio(BaseModel):
    equity_etf_pct: float
    bond_etf_pct: float
    crypto_etf_pct: float
    commodities_pct: float
    reasoning: str


@prompt("Create a strong portfolio of {size} allocation size.")
def create_portfolio(size: str) -> Portfolio: ...

portfolio = create_portfolio("$50,000")
print(portfolio)

equity_etf_pct=50.0
bond_etf_pct=30.0
crypto_etf_pct=10.0
commodities_pct=10.0

reasoning='A balanced strong portfolio suitable for most risk tolerances would allocate around 50% towards Equity ETFs for growth, 30% towards Bond ETFs for income and stability, 10% towards Crypto ETFs for high-growth and high-risk appetite and 10% towards commodities for a balanced protection against inflation. The allocation size is $50,000.'
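
Because create_portfolio returns a Pydantic model instance, the result can be used like any other Python object rather than parsed out of raw text. A minimal sketch of working with it (field names follow the Portfolio model above; model_dump assumes Pydantic v2):

# Access the structured fields directly on the returned model.
print(portfolio.equity_etf_pct)   # 50.0
print(portfolio.reasoning)

# Or serialize the whole allocation, e.g. for logging or persistence.
print(portfolio.model_dump())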

How to use the Magentic @chatprompt decorator?

To use the Magentic @chatprompt decorator, you pass chat messages as a template rather than a single text prompt. We can also provide a system message or few-shot example responses to guide the model's output.

from magentic import chatprompt, AssistantMessage, SystemMessage, UserMessage
from pydantic import BaseModel


class Quote(BaseModel):
    quote: str
    person: str


@chatprompt(
    SystemMessage("You are an avid reader of financial literature."),
    UserMessage("What is your favorite quote from Warren Buffet?"),
    AssistantMessage(
        Quote(
            quote="Price is what you pay; value is what you get.",
            person="Warren Buffet",
        )
    ),
    UserMessage("What is your favorite quote from {person}?"),
)
def get_finance_quote(person: str) -> Quote: ...


get_finance_quote("Charlie Munger")

In my whole life, I have known no wise people (over a broad subject matter area) who didn’t read all the time – none, zero.
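
Note that get_finance_quote is annotated to return a Quote, so the response above arrives as a structured Pydantic object rather than plain text. A minimal usage sketch:

# The decorated function returns a Quote instance with the fields defined above.
quote = get_finance_quote("Charlie Munger")
print(quote.person)  # Charlie Munger
print(quote.quote)   # the quote shown above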

How to do LLM function calling with Magentic?

To do LLM function calling with Magentic, you will use a @prompt-decorated function that returns a FunctionCall object, which can then be called to execute the function with the arguments provided by the LLM. For example, let's ask for price data and have it use Alpha Vantage:

import os
import requests
from magentic import prompt, FunctionCall

AV_API_KEY = os.getenv("AV_API_KEY")

def get_daily_price(ticker: str, api_key: str = AV_API_KEY) -> dict:
    url = f'https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol={ticker}&apikey={api_key}'
    r = requests.get(url)
    data = r.json()
    return data['Time Series (Daily)']


@prompt(
    "Use the appropriate search function to answer: {question}",
    functions=[get_daily_price],
)
def perform_search(question: str) -> FunctionCall[dict]: ...


output = perform_search("What is the daily price data of AAPL?")
output()

>>> {'2025-01-27': {'1. open': '224.1200',
  '2. high': '232.1500',
  '3. low': '224.0000',
  '4. close': '229.8600',
  '5. volume': '94224324'}, ...

Take note that Alpha Vantage has a free tier, so grabbing an API key shouldn't be an issue if you are following along. Alternatively, you can use any other preferred data provider.
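
Since functions accepts plain Python callables, swapping data providers is only a matter of writing another function. A hedged sketch using yfinance (not part of the original article; it assumes pip install yfinance and reuses the @prompt and FunctionCall imports from above):

import yfinance as yf  # assumption: yfinance is installed; not used in the original article


def get_daily_price_yf(ticker: str) -> dict:
    # Return the last few daily OHLCV rows for a ticker from Yahoo Finance.
    history = yf.Ticker(ticker).history(period="5d")
    return history.to_dict(orient="index")


@prompt(
    "Use the appropriate search function to answer: {question}",
    functions=[get_daily_price_yf],
)
def perform_search_yf(question: str) -> FunctionCall[dict]: ...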

How to use Magentic prompt chains?

You will want to use Magentic prompt chains when the LLM needs to perform several operations before returning the final response. The @prompt_chain decorator will resolve FunctionCall objects automatically and pass the output back to the LLM to continue until the final answer is reached.

import csv
from magentic import prompt_chain


def get_earnings_calendar(ticker: str, api_key: str = AV_API_KEY) -> list:
    url = f"https://www.alphavantage.co/query?function=EARNINGS_CALENDAR&symbol={ticker}&horizon=12month&apikey={api_key}"
    with requests.Session() as s:
        download = s.get(url)
        decoded_content = download.content.decode('utf-8')
        cr = csv.reader(decoded_content.splitlines(), delimiter=',')
        my_list = list(cr)
    return my_list


@prompt_chain(
    "What's {ticker} expected earnings dates for the next 12 months?",
    functions=[get_earnings_calendar],
)
def get_earnings(ticker: str) -> str: ...


get_earnings("IBM")

'The expected earnings dates for IBM for the fiscal year 2025 are as follows:

– On April 22, 2025 for fiscal date ending March 31, 2025

– On July 22, 2025 for fiscal date ending June 30, 2025

Please note that these dates are for fiscal year 2025 and the exact figures for the earnings are not available yet. These dates are expected and may change. The currency for these earnings is in USD.'

Prompt chains are the “bread and butter” of building agentic workflows, as they allow us to create a loop where the LLM can perform multiple API calls until it collects all the information it needs to provide an answer.

With some good prompting techniques and a good pipeline, one can perform many complex tasks.
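
For instance, the two Alpha Vantage helpers defined earlier could be handed to a single prompt chain, letting the model decide which calls it needs before answering. A minimal sketch (not from the original article; it assumes get_daily_price and get_earnings_calendar are still in scope):

from magentic import prompt_chain


@prompt_chain(
    "Answer using the available data functions: {question}",
    functions=[get_daily_price, get_earnings_calendar],
)
def answer_market_question(question: str) -> str: ...


print(answer_market_question("When does IBM report earnings next, and how has AAPL been trading lately?"))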

Visit the AlgoTrading101 blog to read the full article and for details on streaming LLM responses with Magentic.


Disclosure: Interactive Brokers Third Party

Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from AlgoTrading101 and is being posted with its permission. The views expressed in this material are solely those of the author and/or AlgoTrading101 and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.

Disclosure: ETFs

Any discussion or mention of an ETF is not to be construed as recommendation, promotion or solicitation. All investors should review and consider associated investment risks, charges and expenses of the investment company or fund prior to investing. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.

Disclosure: Digital Assets

Trading in digital assets, including cryptocurrencies, is especially risky and is only for individuals with a high risk tolerance and the financial ability to sustain losses. Eligibility to trade in digital asset products may vary based on jurisdiction.
