AI Agents with LangGraph

Nov 12 2024 · Python 3.12, LangGraph 0.2.x, JupyterLab 4.2.4

Lesson 01: Introduction to AI Agents & Function Calling

AI Agent Translation Demo

Transcript

Open the Starter project for Lesson 1. This lesson uses the python-dotenv library to load the OpenAI API key. Make sure it’s installed by running:

pip install python-dotenv

You need a .env file in the root of your project that contains your OpenAI API key, like this:

OPENAI_API_KEY=<your-api-key>

Replace what’s on the right of the equals sign with your actual API key. Then import load_dotenv from the dotenv package in your JupyterLab notebook and load your API key like so:

from dotenv import load_dotenv
load_dotenv()

Run that, and you’ll see True as the output. Set up your OpenAI LLM next. This lesson assumes you’ve already installed the Python OpenAI library.

import os
from openai import OpenAI

llm = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

Next, you’ll define the function that your AI Agent will call. Since you’re making a translation agent, call your function translate:

def translate(text):

There are many different ways you could implement a function like this. Using the Google Translate API would be a good choice. However, it’s quite complicated to set up, so today you’ll use some hard-coded values:

if text == "hello":
  return "hola"
elif text == "goodbye":
  return "adiós"
else:
  return "UNKNOWN"

If the text is hello, you’ll return hola. If the text is goodbye, you’ll return adiós. And if it’s neither of those, you’ll return UNKNOWN. You’re assuming the user wants to translate everything into Spanish.
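If you later want real translations instead of hard-coded values, here’s a minimal sketch using the google-cloud-translate library. It assumes you’ve installed that package and configured Google Cloud credentials, which isn’t part of this lesson’s starter project:

from google.cloud import translate_v2

client = translate_v2.Client()

def translate(text):
  # Ask the Cloud Translation API for a Spanish translation.
  # The result is a dict with the translation under "translatedText".
  result = client.translate(text, target_language="es")
  return result["translatedText"]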

Now it’s time to write your AI Agent. In this lesson, you’ll simply put it in a function:

def ai_agent(user_input):

Then, write a prompt for the LLM:

prompt = """Analyze if the user is asking for a translation.
  If so, respond with 'TRANSLATE:text' where `text` is the text to translate.
  Do not translate the text yourself. Otherwise, respond normally.
  For example, if the user says 'How do you say hello in Spanish?'
  you should respond 'TRANSLATE:hello' """

This prompt guides the model to respond with a structured output when it’s appropriate to call a function. Next, call the OpenAI API:

completion = llm.chat.completions.create(
  model="gpt-4o",
  messages=[
    {"role": "system", "content": prompt},
    {"role": "user", "content": user_input}
  ]
)

Here, you use the gpt-4o model. You also pass in two messages: the system prompt that tells the model how to respond and the user input.

Extract the response from the completion:

response = completion.choices[0].message.content

Finally, add the return value:

if response.startswith("TRANSLATE:"):
  parts = response.split(":", 1)
  if len(parts) == 2:
    text = parts[1]
    return translate(text)
return response

If the response starts with TRANSLATE:, you extract the text by splitting the string on the colon into at most two parts and taking the second one, the part after the colon, as the text to translate. You then pass that text to the translate function.
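To see how the parsing behaves, you can try the split on the sample response from the prompt:

response = "TRANSLATE:hello"
parts = response.split(":", 1)  # maxsplit=1, so at most two parts
print(parts)     # ['TRANSLATE', 'hello']
print(parts[1])  # 'hello', the text to translate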

Time to test it out. Write:

user_input = "How do you say goodbye in Spanish?"
result = ai_agent(user_input)
print(result)

Run that, and you’ll see “adiós”. Change “goodbye” to “hello”, and you’ll see “hola”. Now change it to “save”, and you’ll see “UNKNOWN”.

Great, it’s working! Well, maybe it isn’t capable enough to translate all your app strings yet, but you’ve built an AI Agent that can call a function.
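For reference, here’s the complete ai_agent function assembled from the snippets above:

def ai_agent(user_input):
  prompt = """Analyze if the user is asking for a translation.
    If so, respond with 'TRANSLATE:text' where `text` is the text to translate.
    Do not translate the text yourself. Otherwise, respond normally.
    For example, if the user says 'How do you say hello in Spanish?'
    you should respond 'TRANSLATE:hello' """

  completion = llm.chat.completions.create(
    model="gpt-4o",
    messages=[
      {"role": "system", "content": prompt},
      {"role": "user", "content": user_input}
    ]
  )

  response = completion.choices[0].message.content

  if response.startswith("TRANSLATE:"):
    parts = response.split(":", 1)
    if len(parts) == 2:
      text = parts[1]
      return translate(text)
  return response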
