AI Agents with LangGraph

Nov 12 2024 · Python 3.12, LangGraph 0.2.x, JupyterLab 4.2.4

Lesson 02: Fundamentals of LangGraph

Project Demo

In lesson 1, you started an AI agent project to localize app strings from English into Spanish. So far, all it does is translate a few words. You won't add any new functionality to your agent in this lesson; instead, you'll restructure it to use LangGraph.

def extract(user_input):
    # System prompt that tells the model to pull out only the text the user
    # wants translated, without translating it.
    prompt = """Analyze if the user is asking for a translation.
    If so, respond with only the text to translate.
    Do not translate the text yourself. Otherwise, respond normally.
    For example, if the user says 'How do you say hello in Spanish?'
    you should respond 'hello' """
    completion = llm.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": prompt},
            {"role": "user", "content": user_input}
        ]
    )
    # Return just the model's reply text.
    return completion.choices[0].message.content
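
The graph you're about to build also wires in the translate function from lesson 1. If you don't have it handy, here's a minimal sketch of what that node can look like; the exact prompt wording is an assumption, so use your own version if it differs:

def translate(text_to_translate):
    # Assumed sketch of the lesson 1 translator: asks the model to render
    # the extracted text in Spanish and return only the translation.
    prompt = """Translate the text provided by the user into Spanish.
    Respond with only the translated text."""
    completion = llm.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": prompt},
            {"role": "user", "content": text_to_translate}
        ]
    )
    return completion.choices[0].message.content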
from langgraph.graph import Graph

graph = Graph()

# Register each function as a named node in the graph.
graph.add_node("extractor", extract)
graph.add_node("translator", translate)

# The extractor's output becomes the translator's input.
graph.add_edge("extractor", "translator")

# Execution starts at the extractor and finishes at the translator.
graph.set_entry_point("extractor")
graph.set_finish_point("translator")

app = graph.compile()

user_input = "How do you say goodbye in Spanish?"
result = app.invoke(user_input)
print(result)
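
When you run this, invoke passes user_input to the entry node and returns whatever the finish node produces, so result holds the translator's output: the Spanish translation of "goodbye". If you'd like a sanity check on how the nodes are wired, compiled LangGraph graphs can describe their own structure. The snippet below is a small optional addition, not part of the course code:

# Optional: print a Mermaid description of the compiled graph to confirm
# that "extractor" feeds into "translator".
print(app.get_graph().draw_mermaid())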