# Advanced LLM Workflows and Orchestration
## Learning Objectives
These topics will be covered in this module’s knowledge check and entry ticket.
- Understand how to use large language models (LLMs) in applications by calling an API endpoint.
- Know how retrieval-augmented generation (RAG) can be used to ground a model’s responses in external documents.
- Explore building multi-step, autonomous processes that leverage LLMs to complete tasks.
- Explore the concept of agents and tools and understand how they can help LLMs provide better responses.
## Lessons
### Calling LLM APIs
Before class on Tuesday, please make sure you have a Groq account, and try creating a notebook in which you can hit the API and get a response back from the model. See the video below for details.
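As a rough sketch of what that notebook might contain, the snippet below calls an OpenAI-compatible chat completions endpoint (the shape Groq’s API follows) using only the standard library. The URL, model name, and environment-variable name are illustrative assumptions, not course-specified values — check the Groq docs for the current model list.

```python
# Sketch: one-turn chat completion against an OpenAI-compatible endpoint.
# Assumes your API key is in the GROQ_API_KEY environment variable.
import json
import os
import urllib.request

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_chat_request(prompt: str, model: str = "llama-3.1-8b-instant") -> dict:
    """Assemble the JSON body for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(prompt: str) -> str:
    """POST the prompt to the endpoint and return the model's reply text."""
    req = urllib.request.Request(
        GROQ_URL,
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GROQ_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# With a valid key set, try: print(ask("Say hello in one sentence."))
```

Groq also ships an official Python SDK that wraps this same request shape; raw HTTP is shown here so you can see exactly what goes over the wire.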
### Retrieval-Augmented Generation (RAG)
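The core RAG idea can be shown in a few lines: retrieve the documents most relevant to the question, then prepend them to the prompt you send the model. The word-overlap scorer and prompt template below are toy assumptions to keep the sketch self-contained; real pipelines use embedding similarity and a vector store.

```python
# Toy RAG pipeline: retrieve by word overlap, then build an augmented prompt.

def score(query: str, doc: str) -> int:
    """Count shared words between query and document (toy relevance score)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]


def build_rag_prompt(query: str, docs: list[str]) -> str:
    """Combine the retrieved context and the user question into one prompt."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


docs = [
    "The course final project is due in week ten.",
    "Office hours are held on Thursdays.",
    "Groq provides an OpenAI-compatible chat API.",
]
print(build_rag_prompt("When is the final project due?", docs))
```

The resulting string is what you would pass as the user message to the LLM, so the model answers from the retrieved context rather than from memory alone.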
### LLM Agents and Tools
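The tool-use pattern boils down to a registry of functions plus a dispatch step: the model names a tool and an argument, the harness runs it, and the result goes back into the conversation. The sketch below fakes the model’s decision with a hard-coded tool call to keep it runnable offline; in practice you would parse the tool name and argument out of the LLM’s response (e.g. a function-calling API).

```python
# Minimal tool-dispatch sketch for an LLM agent harness.

def calculator(expression: str) -> str:
    """Illustrative tool: evaluate a simple arithmetic expression."""
    return str(eval(expression, {"__builtins__": {}}))  # arithmetic only


# The registry the harness exposes to the model.
TOOLS = {"calculator": calculator}


def run_tool(name: str, argument: str) -> str:
    """Execute a tool call requested by the model and return its result."""
    return TOOLS[name](argument)


# In a real agent loop, `name` and `argument` come from the model's output.
print(run_tool("calculator", "12 * 7"))  # → 84
```

Giving the model a calculator this way lets it delegate arithmetic it would otherwise guess at, which is the core reason tools improve response quality.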
### LangChain and LangGraph
You don’t need to learn the implementation details of LangChain or LangGraph, but it’s good to know what they are: you’re likely to encounter them if you continue building complex LLM workflows. We won’t use them in class, since they take time to understand and I believe it’s more important to start out interacting directly with LLMs before abstracting it all away, but you might consider using them for your own projects!