Building AI Agents with LangChain and LangGraph
A comprehensive guide to creating intelligent AI agents using LangChain and LangGraph. From basic tool integration to advanced graph-based workflows, learn how to build production-ready AI applications with TypeScript.
Complete Tutorial Code
Follow along with the complete source code for this LangChain tutorial. Includes four progressive examples from basic agents to production-ready implementations.
View on GitHub

Introduction
LangChain has revolutionized how we build AI applications by providing a comprehensive framework for creating intelligent agents that can interact with external tools, maintain conversation state, and execute complex workflows. Combined with LangGraph, it enables developers to build sophisticated AI systems with graph-based architectures.
This tutorial takes you through a progressive journey from basic agent creation to advanced graph-based workflows. You'll learn to build AI agents that can use tools, manage state, and handle complex decision-making processes using TypeScript and modern development practices.
What is LangChain?
LangChain is a framework designed to simplify the creation of applications using large language models (LLMs). It provides abstractions and tools that make it easier to build complex AI applications with features like tool integration, memory management, and chain composition.
Traditional AI Apps
- Direct API calls
- Manual prompt engineering
- Custom state management
- Complex setup with manual orchestration
LangChain
```typescript
const agent = createReactAgent({
  llm: new ChatOpenAI(),
  tools: [weatherTool],
  prompt: ChatPromptTemplate.fromMessages([
    ["system", "You are a helpful assistant"],
    ["placeholder", "{chat_history}"],
    ["human", "{input}"],
    ["placeholder", "{agent_scratchpad}"],
  ]),
});
```

Declarative agent creation with built-in capabilities
Understanding LangGraph
LangGraph extends LangChain by providing a graph-based approach to building AI workflows. It allows you to create complex, stateful applications where different components can be connected in flexible patterns, enabling conditional logic, loops, and parallel processing.
```
┌────────────┐     ┌─────────────┐     ┌─────────────┐
│ User Input │───▶│ State Graph │───▶│ LLM + Tools │
└────────────┘     └─────────────┘     └─────────────┘
                          │                   │
                          ▼                   ▼
                  ┌────────────────┐   ┌──────────────┐
                  │  Conditional   │   │ Tool Results │
                  │ Routing Logic  │◀──│  Processing  │
                  └────────────────┘   └──────────────┘
```
Tutorial Overview
This tutorial consists of four progressive examples, each building upon the previous concepts while introducing new capabilities:
Basic Overview
Introduction to LangChain fundamentals with simple tool integration and agent creation using GPT-3.5-turbo.
LangGraph Hello World
Introduction to graph-based workflows with basic state management and linear execution patterns.
Advanced Graph API
Complex graph implementation with conditional routing, multiple tools, and advanced state management patterns.
Real-World Agent
Production-ready implementation with persistent memory, structured outputs, and personalized responses.
Prerequisites and Setup
Before diving into the examples, ensure you have the necessary tools and environment configured:
Requirements
- Node.js (v16 or higher)
- OpenAI API key
- TypeScript knowledge
Installation Steps
1. Clone the repository: git clone https://github.com/audoir/langchain-tutorial.git
2. Install dependencies: npm install
3. Configure environment: create a .env file containing OPENAI_API_KEY=your_api_key_here
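Once the .env file exists, each example loads it at startup. It is worth failing fast if the key is missing; a minimal sketch of such a guard (the `requireEnv` helper is illustrative, not part of the tutorial code):

```typescript
// Fail fast with a clear message if a required variable is missing.
// requireEnv is an illustrative helper, not part of the tutorial repo.
function requireEnv(env: Record<string, string | undefined>, name: string): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// In the real examples you would load dotenv first (import "dotenv/config")
// and then call: requireEnv(process.env, "OPENAI_API_KEY").
const apiKey = requireEnv({ OPENAI_API_KEY: "sk-test" }, "OPENAI_API_KEY");
```

This surfaces a misconfigured environment immediately instead of as an opaque authentication error deep inside an API call.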
Example 1: Basic Agent Overview
The first example demonstrates fundamental LangChain concepts by creating a simple agent with tool integration. This example shows how to define tools using Zod schema validation and create a basic agent that can invoke these tools.
```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";
import { ChatOpenAI } from "@langchain/openai";

// Define a weather tool with Zod validation
const weatherTool = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny and 75°F`;
  },
  {
    name: "get_weather",
    description: "Get the current weather for a location",
    schema: z.object({
      location: z.string().describe("The location to get weather for"),
    }),
  }
);

// Bind the tool to the model, then invoke it
const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
}).bindTools([weatherTool]);

const response = await model.invoke([
  { role: "user", content: "What's the weather like in San Francisco?" }
]);
```

Run this example with: npm run overview

Example 2: LangGraph Fundamentals
The second example introduces LangGraph with a simple "Hello World" implementation. It demonstrates basic state management and linear workflow execution using a mock LLM.
```typescript
import { StateGraph, MessagesAnnotation, START, END } from "@langchain/langgraph";

// Create a simple graph with a linear workflow
const workflow = new StateGraph(MessagesAnnotation)
  .addNode("mock_llm", async (state) => {
    // MessagesAnnotation's reducer appends returned messages to existing
    // state, so return only the new message, not a copy of the history
    return {
      messages: [{ role: "assistant", content: "Hello from LangGraph!" }],
    };
  })
  .addEdge(START, "mock_llm")
  .addEdge("mock_llm", END);

const app = workflow.compile();

// Execute the graph
const result = await app.invoke({
  messages: [{ role: "user", content: "Hello!" }],
});
```

This example demonstrates the fundamental graph pattern: START → mock_llm → END, with message state managed throughout the workflow.
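The append-only message state in this example is driven by a reducer: each node returns a partial update, and the reducer merges it into the shared state. A dependency-free TypeScript sketch of that idea (a conceptual illustration, not LangGraph's actual internals):

```typescript
// Minimal sketch of reducer-based state updates, mimicking how
// MessagesAnnotation merges a node's output into graph state.
// Conceptual only; LangGraph's real implementation is more general.
type Message = { role: string; content: string };
type State = { messages: Message[] };

// The messages reducer appends node output to the existing history
const messagesReducer = (current: Message[], update: Message[]): Message[] =>
  current.concat(update);

// Applying a node's partial result to the state
function applyUpdate(state: State, update: Partial<State>): State {
  return {
    messages: update.messages
      ? messagesReducer(state.messages, update.messages)
      : state.messages,
  };
}

const initial: State = { messages: [{ role: "user", content: "Hello!" }] };
const afterNode = applyUpdate(initial, {
  messages: [{ role: "assistant", content: "Hello from LangGraph!" }],
});
// afterNode.messages now holds both the user and assistant messages
```

This is why the mock_llm node returns only its new message: the reducer, not the node, is responsible for combining it with the history.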
Example 3: Advanced Graph Patterns
The third example showcases complex graph implementation with conditional routing, multiple tools, and custom state management. It features mathematical operations tools and dynamic workflow decisions.
```typescript
import { StateGraph, MessagesAnnotation, Annotation, START, END } from "@langchain/langgraph";

// Custom state: extend MessagesAnnotation with an LLM-call counter
const GraphState = Annotation.Root({
  ...MessagesAnnotation.spec,
  llmCalls: Annotation<number>({
    reducer: (x, y) => (x ?? 0) + (y ?? 0),
    default: () => 0,
  }),
});

// Mathematical tools
const addTool = tool(/* ... */);
const multiplyTool = tool(/* ... */);
const divideTool = tool(/* ... */);

// Conditional routing: run tools if the LLM requested any, otherwise end
const shouldContinue = (state) => {
  const lastMessage = state.messages[state.messages.length - 1];
  return lastMessage.tool_calls?.length ? "tools" : END;
};

// Build a cyclic graph with conditional edges
const workflow = new StateGraph(GraphState)
  .addNode("llmCall", llmNode)
  .addNode("tools", toolNode)
  .addEdge(START, "llmCall")
  .addConditionalEdges("llmCall", shouldContinue)
  .addEdge("tools", "llmCall");
```

This example demonstrates how to create cyclic graphs with conditional logic, allowing the agent to make decisions based on the current state and continue processing until completion.
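The llmCall → tools → llmCall cycle can be simulated without the library to see why it terminates. Below is a dependency-free sketch of the routing loop with a stubbed "LLM" that requests one tool call and then answers (all names are illustrative, not LangGraph APIs):

```typescript
// Conceptual simulation of the llmCall → tools → llmCall cycle.
// The stubbed "LLM" requests a tool call on its first turn, then stops.
type ToolCall = { name: string; args: { a: number; b: number } };
type Msg = { role: string; content: string; tool_calls?: ToolCall[] };

const END = "__end__";
let llmCalls = 0;

function llmCall(_messages: Msg[]): Msg {
  llmCalls += 1;
  // First turn: ask for the add tool; afterwards: produce a final answer
  return llmCalls === 1
    ? { role: "assistant", content: "", tool_calls: [{ name: "add", args: { a: 2, b: 3 } }] }
    : { role: "assistant", content: "The result is 5" };
}

function runTools(calls: ToolCall[]): Msg[] {
  return calls.map((c) => ({ role: "tool", content: String(c.args.a + c.args.b) }));
}

// Mirrors shouldContinue: route to tools if the LLM requested any, else end
function shouldContinue(messages: Msg[]): string {
  const last = messages[messages.length - 1];
  return last.tool_calls?.length ? "tools" : END;
}

const messages: Msg[] = [{ role: "user", content: "What is 2 + 3?" }];
let next = "llmCall";
while (next !== END) {
  messages.push(llmCall(messages));
  next = shouldContinue(messages);
  if (next === "tools") messages.push(...runTools(messages[messages.length - 1].tool_calls!));
}
```

The loop runs the model twice: once to request the tool and once to answer after seeing the tool result, which is exactly the termination behavior the conditional edge provides.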
Example 4: Production-Ready Agent
The final example presents a production-ready agent implementation with persistent memory, structured outputs, and personalized responses. It includes weather tools and user context management.
```typescript
import { MemorySaver } from "@langchain/langgraph";

// Structured output schema
const WeatherResponse = z.object({
  location: z.string(),
  temperature: z.number(),
  condition: z.string(),
  pun: z.string().describe("A weather-related pun"),
});

// Create agent with memory persistence
const memory = new MemorySaver();
const app = workflow.compile({ checkpointer: memory });

// Invoke with user context and a thread ID for conversation continuity
const result = await app.invoke(
  {
    messages: [{ role: "user", content: "What's the weather?" }],
    userContext: { location: "San Francisco", name: "Alice" },
  },
  { configurable: { thread_id: "user-123" } }
);
```

This implementation includes thread-based conversation continuity, allowing the agent to maintain context across multiple interactions while providing structured, type-safe responses.
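Thread-based continuity can be pictured as a checkpoint store keyed by thread_id: every invocation with the same thread resumes from that thread's saved state. A minimal sketch of the concept (illustrative only; MemorySaver's real implementation stores full graph checkpoints, not just messages):

```typescript
// Conceptual sketch of thread-scoped checkpointing: each thread_id maps
// to its own saved conversation state. Names are illustrative.
type Msg = { role: string; content: string };

class InMemoryCheckpointer {
  private store = new Map<string, Msg[]>();

  load(threadId: string): Msg[] {
    return this.store.get(threadId) ?? [];
  }

  save(threadId: string, messages: Msg[]): void {
    this.store.set(threadId, messages);
  }
}

// Each invocation resumes from the thread's saved history
function invoke(cp: InMemoryCheckpointer, threadId: string, userMsg: Msg): Msg[] {
  const history = cp.load(threadId);
  const updated = [
    ...history,
    userMsg,
    { role: "assistant", content: `You said: ${userMsg.content}` },
  ];
  cp.save(threadId, updated);
  return updated;
}

const cp = new InMemoryCheckpointer();
invoke(cp, "user-123", { role: "user", content: "What's the weather?" });
const second = invoke(cp, "user-123", { role: "user", content: "And tomorrow?" });
// second contains all four messages because both turns share thread "user-123"
```

A different thread_id starts from an empty history, which is how one agent serves many users without mixing their conversations.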
Key Concepts Demonstrated
State Management
- MessagesAnnotation for built-in message handling
- Custom state extensions with additional fields
- Memory persistence with thread-based continuity
Tool Integration
- Tool definition with Zod schemas
- Tool binding to language models
- Automatic tool invocation and result handling
Graph Patterns
- Linear graphs for sequential workflows
- Conditional graphs with dynamic routing
- Cyclic graphs for iterative processing
Advanced Features
- Structured outputs with type safety
- Runtime context for user-specific data
- Error handling and state validation
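To make "structured outputs with type safety" concrete: a Zod schema such as WeatherResponse performs a runtime check of the model's untrusted output. Here is a dependency-free sketch of the kind of validation it generates (Zod does far more, including rich error messages; this shows only the core idea):

```typescript
// Hand-rolled runtime validation mirroring the WeatherResponse Zod schema.
// Zod generates this kind of check automatically from the schema.
interface WeatherResponse {
  location: string;
  temperature: number;
  condition: string;
  pun: string;
}

function isWeatherResponse(value: unknown): value is WeatherResponse {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.location === "string" &&
    typeof v.temperature === "number" &&
    typeof v.condition === "string" &&
    typeof v.pun === "string"
  );
}

// An LLM's raw output is untrusted JSON until validated
const raw: unknown = JSON.parse(
  '{"location":"San Francisco","temperature":75,"condition":"sunny","pun":"Sun of a beach!"}'
);

if (isWeatherResponse(raw)) {
  // Inside this branch, TypeScript narrows raw to WeatherResponse
  console.log(`${raw.location}: ${raw.temperature}°F and ${raw.condition}`);
}
```

The type guard gives both a runtime rejection path for malformed output and compile-time narrowing for well-formed output, which is the same guarantee the schema-based approach provides.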
Dependencies and Architecture
The tutorial uses a modern TypeScript stack with carefully selected dependencies for optimal development experience:
Core Dependencies
- @langchain/core: Core functionality
- @langchain/langgraph: Graph workflows
- @langchain/openai: OpenAI integration
- zod: Schema validation
Development Tools
- tsx: TypeScript execution
- dotenv: Environment variables
- @types/node: Node.js types
Running the Examples
Each example can be run independently using the provided npm scripts. Follow the recommended learning path for the best experience:
1. Basic concepts: npm run overview
2. Graph fundamentals: npm run langraph-hello
3. Complex workflows: npm run langraph-graph-api
4. Production implementation: npm run real-world-agent
Learning Outcomes
By completing this tutorial, you will have gained practical experience with:
- Creating AI agents with LangChain and tool integration
- Building graph-based workflows with LangGraph
- Implementing state management and memory persistence
- Designing conditional logic and dynamic routing
- Creating production-ready agents with structured outputs
- Using TypeScript for type-safe AI application development
- Integrating external APIs and tools into AI workflows
- Managing conversation context and user personalization
Troubleshooting
Common issues and their solutions:
API Key Issues
Ensure your OpenAI API key is properly set in the .env file
OPENAI_API_KEY=sk-...

Import Errors
Verify all dependencies are installed:
npm install

TypeScript Errors
Check that @types/node is installed for proper type definitions.
Conclusion
LangChain and LangGraph provide powerful abstractions for building sophisticated AI applications. By progressing through these examples, you've learned how to create agents that can use tools, manage state, and execute complex workflows with conditional logic and memory persistence.
The patterns demonstrated in this tutorial form the foundation for building production-ready AI applications. From simple tool integration to complex graph-based workflows, these concepts enable you to create intelligent systems that can handle real-world use cases with reliability and scalability.
About the Author
Wayne Cheng is the founder and AI app developer at Audoir, LLC. Prior to founding Audoir, he worked as a hardware design engineer for Silicon Valley startups and an audio engineer for creative organizations. He holds an MSEE from UC Davis and a Music Technology degree from Foothill College.
Further Exploration
To continue your LangChain journey, explore the complete tutorial repository and experiment with extending the examples. Consider adding features like multi-agent systems, custom tools, or integration with external APIs to deepen your understanding of LangChain's capabilities.
For more AI-powered development tools and tutorials, visit Audoir.