Technology

Building AI Agents: From Streaming Chatbots to LangGraph with MCP

A hands-on Next.js project that walks you through building AI agents step by step — from a simple streaming chatbot all the way to a LangGraph-powered agent with database tools served over MCP.

20 min read

Complete Tutorial Code

Follow along with the complete source code for this AI agent tutorial. Includes five progressive tabs from a basic streaming chatbot to a LangGraph-powered agent with MCP tools.

View on GitHub

Introduction

Building AI agents has never been more accessible, yet the landscape of tools and frameworks can be overwhelming. This tutorial takes a progressive approach — each tab in the UI corresponds to a more advanced API route, letting you see exactly how the complexity grows and what capabilities each layer adds.

You'll start with a simple streaming chatbot backed by SQLite memory, then add inline tool-calling, move those tools to an MCP server, and finally explore LangChain and LangGraph as a powerful alternative to the Vercel AI SDK.

The Database

Before diving into agents, it helps to understand the data they work with. The app uses a single in-memory SQLite database (via better-sqlite3) created once when the Node.js process starts.

Business Tables

inventory — products, prices, stock
customers — names, emails, cities
sales — transactions linking inventory & customers

Internal Tables

chat_sessions — active sessions by UUID
chat_messages — full conversation history
tool_calls — logs every SQL tool call
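
The article doesn't reproduce the schema, but a rough sketch helps picture what the tools will later query. All column names and types below are guesses for illustration only; the actual repository's DDL may differ:

```typescript
// Hypothetical DDL for the six tables described above.
// Column names/types are assumptions, not taken from the repository.
const schema: Record<string, string> = {
  inventory: `CREATE TABLE inventory (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    price REAL NOT NULL,
    stock INTEGER NOT NULL
  )`,
  customers: `CREATE TABLE customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    email TEXT UNIQUE,
    city TEXT
  )`,
  sales: `CREATE TABLE sales (
    id INTEGER PRIMARY KEY,
    inventory_id INTEGER REFERENCES inventory(id),
    customer_id INTEGER REFERENCES customers(id),
    quantity INTEGER NOT NULL
  )`,
  chat_sessions: `CREATE TABLE chat_sessions (
    id TEXT PRIMARY KEY
  )`,
  chat_messages: `CREATE TABLE chat_messages (
    id INTEGER PRIMARY KEY,
    session_id TEXT REFERENCES chat_sessions(id),
    role TEXT NOT NULL,
    content TEXT NOT NULL
  )`,
  tool_calls: `CREATE TABLE tool_calls (
    id INTEGER PRIMARY KEY,
    session_id TEXT,
    tool_name TEXT NOT NULL,
    sql TEXT NOT NULL
  )`,
};
```

With better-sqlite3, each statement would be applied once at process start with `db.exec(...)`.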

Tutorial Overview

The tutorial is structured as five progressive tabs, each building on the previous concepts while introducing new capabilities:

1. 🤖 Basic AI Agent

A streaming chatbot with persistent memory. Uses the Vercel AI SDK's streamText and manually saves every message to SQLite so the full conversation history is re-sent to the model on each turn.

POST /api/chat · Vercel AI SDK · SQLite Memory

2. 🛠️ AI Agent with Tools

Adds inline tool-calling so the model can query or modify the database. Three tools (one per table) are registered with Zod-validated SQL inputs. The AI SDK handles the agentic loop automatically.

POST /api/chat-with-tools · tool() · stopWhen: stepCountIs(10)

3. ⚡ AI Agent with MCP

The same tools from Tab 2, now served over the Model Context Protocol (MCP). The agent discovers tools dynamically via an MCP client, making them reusable by any MCP-compatible client.

POST /api/chat-with-mcp · @ai-sdk/mcp · mcp-handler

4. 🦜 Basic LangChain Agent

Introduces LangChain and LangGraph as an alternative to the Vercel AI SDK. A simple chatbot (no tools) that uses LangGraph's SqliteSaver checkpointer for automatic conversation memory.

POST /api/chat-with-langchain · @langchain/langgraph · SqliteSaver

5. 🕸️ LangGraph with Tools

The most advanced tab. Combines LangGraph's StateGraph with database tools, giving you full control over the agent's reasoning loop as an explicit graph with conditional edges.

POST /api/chat-with-langgraph · StateGraph · Conditional Edges

Prerequisites and Setup

Before diving into the examples, ensure you have the necessary tools and environment configured:

Requirements

  • Node.js (v18 or higher)
  • OpenAI API key
  • TypeScript / Next.js knowledge

Installation Steps

  1. Clone the repository:
     git clone https://github.com/audoir/ai-agent-tutorial.git
  2. Install dependencies:
     npm install
  3. Configure environment by creating a .env.local file with your OpenAI API key:
     OPENAI_API_KEY=sk-...
  4. Start the dev server:
     npm run dev

     Open http://localhost:3000 in your browser.

Tab 1: Basic AI Agent

The simplest possible AI agent: a streaming chatbot with persistent memory. The client sends { prompt, sessionId } to POST /api/chat. The server loads the full conversation history from SQLite, calls streamText() with GPT-4o-mini, and saves the assistant reply back to the DB on finish.

// lib/chat-session.ts
export function initChatSession(db, sessionId, prompt) {
  // 1. Create session if new
  // 2. Save user message
  // 3. Return full history for context window
}

// app/api/chat/route.ts
const result = await streamText({
  model: openai("gpt-4o-mini"),
  messages: history,
  onFinish: ({ text }) => saveAssistantMessage(db, sessionId, text),
});
return result.toUIMessageStreamResponse();

Memory is implemented manually — every message is written to chat_messages and the entire history is re-sent to the model on each turn. This is the simplest form of stateful chat.
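
The manual-memory pattern can be sketched framework-free. Below, a Map stands in for the chat_messages table and a stub stands in for the model; the function names mirror the snippet above but are illustrative, not the repository's exact code:

```typescript
type Msg = { role: "user" | "assistant"; content: string };

// A Map keyed by sessionId stands in for the chat_messages table.
const store = new Map<string, Msg[]>();

function initChatSession(sessionId: string, prompt: string): Msg[] {
  if (!store.has(sessionId)) store.set(sessionId, []); // 1. create session if new
  const history = store.get(sessionId)!;
  history.push({ role: "user", content: prompt });     // 2. save user message
  return history;                                      // 3. full context window
}

function saveAssistantMessage(sessionId: string, text: string): void {
  store.get(sessionId)!.push({ role: "assistant", content: text });
}

// Stub model: replies with how many user turns of context it received.
function fakeModel(history: Msg[]): string {
  return `reply #${history.filter((m) => m.role === "user").length}`;
}

// Two turns: the second call sees the entire first turn in its history.
const h1 = initChatSession("s1", "hello");
saveAssistantMessage("s1", fakeModel(h1));
const h2 = initChatSession("s1", "again");
saveAssistantMessage("s1", fakeModel(h2));
```

Swapping the Map for SQLite inserts/selects (and the stub for streamText) gives you exactly the Tab 1 route.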

Tab 2: AI Agent with Tools

This agent can call functions (tools) to query or modify the database. Three tools are registered — one per business table — each accepting a Zod-validated SQL payload.

// lib/sql-tools.ts
export const sqlInputSchema = z.object({
  sql: z.string(),       // e.g. "SELECT * FROM inventory WHERE category = ?"
  params: z.array(...),  // e.g. ["Electronics"]
});

// app/api/chat-with-tools/route.ts
tools: {
  inventory: tool({
    description: TOOL_DESCRIPTIONS.inventory,
    inputSchema: sqlInputSchema,
    execute: makeSqlExecute("inventory", sessionId),
  }),
  customers: tool({ ... }),
  sales: tool({ ... }),
}

The AI SDK handles the agentic loop automatically: the LLM generates a tool call → SDK executes it → result is appended to history → LLM is called again → repeat until a plain text response is produced. stopWhen: stepCountIs(10) prevents infinite loops.
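
That loop is easy to state in plain code. Here is a minimal sketch of the control flow the SDK automates, with a scripted stand-in for the LLM and a toy tool (none of this is the AI SDK's internals):

```typescript
type Step = { toolCall?: { name: string; input: string }; text?: string };

// The agentic loop: call the model, execute any tool call, feed the
// result back, and stop at a plain-text answer or after maxSteps.
function runAgent(
  model: (history: string[]) => Step,
  tools: Record<string, (input: string) => string>,
  maxSteps = 10, // plays the role of stopWhen: stepCountIs(10)
): string {
  const history: string[] = [];
  for (let step = 0; step < maxSteps; step++) {
    const out = model(history);
    if (out.text !== undefined) return out.text;          // final answer, done
    const { name, input } = out.toolCall!;
    history.push(`tool:${name} -> ${tools[name](input)}`); // append tool result
  }
  throw new Error("step limit reached");
}

// Scripted model: one tool call, then a final answer that uses the result.
const answer = runAgent(
  (history) =>
    history.length === 0
      ? { toolCall: { name: "inventory", input: "SELECT COUNT(*) FROM inventory" } }
      : { text: `We stock ${history[0].split("-> ")[1]} items.` },
  { inventory: () => "42" },
);
// answer === "We stock 42 items."
```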

Tab 3: AI Agent with MCP

The same tools from Tab 2, now served over the Model Context Protocol (MCP) — a standard for exposing tools to AI models over HTTP. The agent discovers tools dynamically via an MCP client rather than having them defined inline.

// Architecture
Client → POST /api/chat-with-mcp
           ↓
    createMCPClient connects to /api/mcp/mcp (HTTP transport)
           ↓
    tools = await mcpClient.tools()   ← discovers tools dynamically
           ↓
    streamText(GPT-4o-mini, { tools })
           ↓
    Tool calls routed back through MCP client to /api/mcp/mcp

Inline Tools (Tab 2)

  • Defined in the same file as the route
  • Tightly coupled to the agent
  • No network overhead
  • Simpler to set up

MCP Tools (Tab 3)

  • Defined in a separate server
  • Discoverable by any MCP-compatible client
  • Communicates over HTTP
  • Reusable across multiple agents/apps
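
The practical difference is *when* the agent learns which tools exist. A toy sketch of the discovery pattern (a plain registry standing in for the MCP server, not the real wire protocol):

```typescript
type ToolDef = { description: string; execute: (input: string) => string };

// Toy "server": a registry the client knows nothing about at compile time.
const serverRegistry = new Map<string, ToolDef>([
  ["inventory", { description: "Query the inventory table", execute: () => "3 rows" }],
]);

// Toy "client": fetches the tool list at runtime, the way mcpClient.tools()
// does over HTTP against /api/mcp/mcp.
function discoverTools(): Record<string, ToolDef> {
  return Object.fromEntries(serverRegistry);
}

// The agent never hard-codes tool names; whatever the server exposes is used.
const tools = discoverTools();
const names = Object.keys(tools);
```

Adding a fourth tool on the server side would make it appear in `names` with no change to the agent, which is the whole point of MCP.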

Tab 4: Basic LangChain Agent

This tab introduces LangChain and LangGraph as an alternative to the Vercel AI SDK. It's a simple chatbot (no tools) that demonstrates LangGraph's built-in checkpointing for conversation memory.

// Reuse the existing SQLite connection for checkpointing
const checkpointer = new SqliteSaver(getDb());

const agent = createAgent({
  model: new ChatOpenAI({ model: "gpt-4o-mini", streaming: true }),
  tools: [],          // no tools — pure chat
  systemPrompt: "...",
  checkpointer,       // LangGraph persists state automatically
});

const eventStream = await agent.streamEvents(
  { messages: [new HumanMessage(prompt)] },
  { configurable: { thread_id: sessionId }, version: "v2" }
);

Instead of manually saving messages (as in Tab 1), LangGraph's SqliteSaver checkpointer automatically persists the full graph state keyed by thread_id. On the next request with the same thread_id, LangGraph restores the state and continues the conversation.
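
The checkpointing idea, persist the whole state keyed by thread_id and restore it on the next request, can be sketched without LangGraph (a Map stands in for SqliteSaver):

```typescript
type State = { messages: string[] };

// A Map stands in for SqliteSaver: one saved state per thread_id.
const checkpoints = new Map<string, State>();

// Each request restores the thread's state, appends the new turn, and
// re-persists it — the caller never ships history back and forth.
function invoke(threadId: string, prompt: string): State {
  const state = checkpoints.get(threadId) ?? { messages: [] };
  state.messages.push(`human: ${prompt}`);
  state.messages.push(`ai: echo(${prompt})`); // stub model reply
  checkpoints.set(threadId, state);
  return state;
}

invoke("thread-1", "hi");
const second = invoke("thread-1", "still there?");
// second.messages now holds all four messages from both turns
```

Contrast this with Tab 1, where the route itself had to read and write every row.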

Tab 5: LangGraph with Tools

The most advanced tab. It combines LangGraph's StateGraph with database tools, giving you full control over the agent's reasoning loop as an explicit graph.

// The Graph: START → llmCall → (conditional) → toolNode → llmCall → ... → END
new StateGraph(MessagesState)
  .addNode("llmCall", llmCall)       // calls the LLM
  .addNode("toolNode", toolNode)     // executes tool calls
  .addEdge(START, "llmCall")
  .addConditionalEdges("llmCall", shouldContinue, {
    toolNode: "toolNode",            // if LLM made tool calls → run tools
    [END]: END,                      // if LLM gave a final answer → stop
  })
  .addEdge("toolNode", "llmCall")    // after tools run → back to LLM
  .compile({ checkpointer });
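
The conditional edge is just a function over the current state. A sketch of what shouldContinue might look like — the real repository's version may differ, and END is stubbed here rather than imported from @langchain/langgraph:

```typescript
// Minimal message shapes: an AI message may carry pending tool calls.
type AIMessage = { role: "ai"; toolCalls: { name: string }[] };
type AnyMessage = { role: "human" | "tool" } | AIMessage;

const END = "__end__"; // stand-in for LangGraph's END sentinel

// Route after llmCall: pending tool calls -> toolNode, otherwise finish.
function shouldContinue(state: { messages: AnyMessage[] }): string {
  const last = state.messages[state.messages.length - 1];
  if (last.role === "ai" && last.toolCalls.length > 0) {
    return "toolNode";
  }
  return END;
}

const toTools = shouldContinue({
  messages: [{ role: "ai", toolCalls: [{ name: "inventory" }] }],
});
const done = shouldContinue({ messages: [{ role: "ai", toolCalls: [] }] });
```

Because "toolNode" has an edge back to "llmCall", this single function is what turns the graph into a loop.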

Vercel AI SDK (Tab 2)

  • Agentic loop handled by the SDK
  • Less code, less control
  • Hard to add custom logic between steps
  • Good for standard tool-calling patterns

LangGraph StateGraph (Tab 5)

  • You define the loop as a graph
  • More code, full control
  • Easy to add nodes (validation, logging, etc.)
  • Good for complex multi-step workflows

Key Concepts Demonstrated

Conversation Memory

  • Manual SQLite persistence (Tabs 1 & 2)
  • LangGraph SqliteSaver checkpointing (Tabs 4 & 5)
  • Thread-based session continuity

Tool Integration

  • Zod schema validation for SQL inputs
  • Inline tools vs. MCP-served tools
  • Automatic tool invocation and result handling

Model Context Protocol

  • HTTP transport for tool discovery
  • Reusable tools across multiple agents
  • Dynamic tool registration with createMcpHandler

Graph-Based Agents

  • Explicit StateGraph with nodes and edges
  • Conditional routing (shouldContinue)
  • Cyclic graphs for iterative tool use

Key Dependencies

The tutorial uses a modern TypeScript / Next.js stack with carefully selected dependencies:

Vercel AI SDK

  • ai — streamText, tool
  • @ai-sdk/openai — OpenAI provider
  • @ai-sdk/mcp — MCP client
  • mcp-handler — MCP server for Next.js

LangChain / LangGraph

  • @langchain/openai — ChatOpenAI
  • @langchain/langgraph — StateGraph
  • @langchain/langgraph-checkpoint-sqlite — SqliteSaver
  • better-sqlite3 — synchronous SQLite

Architecture Summary

Tab 1: Basic AI Agent
  POST /api/chat
  └── streamText (AI SDK) + manual SQLite memory

Tab 2: AI Agent with Tools
  POST /api/chat-with-tools
  └── streamText (AI SDK) + inline tool() definitions + SQLite memory

Tab 3: AI Agent with MCP
  POST /api/chat-with-mcp
  └── streamText (AI SDK) + MCP client → GET/POST /api/mcp/mcp
                                          └── createMcpHandler (mcp-handler)

Tab 4: Basic LangChain Agent
  POST /api/chat-with-langchain
  └── LangChain createAgent + SqliteSaver checkpointer (no tools)

Tab 5: LangGraph with Tools
  POST /api/chat-with-langgraph
  └── LangGraph StateGraph (llmCall ↔ toolNode) + SqliteSaver checkpointer

Learning Outcomes

By working through this tutorial, you will have gained practical experience with:

  • Building streaming AI chatbots with the Vercel AI SDK
  • Implementing manual and automatic conversation memory
  • Adding tool-calling to AI agents with Zod schema validation
  • Serving tools over the Model Context Protocol (MCP)
  • Using LangChain and LangGraph as an alternative AI framework
  • Building explicit agent graphs with conditional routing
  • Integrating SQLite for both business data and agent state
  • Comparing trade-offs between different agent architectures

Troubleshooting

Common issues and their solutions:

API Key Issues

Ensure your OpenAI API key is set in .env.local

OPENAI_API_KEY=sk-...

Import Errors

Verify all dependencies are installed

npm install

SQLite Native Module

better-sqlite3 requires a native build. If you see binding errors, run npm rebuild better-sqlite3.

Conclusion

This tutorial demonstrates that there is no single "right" way to build AI agents. The Vercel AI SDK offers a streamlined, low-boilerplate path for standard tool-calling patterns, while LangGraph gives you explicit control over the agent's reasoning loop — ideal for complex, branching workflows.

MCP sits orthogonally to both: it's a protocol for making tools reusable and discoverable, regardless of which agent framework you choose. Together, these five tabs give you a complete picture of the modern AI agent stack.

About the Author

Wayne Cheng is the founder and AI app developer at Audoir, LLC. Prior to founding Audoir, he worked as a hardware design engineer for Silicon Valley startups and an audio engineer for creative organizations. He holds an MSEE from UC Davis and a Music Technology degree from Foothill College.

Further Exploration

Explore the complete tutorial repository and experiment with extending the examples. Consider adding new tools, implementing human-in-the-loop steps, or connecting to external APIs to deepen your understanding of AI agent architectures.

For more AI-powered development tools and tutorials, visit Audoir.