AI FRAMEWORKS

Vercel AI SDK

The TypeScript toolkit for building AI-powered applications and agents — with unified APIs for 20+ LLM providers, streaming UI primitives, structured output generation, and seamless integration with React Server Components and Next.js.

Why It Matters

Building AI features without a framework means writing custom streaming logic, managing provider APIs, and building agent loops from scratch. The Vercel AI SDK provides a single interface to 20+ providers, production-ready streaming, and built-in agent orchestration — translating to faster time-to-market and the flexibility to switch providers without rewriting code.

What It Actually Does

Every capability explained in plain English, so you know exactly how the Vercel AI SDK translates into features your users see and value for your business.

Unified Provider Interface

One SDK to access OpenAI, Anthropic, Google, Meta, xAI, Mistral, DeepSeek, Amazon Bedrock, and 20+ more providers. Switch models with a single line of code — no rewrites, no new dependencies.

What This Means For Your Business

Think of it like a universal remote for AI. Instead of buying a separate remote for every TV brand, you get one remote that controls them all. Your team can try different AI models (GPT, Claude, Gemini, etc.) without starting over each time — and if one provider raises prices or goes down, you switch instantly.

Real-Time Streaming

Built-in streaming with streamText() and streamUI() that delivers responses token-by-token as they're generated. Includes stream resumability, custom data streaming, and multiple transport protocols.

What This Means For Your Business

Instead of your users staring at a loading spinner for 10-30 seconds waiting for the AI to finish thinking, they see the response appear word-by-word in real time — just like ChatGPT. This makes your product feel fast and responsive, even when the AI is processing complex requests.

Generative UI (Widgets in Chat)

AI responses aren't limited to text. The SDK can render interactive React components — weather cards, stock tickers, booking forms, data tables — directly in chat based on what the AI decides to show.

What This Means For Your Business

Your AI assistant doesn't just type back answers — it can show interactive widgets right in the conversation. Ask about the weather? It shows a weather card. Ask about a product? It shows a product card with images and a buy button. It's like giving your AI the ability to build mini-apps on the fly during a conversation.

Tool Calling & Actions

Define tools (functions) that the AI can decide to call — search the web, query your database, send emails, process payments. Three types supported: custom tools, provider-defined tools, and provider-executed tools.

What This Means For Your Business

AI by itself can only generate text. But with tools, your AI can actually DO things — look up a customer's order, check inventory, book a meeting, or search the web for real-time information. You define what actions are allowed, and the AI decides when to use them based on what the user asks.

Human-in-the-Loop

Pause AI agent workflows at critical decision points to require human approval before proceeding. Built into the agent loop control with configurable stopping conditions via stopWhen and prepareStep.

What This Means For Your Business

For sensitive actions — like approving a refund, sending an email to a client, or making a purchase — the AI pauses and asks a human for approval before continuing. You stay in control of the important decisions while the AI handles the routine work.

Agent Framework

Built-in ToolLoopAgent class for building autonomous agents that use tools in a loop. Supports multi-step workflows, subagents, memory management, loop control with configurable stopping conditions, and structured workflow patterns.

What This Means For Your Business

An agent is like giving the AI a to-do list and letting it figure out the steps. Instead of answering one question, the AI can break a complex task into steps, use different tools along the way, and keep working until the job is done — like a virtual employee that follows a workflow.

Structured Data Generation

Generate validated, typed JSON objects from AI responses using Zod, Valibot, or JSON Schema. Supports streaming structured data with Output.object() and Output.array().

What This Means For Your Business

When your AI extracts information from documents — like pulling names, dates, and amounts from an invoice — Vercel AI SDK ensures the data comes back clean and organised, not as messy text that someone has to manually sort through. Every field is validated automatically.

Step-by-Step Progress & Status

Multi-step tool calls with full visibility into each step. Stream tool call states (input-available, output-available, output-error) so the UI can show progress for each action the AI is performing.

What This Means For Your Business

When your AI is working through a multi-step task — researching a topic, then summarising, then formatting — your users can see exactly what step it's on and what it's doing. No more black-box waiting. It's like tracking a delivery in real time.

Multi-Framework Support

Framework-agnostic hooks for React, Next.js, Vue, Nuxt, Svelte, SvelteKit, Angular, SolidJS, and Node.js. Mobile support via Expo. Same API surface across all frameworks.

What This Means For Your Business

Whatever technology your product is built with — whether it's a website, a mobile app, or a backend service — the AI SDK works with it. Your team doesn't need to learn a new tool or rewrite existing code to add AI features.

MCP (Model Context Protocol) Support

Native support for the Model Context Protocol — an open standard for connecting AI models to external tool registries, data sources, and MCP servers. Access thousands of pre-built integrations through MCP.

What This Means For Your Business

MCP is a new universal standard that lets AI connect to thousands of external tools and data sources — like Slack, Google Drive, databases, and more — through a single protocol. It's like USB-C for AI: one connector, everything plugs in.

Language Model Middleware

Intercept and modify AI model calls with middleware layers — add caching, guardrails, logging, RAG (retrieval-augmented generation), rate limiting, and custom logic without changing your core application code.

What This Means For Your Business

Want to add safety filters, caching to reduce costs, or logging to track what your AI is doing? Instead of rewriting your AI logic, you just plug in a middleware layer — like adding a filter to the pipeline. It's modular and clean.

Beyond Text: Images, Audio, Video

Generate images, transcribe audio, synthesise speech, and generate video using the same unified API. Supports multimodal inputs (images + text) for richer AI interactions.

What This Means For Your Business

The same toolkit that powers your chatbot can also generate images, turn speech into text, or create audio responses — all through one consistent interface. You're not limited to text-only AI features.

Why Teams Choose Vercel AI SDK

The key advantages that make Vercel AI SDK the go-to choice for building AI-powered products.

Ship AI Features in Days, Not Months

Pre-built hooks (useChat, useCompletion, useObject), streaming infrastructure, agent orchestration, and generative UI — all out of the box. Your team stops building plumbing and starts building product. What used to take 4-8 weeks of infrastructure work now takes days.

Zero Provider Lock-In

Switch from GPT to Claude to Gemini with a single line change. The unified interface means your entire application logic stays the same regardless of which AI model you use — so you can always negotiate better rates or adopt better models without engineering effort.

Best-in-Class Developer Experience

TypeScript-first with full type safety, autocomplete, and compile-time validation. Zod schema integration for structured outputs. Extensive documentation with 15+ official starter templates. 75K+ GitHub stars and a massive community for support.

100% Open Source (Apache 2.0)

The entire SDK is open source under the Apache 2.0 license — free to use, modify, and distribute commercially. No usage fees, no seat licenses, no hidden costs on the framework itself. You own your implementation fully.

Massive Tool Ecosystem

250+ pre-built tool integrations via Composio, Stripe, Slack, GitHub, and more. MCP support for thousands of additional tools. Tool Registry for discovering and sharing community tools. Your AI can connect to virtually any external service.

Built-In Observability & Telemetry

OpenTelemetry support for tracing, logging, and monitoring AI calls. Integrations with Langfuse, LangSmith, Helicone, Braintrust, and 14+ observability platforms. Debug, monitor, and optimise your AI features in production.

Supported AI Providers

Vercel AI SDK connects to 22+ AI providers through a single, unified interface. Switch between providers without changing your application code.

Official Providers (19)

OpenAI
Anthropic
Google Generative AI
Google Vertex AI
xAI (Grok)
Amazon Bedrock
Azure OpenAI
Mistral
DeepSeek
Groq
Cohere
Fireworks
Perplexity
Together.ai
Cerebras
DeepInfra
Fal AI
Luma AI
Baseten

Community Providers (3)

Replicate
Ollama
LM Studio

Use Case Fit

See how Vercel AI SDK aligns with different AI product use cases — from chatbots and agents to content generation and workflow automation.

AI Chatbot: Strong Fit
Customer Support Agent: Strong Fit
Internal Knowledge Base: Strong Fit
Content Generation: Strong Fit
Code Assistant: Good Fit
Workflow Automation: Strong Fit
Generative UI: Strong Fit
Semantic Search: Good Fit
Data Extraction: Strong Fit
Multi-Step Agent: Strong Fit

Companion Services

Official services that extend Vercel AI SDK's capabilities in production.

Vercel AI Gateway


One Key. Hundreds of Models. Zero Markup.

A unified API gateway that connects you to 20+ AI providers through a single endpoint. Handles authentication, routing, fallbacks, spend monitoring, and model discovery — with zero markup on token costs.

What This Means For Your Business

Instead of signing up for accounts with OpenAI, Anthropic, Google, and every other AI provider separately — managing different API keys, different billing dashboards, different pricing — AI Gateway gives you one account, one API key, and one dashboard for all of them. If a provider goes down, it automatically switches to a backup. If one provider is cheaper for your use case, you route there with a single setting. It's like having a single power strip that connects all your devices, instead of running separate extension cords to every outlet.

Key Benefits

Single API key for 20+ AI providers — no separate accounts needed
Zero markup on token costs — you pay provider prices directly
Automatic failover — if one provider goes down, traffic reroutes instantly
Model fallbacks — define backup models that activate automatically
Spend monitoring — see costs per user, per feature, per provider in one dashboard
Bring Your Own Key (BYOK) — use your existing provider accounts through the gateway
Dynamic model discovery — programmatically see all available models and their pricing
Built-in web search tools (Perplexity, Parallel) — no additional setup
Zero data retention option — route only to providers that don't store your data
Usage tracking with tags — attribute costs to features, teams, or users

Developer Experience

What your engineering team gets when they adopt Vercel AI SDK — language support, tooling, documentation, and community.

Primary Language: TypeScript
Documentation: Excellent
Starter Templates: 15+
Community: 75K+ GitHub stars, 1.5M+ weekly npm downloads

Supported Frameworks

React
Next.js
Vue
Nuxt
Svelte
SvelteKit
Angular
SolidJS
Node.js
Expo (Mobile)
Install
npm install ai

Package managers: npm / pnpm / yarn / bun

Dev Features
TypeScript-First
Hot-Reload Compatible

Local Development

Install via npm, works immediately in any Node.js or frontend environment. Hot-reload compatible with all major dev servers. Local model support via Ollama and LM Studio providers.

Full Type Safety · Compile-Time Validation · IDE Autocomplete

Honest Trade-Offs

No technology is perfect. Here are the real limitations of Vercel AI SDK — so you make an informed decision, not a surprised one.

TypeScript/JavaScript Only (High)

The SDK is built exclusively for the TypeScript/JavaScript ecosystem. If your backend is Python, Go, Java, or Ruby, you'd need to use a different AI framework or build a TypeScript service layer that bridges to your main application.

Best Experience Is on Vercel Platform (Medium)

While the SDK works anywhere Node.js runs, some features — like AI Gateway's automatic OIDC authentication, seamless deployment, and edge function support — work best when deployed on Vercel's platform. Self-hosted deployments require more configuration.

Rapid Release Cadence (Medium)

The SDK evolves quickly (now at v6) with occasional breaking changes between major versions. Teams need to stay current with migration guides, which adds maintenance overhead — though the team provides detailed upgrade paths.

Agent Patterns Still Maturing (Medium)

While the ToolLoopAgent and workflow patterns are powerful, the agent ecosystem (memory persistence, long-running agents, multi-agent coordination) is still evolving compared to more established Python-based agent frameworks like LangChain or CrewAI.

AI Gateway Has Token Costs (Low)

While the SDK itself is free, the AI Gateway requires Vercel credits for token usage (pass-through costs with zero markup). Teams must still budget for the underlying AI model costs, which vary significantly by provider and model.

Build with Vercel AI SDK? Let's Talk.

Our team will help you architect, build, and ship AI-powered features using Vercel AI SDK — tailored to your product and use case.