Find Your People. Build Your Future.
DEVSA is the official San Antonio community partner for Zero to Agent—a global 10-day initiative designed to take you from idea to deployed agent using the full power of the Vercel AI SDK and AI Gateway.
Every track runs on the same core stack
AI SDK
The unified TypeScript library for streaming LLM calls, tool use, and multi-step agent reasoning — works with any model provider.
AI Gateway
One API key for OpenAI, Anthropic, Google, and more. Built-in fallbacks, usage tracking, and zero extra config.
+ Vercel Deploy
Build the UI in v0, deploy to Vercel in one click. Every git push creates a shareable preview — no infra to manage.
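To make the stack concrete, here is a minimal sketch of a single call through the AI SDK and AI Gateway. It assumes an AI_GATEWAY_API_KEY (or a Vercel deployment) is configured, and the model ID is a placeholder: check the Gateway model catalog for current names.

```typescript
// Minimal sketch: one AI SDK call routed through the AI Gateway.
// With the Gateway configured, a plain "provider/model" string is all
// you need; swapping providers means editing one string, not your code.
import { generateText } from "ai";

export async function summarize(input: string): Promise<string> {
  const { text } = await generateText({
    model: "anthropic/claude-sonnet-4", // placeholder ID; any Gateway-listed model works
    prompt: `Summarize this in one sentence:\n${input}`,
  });
  return text;
}
```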
Each track layers a specialist tool on top — MCP for real-world data connections, the Workflow SDK for durable long-running agents, or the Chat SDK to deploy to Slack, Discord, WhatsApp, and more.
Choose Your Track
Three specialized tracks. Pick the one that fits your vision — each is designed to get a working agent shipped in a 2-hour session.
Vercel Workflow (WDK)
Build a "durable" agent that survives crashes, resumes after deploys, and can sleep for days — then wake up exactly when needed. These aren't chatbots. They're persistent workflows that run autonomously over hours, days, or months.
- Stateful Slack bot — listens for @mentions, remembers conversation context across restarts, and escalates unanswered threads to a human reviewer after a timeout. Clone the working guide and swap in your own logic.
- Multi-step booking agent — the user requests a flight (or appointment, or reservation), the agent searches options, sleeps while awaiting confirmation, then resumes to complete the booking. Based on the vercel/workflow-examples flight-booking-app.
- Human-in-the-loop approvals — the agent scans transactions or requests, flags anything over a threshold, then suspends and waits indefinitely for a human approval before writing the result. Uses WDK's native suspend/resume primitives.
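The "sleep for days, resume exactly where it left off" behavior comes from checkpointing each step's result. Here is a toy illustration of that idea in plain TypeScript. This is not the WDK API (which uses the "use workflow" and "use step" directives plus managed storage); it only shows why a re-run after a crash skips completed work.

```typescript
// Toy durability model: each step's result is checkpointed, so a re-run
// replays cached results instead of redoing the work.
type Checkpoints = Map<string, unknown>;

async function step<T>(store: Checkpoints, name: string, fn: () => Promise<T>): Promise<T> {
  if (store.has(name)) return store.get(name) as T; // already done: replay from checkpoint
  const result = await fn();
  store.set(name, result); // persist before moving on (WDK does this in managed storage)
  return result;
}

// A three-step booking flow: with the same checkpoint store, a
// "restarted" run resumes instantly past the steps it already finished.
async function bookingWorkflow(store: Checkpoints, log: string[]): Promise<string> {
  const flights = await step(store, "search", async () => { log.push("searched"); return ["SAT->SFO"]; });
  const choice = await step(store, "confirm", async () => { log.push("confirmed"); return flights[0]; });
  return step(store, "book", async () => { log.push("booked"); return "ticket:" + choice; });
}
```

Run `bookingWorkflow` twice with the same store and the second call returns the same ticket without re-executing any step — the same reason a WDK workflow can survive a deploy mid-flight.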
1. Scaffold a Next.js app: npx create-next-app@latest --no-src-dir
2. Add WDK: npx workflow@latest
3. Wrap your Next.js config: export default withWorkflow(nextConfig)
4. Create workflow functions with "use workflow" and steps with "use step"
5. Use DurableAgent from @workflow/ai/agent for AI agent workflows — LLM calls become retryable steps automatically
6. Get a Gateway API key from Vercel AI Gateway — one key for OpenAI, Anthropic, Google, and more
7. Inspect every step, input, output, and sleep in the Vercel dashboard under Observability → Workflows — no extra setup needed
8. Deploy to Vercel — queues, persistence, and routing are auto-provisioned
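Steps 3 through 5 fit together roughly like this. Treat it as a sketch, not working code: the withWorkflow import path, DurableAgent options, and method names are assumptions based on the step names above; verify against the Workflow DevKit docs.

```typescript
// next.config.ts (step 3) -- import path assumed; `npx workflow@latest` wires this up
import { withWorkflow } from "workflow/next";
const nextConfig = {};
export default withWorkflow(nextConfig);

// app/workflows/triage.ts (steps 4-5) -- a durable workflow function
import { DurableAgent } from "@workflow/ai/agent";

export async function triageMessage(message: string) {
  "use workflow"; // marks this function as durable: it survives crashes and deploys

  // Each LLM call the agent makes becomes a retryable, checkpointed step.
  const agent = new DurableAgent({
    model: "anthropic/claude-sonnet-4", // any Gateway model string (assumption)
  });
  return await agent.generateText({ prompt: message }); // method name assumed
}
```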
The UI Breakthrough (v0 + MCP)
Use v0 by Vercel to spin up a professional interface connected to your real-world data via Model Context Protocol. Connect verified MCP servers for GitHub, Vercel, or Notion — or explore Figma's MCP server, which exposes your designs to AI agents (v0 can connect via MCP adapter).
v0 has its own MCP server at mcp.v0.dev. Add it to Cursor, Claude Desktop, or VS Code and trigger v0 generations without leaving your IDE — no tab switching, no copy-paste. Use it as your build companion for this track.
Set it up →
- Sprint reporter — GitHub MCP reads your open PRs and closed issues from the past week; Notion MCP writes a structured release note or sprint summary directly into your workspace. v0 scaffolds a one-click 'Generate Report' UI with a preview pane. Two MCP servers, one interface, fully automated docs.
- Figma-to-code explorer — point the Figma MCP server at a design file and let v0 read your components, tokens, and frames to generate React code. Use the MCP adapter in v0.
1. Open v0.app and describe the app you want to build — be specific about what data it needs and what actions it should take
2. Iterate on the UI and logic with natural-language prompts until the interface looks right
3. Add AI features with the AI SDK — v0 scaffolds the integration for you, no manual wiring
4. Connect an MCP server following v0's MCP adapter docs at v0.app/docs/api/platform/adapters/mcp-server
5. Deploy to Vercel with one click — MCP connections persist as environment variables, no infra to manage
6. To build your own custom MCP server, follow the Vercel deployment guide at vercel.com/docs/mcp/deploy-mcp-servers-to-vercel
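For step 6, Vercel's mcp-handler package exposes an MCP server as a Next.js route handler. Here is a minimal sketch; the tool name and schema are illustrative, so confirm the current API shape against the deployment guide above.

```typescript
// app/api/[transport]/route.ts -- a one-tool MCP server deployable to Vercel
import { createMcpHandler } from "mcp-handler";
import { z } from "zod";

const handler = createMcpHandler((server) => {
  // Illustrative tool: connected agents can discover and call it by name.
  server.tool(
    "get_event_info",
    "Returns details about the Zero to Agent event",
    { question: z.string() },
    async ({ question }) => ({
      content: [{ type: "text", text: `You asked: ${question}` }],
    })
  );
});

export { handler as GET, handler as POST };
```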
Chat SDK Agents
Write your bot logic once with the Chat SDK and deploy to 15+ platforms — Slack, Discord, Teams, WhatsApp, Telegram, iMessage, GitHub, Linear, and more — via swappable adapters. The SDK handles event routing, streaming, JSX cards, and distributed state. Pair with the AI SDK for LLM reasoning and the AI Gateway for zero-config access to any model provider.
- Slack Q&A bot grounded in your team docs — answers questions in-thread, cites the source page, and escalates to a human after a timeout. Uses bot.onNewMention and thread.post() with an AI SDK text stream.
- GitHub PR summarizer — posts a plain-English summary of the diff as a comment the moment a PR is opened. Wire up the GitHub adapter, call the AI Gateway with the diff as context, post back via thread.post().
- WhatsApp vision support bot — the user sends a photo (screenshot, receipt, product image); the WhatsApp adapter downloads the media, pipes it to a vision model via AI Gateway, and replies with interactive button cards ('Save', 'Create ticket', 'Dismiss'). Uses @chat-adapter/whatsapp media downloads and the Card component.
1. Install the Chat skill for your coding agent: npx skills add vercel/chat — gives Claude Code, Cursor, or your agent full SDK context and best practices
2. Install the SDK and adapters: npm install chat @chat-adapter/slack @chat-adapter/discord
3. Pick a state adapter: @chat-adapter/state-redis or @chat-adapter/state-postgres for production, state-memory for dev
4. Create a Chat instance with your adapters and state
5. Wire up handlers: bot.onNewMention, bot.onSubscribedMessage, bot.onReaction
6. Add AI — thread.post() accepts AI SDK text streams natively
7. Use the AI Gateway for zero-config access to any model provider
8. Deploy to Vercel — adapters auto-detect credentials from env vars
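Steps 2 through 6 compose into a single file along these lines. Export names, constructor options, and handler signatures here are assumptions inferred from the step list; run the Chat skill install above and check the SDK docs for the real API.

```typescript
// Sketch: one mention handler, AI SDK stream posted straight into the thread.
import { Chat } from "chat";                        // class name assumed (step 4)
import { SlackAdapter } from "@chat-adapter/slack"; // adapter export name assumed
import { streamText } from "ai";

const bot = new Chat({
  adapters: [new SlackAdapter()], // credentials auto-detected from env vars (step 8)
  // state: state-memory for dev; Redis/Postgres adapters in production (step 3)
});

bot.onNewMention(async (thread, message) => {
  // thread.post() accepts AI SDK text streams natively (step 6)
  const result = streamText({
    model: "openai/gpt-4o-mini", // any Gateway model string (step 7)
    prompt: message.text,        // property name assumed
  });
  await thread.post(result.textStream);
});
```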
Pro Tips
Use the AI Gateway
Skip managing individual API keys. One endpoint for OpenAI, Anthropic, Google, and more with built-in fallbacks.
Add the Vercel Plugin + Skills
If you're using a coding agent (Claude Code, Cursor, etc.), add the Vercel plugin skill to your project for best-practice guidance.
Feed llms.txt to your LLM
The full AI SDK docs are available as one Markdown file at ai-sdk.dev/llms.txt. Feed it to your LLM for accurate, up-to-date code generation.
Start from a template
Clone the Chatbot template, Knowledge Agent template, or a WDK example and customize.
Use AI SDK DevTools
AI SDK 6 has built-in DevTools for debugging multi-step agent flows. Full visibility into LLM calls, tool use, and trajectories.
Deploy early, iterate fast
Push to Vercel after your first working feature. Every git push creates a preview deployment you can share with teammates and judges.
Why You Should Be There
$6K+
Global Prize Pool
Compete for a piece of the $6,000+ global prize pool, including Vercel Platform and Pro credits.
Zero to Agent
Limited Edition Swag
Score limited edition Zero to Agent swag available only for this event.
Invitation to the 2nd Annual AI Agent Showcase
Want to showcase the agent you built? Sign up for the 2nd Annual AI Agent Showcase, powered by the Geeks&& family—a science fair-style event where you can present your agent to the San Antonio tech community on May 2nd.
Sign up for the Showcase
Reference
Start Here
Open these three tabs right now — they cover 90% of what you need tonight.
Turn “I have an idea” into “I just shipped it.”
Come join the active builders in San Antonio and find your breakthrough shipping real AI agents with Vercel.
New to the platform? Sign up here for an extra $5 in credits.
Questions? Hit up the Vercel Community