Bring CI/CD
to Vercel AI SDK
Learn how to connect Azure DevOps to Vercel AI SDK and start using 6 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the Azure DevOps MCP Server?
Connect your Azure DevOps account to any AI agent and simplify how you manage your software development lifecycle, track work items, and monitor pipelines through natural conversation.
What you can do
- Project Oversight — List all projects in your organization and retrieve detailed metadata and configurations.
- Work Item Tracking — List and query recent tasks, bugs, and user stories to manage your team's backlog.
- Git Repository Control — Query all Git repositories within a project to monitor code storage.
- Pipeline Monitoring — List CI/CD pipelines and retrieve the history of recent build executions and statuses.
- Team Coordination — List project teams to understand organizational structure and distribution.
- Operational Status — Fetch real-time metadata for projects and work items directly via AI commands.
How it works
1. Subscribe to this server
2. Enter your Azure DevOps Organization and Personal Access Token (PAT)
3. Start managing your DevOps ecosystem from Claude, Cursor, or any MCP client
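Under the hood, the PAT from step 2 authorizes requests against the Azure DevOps REST API using Basic auth with an empty username. A minimal TypeScript sketch of that header construction — the organization, project, and PAT values below are placeholders, not real credentials:

```typescript
// Sketch: how a PAT authorizes Azure DevOps REST calls (what step 2 supplies).
// The header format follows Azure DevOps' documented Basic-auth scheme;
// org, project, and the PAT string here are placeholders.
function patAuthHeader(pat: string): string {
  // Azure DevOps expects Basic auth with an empty username and the PAT as password
  const token = Buffer.from(`:${pat}`).toString("base64");
  return `Basic ${token}`;
}

// Example request shape a managed server might use on your behalf:
const org = "my-org";         // placeholder organization
const project = "my-project"; // placeholder project
const url = `https://dev.azure.com/${org}/${project}/_apis/build/builds?api-version=7.1`;
const headers = { Authorization: patAuthHeader("my-pat") };
console.log(headers.Authorization.startsWith("Basic ")); // true
```

With Vinkius, this token handling happens inside the vaulted runtime — you never build these headers yourself.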
Who is this for?
- Developers & Engineers — quickly check pipeline statuses and verify work item details via simple AI queries.
- DevOps Specialists — monitor build history and manage repositories directly from the workspace.
- Product Owners — get instant bird's-eye views of work item progress and project health via the AI assistant.
Built-in capabilities (6)
List recent builds
List CI/CD pipelines
List teams in a project
List Azure DevOps projects
List Git repositories
List recent work items
Why Vercel AI SDK?
The Vercel AI SDK gives every Azure DevOps tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 6 tools through Vinkius and stream results progressively to React, Svelte, or Vue components. The SDK works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
- TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
- Framework-agnostic: the core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime, with the same Azure DevOps integration everywhere
- Built-in streaming UI primitives let you display Azure DevOps tool results progressively in React, Svelte, or Vue components
- Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
Azure DevOps in Vercel AI SDK
Azure DevOps and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Azure DevOps to Vercel AI SDK through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Azure DevOps in Vercel AI SDK
The Azure DevOps MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 6 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Vercel AI SDK only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Azure DevOps for Vercel AI SDK
Every tool call from Vercel AI SDK to the Azure DevOps MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
Can I see if a build pipeline failed via the AI?
Yes! Use the list_builds tool and provide the Project ID. Your agent will retrieve the history of recent executions, including their final status (succeeded, failed, inProgress).
How do I list the Git repositories for a project?
Run the list_repositories query with your Project ID. The agent will return all Git repositories associated with that project in your Azure DevOps account.
Is it possible to see recent bugs or tasks assigned to a project?
Absolutely. Use the list_work_items tool. Your agent will retrieve a list of recent work items, including bugs, tasks, and stories, for the specified project.
How does the Vercel AI SDK connect to MCP servers?
Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
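A minimal sketch of that flow. Because the exact export shape depends on your installed SDK version, the client below is a self-contained stand-in stub with the same call shape — swap `stubClient` for the real `createMCPClient({ ... })` instance described in the answer above. The tool name follows the built-in capability list:

```typescript
// Stand-in stub mirroring the client shape described in the FAQ answer.
// Replace `stubClient` with the real client from your installed SDK version;
// the canned response below only illustrates the call pattern.
type ToolFn = (args: Record<string, unknown>) => Promise<unknown>;

interface McpClient {
  tools(): Promise<Record<string, ToolFn>>;
}

const stubClient: McpClient = {
  async tools() {
    return {
      // Hypothetical canned result standing in for a live Azure DevOps call
      list_projects: async () => [{ id: "p1", name: "Website" }],
    };
  },
};

async function main(client: McpClient) {
  const tools = await client.tools();             // SDK discovers available tools
  const projects = await tools.list_projects({}); // typed call in the real SDK
  console.log(projects);
}

main(stubClient);
```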
Can I use MCP tools in Edge Functions?
Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.
Does it support streaming tool results?
Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.
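The progressive-delivery pattern those primitives build on can be sketched without any framework: results arrive as an async iterable and the UI appends each chunk as it lands. The chunks here are canned placeholders; in the real SDK they would come from a streaming call like `streamText`:

```typescript
// Canned chunks standing in for a live streamed tool result.
async function* buildStatusStream(): AsyncGenerator<string> {
  yield "Pipeline ci: succeeded";
  yield "Pipeline deploy: inProgress";
}

async function render(): Promise<string[]> {
  const lines: string[] = [];
  for await (const chunk of buildStatusStream()) {
    lines.push(chunk); // a React/Svelte/Vue component would append to state here
  }
  return lines;
}

render().then((lines) => console.log(lines.length)); // prints 2
```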
createMCPClient is not a function
This usually means the package isn't installed. Install it first: npm install @ai-sdk/mcp
