Bring Transcription to Vercel AI SDK
Learn how to connect Speechnotes to Vercel AI SDK and start using 12 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the Speechnotes MCP Server?
Connect your Speechnotes account to any AI agent to automate your professional audio transcription and speech-to-text orchestration. Speechnotes provides a high-accuracy AI engine for converting audio files into text, and this integration allows you to initiate transcription jobs from URLs, monitor progress, and export results through natural conversation.
What you can do
- Transcription Orchestration — Initiate new transcription jobs from audio URLs and retrieve real-time status updates programmatically.
- Job & History Lifecycle Management — List all past transcription jobs and retrieve detailed metadata, including timestamps and speaker counts directly from the AI interface.
- Export & Format Control — Retrieve transcribed text in multiple formats (TXT, DOCX, SRT) and manage file exports via simple AI commands.
- Language & Model Intelligence — Access available transcription languages and AI models to ensure your results are optimized for your specific content.
- Operational Monitoring — Check your account credits, monitor usage statistics, and manage webhooks to ensure your transcription pipeline is always synchronized.
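A typical orchestration loop over these capabilities, starting a job and polling its progress until completion, can be sketched with mocked calls. The status checker, status values, and function names below are illustrative assumptions, not the Speechnotes API.

```typescript
// Mocked job-status lookup: reports "processing" twice, then "done".
// Stands in for the job-progress capability described above.
function makeStatusChecker(): (jobId: string) => Promise<string> {
  let calls = 0;
  return async (_jobId: string) => (++calls < 3 ? "processing" : "done");
}

// Poll until the job completes, with a bounded number of attempts.
async function waitForJob(
  checkStatus: (jobId: string) => Promise<string>,
  jobId: string,
  maxAttempts = 10,
): Promise<string> {
  for (let i = 0; i < maxAttempts; i++) {
    const status = await checkStatus(jobId);
    if (status === "done") return status;
    // Real code would sleep between polls, or rely on webhooks as the
    // "Operational Monitoring" capability suggests.
  }
  throw new Error(`job ${jobId} did not finish in ${maxAttempts} polls`);
}

waitForJob(makeStatusChecker(), "job_123").then((s) => console.log(s)); // done
```

Bounding the attempts keeps a misbehaving job from spinning forever; webhooks avoid polling entirely when the pipeline supports them.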
How it works
1. Subscribe to this server
2. Enter your Speechnotes API Key and API Secret from your developer dashboard
3. Start generating high-accuracy transcriptions from Claude, Cursor, or any MCP-compatible client
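The three steps above can be sketched in TypeScript. Note this is a self-contained mock, not the real SDK: the client shape and the `transcribe_remote_file` tool name follow this page's description, and the client is stubbed so the flow runs without a live Speechnotes subscription.

```typescript
// Minimal stand-in for the MCP client shape this page describes. In a real
// project you would obtain the client from the AI SDK (per this doc:
// createMCPClient from @ai-sdk/mcp) instead of stubbing it.
type ToolFn = (args: Record<string, unknown>) => Promise<Record<string, unknown>>;

interface MCPClient {
  tools(): Promise<Record<string, ToolFn>>;
}

// Stubbed client simulating a subscribed Speechnotes server.
function createStubClient(_apiKey: string, _apiSecret: string): MCPClient {
  return {
    async tools() {
      return {
        // Hypothetical tool name matching "Transcribe remote file" above.
        transcribe_remote_file: async (args) => ({
          jobId: "job_123",
          status: "queued",
          sourceUrl: args.url,
        }),
      };
    },
  };
}

async function main() {
  // Step 2: credentials from your Speechnotes developer dashboard.
  const client = createStubClient("MY_API_KEY", "MY_API_SECRET");

  // Step 3: discover tools and start a transcription job.
  const tools = await client.tools();
  const job = await tools.transcribe_remote_file({
    url: "https://example.com/interview.mp3",
    language: "en-US",
  });
  console.log(job.jobId, job.status); // job_123 queued
}

main();
```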
Who is this for?
- Journalists & Content Creators — quickly transcribe interviews and podcasts without switching apps.
- Researchers — automate the retrieval of meeting transcriptions and monitor processing progress via natural conversation.
- Operations Teams — streamline the export of transcribed text and monitor account credits directly within the chat.
Built-in capabilities (12)
Sign payload
Check account balance
Export result format
Check job progress
Check usage logs
Get delivery endpoints
Get language codes
List past jobs
Get engine models
Delete job record
Check connection
Transcribe remote file
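The "Sign payload" capability implies request signing with the API secret. The scheme below (HMAC-SHA256 over the raw JSON body, hex-encoded) is a common pattern shown purely as an illustrative assumption; Speechnotes' actual signing algorithm, inputs, and canonicalization may differ.

```typescript
import { createHmac } from "node:crypto";

// Illustrative payload signer. ASSUMPTION: HMAC-SHA256 over the JSON body,
// hex-encoded; the real Speechnotes scheme may use different inputs.
function signPayload(apiSecret: string, payload: object): string {
  const body = JSON.stringify(payload);
  return createHmac("sha256", apiSecret).update(body).digest("hex");
}

// A signature is deterministic for the same secret and payload, so the
// server can recompute and compare it to authenticate the request.
const sig = signPayload("MY_API_SECRET", { url: "https://example.com/a.mp3" });
console.log(sig.length); // 64 hex characters for SHA-256
```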
Why Vercel AI SDK?
The Vercel AI SDK gives every Speechnotes tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 12 tools through Vinkius and stream results progressively to React, Svelte, or Vue components. It works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
- TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
- Framework-agnostic: the core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime, so the same Speechnotes integration runs everywhere
- Built-in streaming UI primitives let you display Speechnotes tool results progressively in React, Svelte, or Vue components
- Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
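Progressive display of tool results can be sketched with a plain async iterable, independent of any framework. The stream below stands in for the partial results the SDK's streaming primitives would deliver; the chunking and function names are invented for illustration.

```typescript
// Simulated transcript stream: stands in for partial tool results that
// streaming UI primitives would deliver to a component.
async function* transcriptChunks(): AsyncGenerator<string> {
  const chunks = ["Welcome ", "to the ", "show."];
  for (const chunk of chunks) {
    yield chunk; // each chunk can be rendered as soon as it arrives
  }
}

async function renderProgressively(): Promise<string> {
  let shown = "";
  for await (const chunk of transcriptChunks()) {
    shown += chunk;
    // A React/Svelte/Vue component would re-render here with `shown`.
  }
  return shown;
}

renderProgressively().then((text) => console.log(text)); // Welcome to the show.
```

The same consumer loop works whether the chunks come from a mock, an edge runtime, or a live transcription job.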
Speechnotes in Vercel AI SDK
Speechnotes and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Speechnotes to Vercel AI SDK through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Speechnotes in Vercel AI SDK
The Speechnotes MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 12 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Vercel AI SDK only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.
How Vinkius secures Speechnotes for Vercel AI SDK
Every tool call from Vercel AI SDK to the Speechnotes MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
How do I find my Speechnotes API credentials?
Log in to your Speechnotes account and navigate to the API or developer section in your dashboard to find your unique API Key and API Secret.
How does the Vercel AI SDK connect to MCP servers?
Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
Can I use MCP tools in Edge Functions?
Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.
Does it support streaming tool results?
Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.
Why am I seeing "createMCPClient is not a function"?
The MCP client package is missing. Install it with: npm install @ai-sdk/mcp
