Pinecone MCP Server for Vercel AI SDK — 7 tools, connected in under 2 minutes
The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect Pinecone through Vinkius and every tool becomes available as a typed function — ready for React Server Components, API routes, or any Node.js backend.
Vinkius supports streamable HTTP and SSE.
```typescript
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token — get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });
  try {
    const tools = await mcpClient.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using Pinecone, list all available capabilities.",
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main();
```
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.
About Pinecone MCP Server
Connect your Pinecone vector database environment straight into your AI agent's logic. Give your preferred large language model the keys to fetch, query, and modify vector spaces through natural-language context without leaving the chat interface.
The Vercel AI SDK gives every Pinecone tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 7 tools through Vinkius and stream results progressively to React, Svelte, or Vue components — works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
What you can do
- Index Hierarchy — list your indexes instantly with `list_indexes` and fetch detailed topology parameters with `describe_index`.
- Semantic Harvesting — pass raw array values for fast similarity retrieval with `query_vectors`, or pinpoint specific embeddings by ID with `fetch_vectors`.
- Space Archiving — review snapshot collections with `list_collections` and perform precise cleanups with `delete_vectors`.
- Performance Auditing — ask the model to pull real-time health checks with `get_index_stats` to reveal vector capacity across pods.
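For a concrete sense of what the model sends when it issues one of these calls, here is a hypothetical `query_vectors` argument object — the field names (`vector`, `topK`, `namespace`, `includeMetadata`) mirror Pinecone's query conventions but are assumptions, not the server's published schema:

```typescript
// Hypothetical arguments for a query_vectors tool call (field names are assumptions).
const queryVectorsArgs = {
  vector: [0.12, -0.03, 0.87], // raw embedding values, passed as a pure array
  topK: 5,                     // number of nearest neighbours to return
  namespace: "documents",      // optional namespace to scope the search
  includeMetadata: true,       // return stored metadata alongside similarity scores
};
```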
The Pinecone MCP Server exposes 7 tools through Vinkius. Connect it to the Vercel AI SDK in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect Pinecone to Vercel AI SDK via MCP
Follow these steps to integrate the Pinecone MCP Server with Vercel AI SDK.
1. Install dependencies — run `npm install @ai-sdk/mcp ai @ai-sdk/openai`.
2. Replace the token — swap `[YOUR_TOKEN_HERE]` for your Vinkius token.
3. Run the script — save it as `agent.ts` and run `npx tsx agent.ts`.
4. Explore tools — the SDK discovers 7 tools from Pinecone and passes them to the LLM.
Why Use Vercel AI SDK with the Pinecone MCP Server
Vercel AI SDK provides unique advantages when paired with Pinecone through the Model Context Protocol.
TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime — same Pinecone integration everywhere
Built-in streaming UI primitives let you display Pinecone tool results progressively in React, Svelte, or Vue components
Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
Pinecone + Vercel AI SDK Use Cases
Practical scenarios where Vercel AI SDK combined with the Pinecone MCP Server delivers measurable value.
AI-powered web apps: build dashboards that query Pinecone in real-time and stream results to the UI with zero loading states
API backends: create serverless endpoints that orchestrate Pinecone tools and return structured JSON responses to any frontend
Chatbots with tool use: embed Pinecone capabilities into conversational interfaces with streaming responses and tool call visibility
Internal tools: build admin panels where team members interact with Pinecone through natural language queries
Pinecone MCP Tools for Vercel AI SDK (7)
These 7 tools become available when you connect Pinecone to Vercel AI SDK via MCP:
delete_vectors
Delete vectors from an index
describe_index
Get configuration details for an index
fetch_vectors
Fetch specific vectors by their IDs
get_index_stats
Get usage statistics for an index
list_collections
List all index collections
list_indexes
List all Pinecone indexes
query_vectors
Search for similar vectors; returns the closest matches and their metadata
Example Prompts for Pinecone in Vercel AI SDK
Ready-to-use prompts you can give your Vercel AI SDK agent to start working with Pinecone immediately.
"Check the vector count stats for the index named `document-embeddings`."
"Delete all vectors belonging to the user ID 'auth-abc123' namespace."
"List all existing collections created in my Pinecone environment."
Troubleshooting Pinecone MCP Server with Vercel AI SDK
Common issues when connecting Pinecone to Vercel AI SDK through Vinkius, and how to resolve them.
createMCPClient is not a function
Run `npm install @ai-sdk/mcp` and make sure the import path matches the installed package.
Pinecone + Vercel AI SDK FAQ
Common questions about integrating Pinecone MCP Server with Vercel AI SDK.
How does the Vercel AI SDK connect to MCP servers?
Call `createMCPClient` from `@ai-sdk/mcp` and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
Can I use MCP tools in Edge Functions?
Yes. The AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes, so MCP tools work there as well.
Does it support streaming tool results?
Yes. The SDK ships streaming primitives such as `useChat` and `streamText` that handle tool calls and display results progressively in the UI.
Connect Pinecone with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect Pinecone to Vercel AI SDK
Get your token, paste the configuration, and start using 7 tools in under 2 minutes. No API key management needed.
