Jina AI (Search Foundation & LLM Grounding) MCP Server for Vercel AI SDK
6 tools — connect in under 2 minutes
The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect Jina AI (Search Foundation & LLM Grounding) through Vinkius, and every tool is available as a typed function — ready for React Server Components, API routes, or any Node.js backend.
Vinkius supports streamable HTTP and SSE.
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token — get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });
  try {
    const tools = await mcpClient.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using Jina AI (Search Foundation & LLM Grounding), list all available capabilities.",
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main();
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.
About Jina AI (Search Foundation & LLM Grounding) MCP Server
Connect your Jina AI account to any AI agent and take full control of state-of-the-art search infrastructure and LLM grounding through natural conversation.
The Vercel AI SDK gives every Jina AI (Search Foundation & LLM Grounding) tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 6 tools through Vinkius and stream results progressively to React, Svelte, or Vue components — works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
What you can do
- LLM Grounding & Reader — Extract clean, readable Markdown context from any web URL, stripping away noise and navigation to feed high-quality data to your agent
- Semantic Web Search — Perform context-rich web searches that return structured results specifically optimized for RAG pipelines and AI analysis
- Vector Embeddings — Generate high-quality embeddings using Jina's advanced models to power semantic search and document similarity workflows
- Precision Reranking — Improve search relevance by re-ordering candidate documents based on their semantic match to a specific query block
- Zero-Shot Classification — Categorize text inputs against custom labels with confidence scores without training specific models manually
- Intelligent Segmentation — Break down long documents into semantically cohesive chunks to optimize retrieval-augmented generation (RAG)
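The embedding and reranking capabilities above compose into a simple client-side relevance loop. A minimal sketch of the math involved (cosine similarity over vectors such as those returned by generate_embeddings; the helper names and toy vectors are illustrative, not part of the server API):

```typescript
// Cosine similarity between two embedding vectors, like those returned
// by the generate_embeddings tool. Real Jina embeddings are
// higher-dimensional; these toy vectors only illustrate the math.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate vectors against a query vector by similarity,
// most similar first. rerank_documents does the server-side
// equivalent with full semantic models.
function rank(query: number[], docs: number[][]): number[] {
  return docs
    .map((d, i) => ({ i, score: cosineSimilarity(query, d) }))
    .sort((x, y) => y.score - x.score)
    .map((x) => x.i);
}
```

In practice you would let rerank_documents do this server-side; the sketch only shows why embeddings enable similarity ranking in the first place.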
The Jina AI (Search Foundation & LLM Grounding) MCP Server exposes 6 tools through Vinkius. Connect it to Vercel AI SDK in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect Jina AI (Search Foundation & LLM Grounding) to Vercel AI SDK via MCP
Follow these steps to integrate the Jina AI (Search Foundation & LLM Grounding) MCP Server with Vercel AI SDK.
Install dependencies
Run npm install @ai-sdk/mcp ai @ai-sdk/openai
Replace the token
Replace [YOUR_TOKEN_HERE] with your Vinkius token
Run the script
Save to agent.ts and run with npx tsx agent.ts
Explore tools
The SDK discovers 6 tools from Jina AI (Search Foundation & LLM Grounding) and passes them to the LLM
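Conceptually, mcpClient.tools() returns a record keyed by tool name, which the SDK passes straight into generateText. A mock sketch of that shape (descriptions taken from this server's tool list; real entries also carry input schemas and execute handlers, omitted here):

```typescript
// Mock of the shape returned by mcpClient.tools(): a plain record
// keyed by tool name. The six names match this server's tool list;
// real entries also include input schemas and execute handlers.
type ToolSet = Record<string, { description: string }>;

const tools: ToolSet = {
  classify_texts: { description: "Perform zero-shot text classification" },
  generate_embeddings: { description: "Generate vector embeddings from text" },
  read_url_content: { description: "Read and extract clean text from a URL" },
  rerank_documents: { description: "Rerank search documents against a query" },
  search_web_jina: { description: "Perform a semantic web search" },
  segment_content: { description: "Semantically segment and chunk long text content" },
};

// The SDK passes this record straight into generateText({ tools }).
console.log(Object.keys(tools).length); // 6 tools discovered
```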
Why Use Vercel AI SDK with the Jina AI (Search Foundation & LLM Grounding) MCP Server
Vercel AI SDK provides unique advantages when paired with Jina AI (Search Foundation & LLM Grounding) through the Model Context Protocol.
TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime — same Jina AI (Search Foundation & LLM Grounding) integration everywhere
Built-in streaming UI primitives let you display Jina AI (Search Foundation & LLM Grounding) tool results progressively in React, Svelte, or Vue components
Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
Jina AI (Search Foundation & LLM Grounding) + Vercel AI SDK Use Cases
Practical scenarios where Vercel AI SDK combined with the Jina AI (Search Foundation & LLM Grounding) MCP Server delivers measurable value.
AI-powered web apps: build dashboards that query Jina AI (Search Foundation & LLM Grounding) in real-time and stream results to the UI with zero loading states
API backends: create serverless endpoints that orchestrate Jina AI (Search Foundation & LLM Grounding) tools and return structured JSON responses to any frontend
Chatbots with tool use: embed Jina AI (Search Foundation & LLM Grounding) capabilities into conversational interfaces with streaming responses and tool call visibility
Internal tools: build admin panels where team members interact with Jina AI (Search Foundation & LLM Grounding) through natural language queries
Jina AI (Search Foundation & LLM Grounding) MCP Tools for Vercel AI SDK (6)
These 6 tools become available when you connect Jina AI (Search Foundation & LLM Grounding) to Vercel AI SDK via MCP:
classify_texts
Perform zero-shot text classification
generate_embeddings
Generate vector embeddings from text. The input must be a JSON array of strings.
read_url_content
Read and extract clean text from a URL. Excellent for grounding LLMs with live web content.
rerank_documents
Rerank search documents against a query
search_web_jina
Perform a semantic web search. Returns context-rich structured search results, suitable for RAG pipelines.
segment_content
Semantically segment and chunk long text content
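As the generate_embeddings description notes, its input must be a JSON array of strings. A small illustrative helper (toEmbeddingsInput is a hypothetical name, not part of the server API) that validates and serializes that payload before a tool call:

```typescript
// Build the input payload for generate_embeddings, which must be a
// JSON array of strings. Throws on anything else. toEmbeddingsInput
// is an illustrative helper, not part of the server API.
function toEmbeddingsInput(texts: unknown): string {
  if (!Array.isArray(texts) || texts.some((t) => typeof t !== "string")) {
    throw new Error("generate_embeddings input must be an array of strings");
  }
  if (texts.length === 0) {
    throw new Error("generate_embeddings input must not be empty");
  }
  return JSON.stringify(texts);
}
```

Validating on the client keeps malformed payloads from ever reaching the tool, so the model gets a clear error instead of a failed call.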
Example Prompts for Jina AI (Search Foundation & LLM Grounding) in Vercel AI SDK
Ready-to-use prompts you can give your Vercel AI SDK agent to start working with Jina AI (Search Foundation & LLM Grounding) immediately.
"Extract the main content from 'https://jina.ai/embeddings' as Markdown"
"Search the web for the latest updates on 'DeepSeek-V3 architecture'"
"Segment this long text into semantically cohesive chunks: [text content]"
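The last prompt maps to segment_content. As a rough illustration of the chunking contract, here is a naive client-side stand-in that packs sentences into fixed-size chunks; the real tool segments semantically, so this is only a sketch:

```typescript
// Naive stand-in for segment_content: split text on sentence
// boundaries, then pack sentences into chunks of at most maxLen
// characters. The real tool chunks by semantic cohesion, not length.
function naiveSegment(text: string, maxLen = 200): string[] {
  const sentences = text.match(/[^.!?]+[.!?]*/g) ?? [text];
  const chunks: string[] = [];
  let current = "";
  for (const s of sentences) {
    if (current && (current + s).length > maxLen) {
      chunks.push(current.trim());
      current = "";
    }
    current += s;
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}
```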
Troubleshooting Jina AI (Search Foundation & LLM Grounding) MCP Server with Vercel AI SDK
Common issues when connecting Jina AI (Search Foundation & LLM Grounding) to Vercel AI SDK through Vinkius, and how to resolve them.
createMCPClient is not a function
Run npm install @ai-sdk/mcp to install the missing package.
Jina AI (Search Foundation & LLM Grounding) + Vercel AI SDK FAQ
Common questions about integrating Jina AI (Search Foundation & LLM Grounding) MCP Server with Vercel AI SDK.
How does the Vercel AI SDK connect to MCP servers?
Call createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
Can I use MCP tools in Edge Functions?
Yes. The AI SDK is edge-compatible and runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes.
Does it support streaming tool results?
Yes. The SDK provides useChat and streamText, which handle tool calls and display results progressively in the UI.
Connect Jina AI (Search Foundation & LLM Grounding) with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect Jina AI (Search Foundation & LLM Grounding) to Vercel AI SDK
Get your token, paste the configuration, and start using 6 tools in under 2 minutes. No API key management needed.
