Linkup (AI Search & RAG) MCP Server for Vercel AI SDK — 2 tools, connect in under 2 minutes
The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect Linkup (AI Search & RAG) through Vinkius and every tool is available as a typed function, ready for React Server Components, API routes, or any Node.js backend.
Vinkius supports streamable HTTP and SSE.
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token: get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });
  try {
    const tools = await mcpClient.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using Linkup (AI Search & RAG), list all available capabilities.",
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main();
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page.
About Linkup (AI Search & RAG) MCP Server
Connect your Linkup account to any AI agent and take full control of real-time web intelligence and content retrieval for RAG pipelines through natural conversation.
The Vercel AI SDK gives every Linkup (AI Search & RAG) tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 2 tools through Vinkius and stream results progressively to React, Svelte, or Vue components; it works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
What you can do
- Semantic Web Search — Execute context-rich queries that return high-relevancy results specifically optimized for Large Language Models directly from your agent
- Deep Content Retrieval — Extract clean, readable text from any web URL, stripping away noise and navigation to feed high-quality grounding data to your AI
- RAG-Ready Payloads — Retrieve structured search results including titles, snippets, and source URLs designed for seamless integration into vector stores
- Precision Extraction — Target specific URLs for content parsing, ensuring your agent has the exact technical context or documentation required for its task
- Real-time Intelligence — Access the latest facts and data from across the internet to ground your agent's answers in up-to-date reality
- Search Breadth — Switch between standard and deep search modes to balance between rapid fact-finding and comprehensive research across the web
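The "RAG-Ready Payloads" capability above suggests a common pattern: map search results into documents you can embed and upsert into a vector store. Here is a minimal sketch; the result shape (`title`, `snippet`, `url`) follows the description above but is an assumption, not a documented Linkup schema, and `toRagDocuments` is a hypothetical helper.

```typescript
// Assumed result shape based on the "RAG-Ready Payloads" description.
interface SearchResult {
  title: string;
  snippet: string;
  url: string;
}

interface RagDocument {
  id: string;
  text: string;
  metadata: { source: string; title: string };
}

// Convert raw search results into documents ready for embedding and upsert.
function toRagDocuments(results: SearchResult[]): RagDocument[] {
  return results.map((r, i) => ({
    id: `linkup-${i}`,
    text: `${r.title}\n\n${r.snippet}`,
    metadata: { source: r.url, title: r.title },
  }));
}

const docs = toRagDocuments([
  { title: "MCP Spec", snippet: "Model Context Protocol overview.", url: "https://example.com/mcp" },
]);
console.log(docs[0].metadata.source); // https://example.com/mcp
```

In a real pipeline you would feed each `RagDocument.text` to your embedding model and store the vector alongside `metadata.source` for citation.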
The Linkup (AI Search & RAG) MCP Server exposes 2 tools through Vinkius. Connect it to the Vercel AI SDK in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect Linkup (AI Search & RAG) to Vercel AI SDK via MCP
Follow these steps to integrate the Linkup (AI Search & RAG) MCP Server with Vercel AI SDK.
Install dependencies
Run npm install @ai-sdk/mcp ai @ai-sdk/openai
Replace the token
Replace [YOUR_TOKEN_HERE] with your Vinkius token
Run the script
Save to agent.ts and run with npx tsx agent.ts
Explore tools
The SDK discovers 2 tools from Linkup (AI Search & RAG) and passes them to the LLM
Why Use Vercel AI SDK with the Linkup (AI Search & RAG) MCP Server
Vercel AI SDK provides unique advantages when paired with Linkup (AI Search & RAG) through the Model Context Protocol.
TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime; the same Linkup (AI Search & RAG) integration works everywhere
Built-in streaming UI primitives let you display Linkup (AI Search & RAG) tool results progressively in React, Svelte, or Vue components
Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
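To show what "streaming UI primitives" means in practice, here is a self-contained sketch of progressive rendering. The async generator stands in for the `textStream` that `streamText` returns; in a real app you would iterate `result.textStream` the same way, and `onDelta` would be a React state setter.

```typescript
// Stand-in for streamText's textStream: yields text deltas one at a time.
async function* fakeTextStream(): AsyncGenerator<string> {
  yield "Linkup ";
  yield "found ";
  yield "3 results.";
}

// Accumulate deltas and notify the UI after each one, so the user sees
// partial output immediately instead of waiting for the full response.
async function renderProgressively(
  stream: AsyncIterable<string>,
  onDelta: (soFar: string) => void
): Promise<string> {
  let text = "";
  for await (const delta of stream) {
    text += delta;
    onDelta(text); // e.g. setState(text) in a React component
  }
  return text;
}

renderProgressively(fakeTextStream(), (s) => console.log(s));
```

The same loop works unchanged over `streamText(...).textStream`, which is why the SDK's streaming primitives plug directly into component state.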
Linkup (AI Search & RAG) + Vercel AI SDK Use Cases
Practical scenarios where Vercel AI SDK combined with the Linkup (AI Search & RAG) MCP Server delivers measurable value.
AI-powered web apps: build dashboards that query Linkup (AI Search & RAG) in real-time and stream results to the UI with zero loading states
API backends: create serverless endpoints that orchestrate Linkup (AI Search & RAG) tools and return structured JSON responses to any frontend
Chatbots with tool use: embed Linkup (AI Search & RAG) capabilities into conversational interfaces with streaming responses and tool call visibility
Internal tools: build admin panels where team members interact with Linkup (AI Search & RAG) through natural language queries
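For the API-backend use case above, the endpoint's job after the model call is to shape the text and tool results into structured JSON. This is an illustrative sketch: `buildApiResponse` is a hypothetical helper, and the `toolResults` shape (`toolName`, `result`) is an assumption modeled on the AI SDK's tool-result parts.

```typescript
// Assumed shape of one tool invocation's outcome.
interface ToolResult {
  toolName: string;
  result: unknown;
}

// Shape a serverless endpoint's JSON payload: final answer, the raw
// search_web results as sources, and a list of tools the agent used.
function buildApiResponse(text: string, toolResults: ToolResult[]) {
  return {
    answer: text,
    sources: toolResults
      .filter((t) => t.toolName === "search_web")
      .map((t) => t.result),
    toolsUsed: toolResults.map((t) => t.toolName),
  };
}

// In a route handler you would call generateText, then:
//   return Response.json(buildApiResponse(text, toolResults));
const payload = buildApiResponse("Paris is the capital of France.", [
  { toolName: "search_web", result: { url: "https://example.com" } },
]);
console.log(payload.toolsUsed); // [ 'search_web' ]
```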
Linkup (AI Search & RAG) MCP Tools for Vercel AI SDK (2)
These 2 tools become available when you connect Linkup (AI Search & RAG) to Vercel AI SDK via MCP:
fetch_url
Fetch and extract clean content from any specific URL using the Linkup Platform. Bypasses advanced bot protections and executes complex SPA JavaScript automatically.
search_web
Perform a real-time web search and extract deep answers using the Linkup Platform. Choose "fast" mode for basic factual requests and "deep" for thorough research.
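If your application chooses the search mode programmatically rather than leaving it to the model, a small dispatcher can do it. "fast" and "deep" are the modes described above; the keyword heuristic and the `pickSearchMode` helper are purely illustrative assumptions.

```typescript
// Hypothetical helper: pick search_web's mode from the query text.
// The keyword list is an illustrative heuristic, not a Linkup feature.
function pickSearchMode(query: string): "fast" | "deep" {
  const researchHints = ["compare", "research", "best practices", "analysis"];
  const q = query.toLowerCase();
  return researchHints.some((h) => q.includes(h)) ? "deep" : "fast";
}

console.log(pickSearchMode("capital of France")); // fast
console.log(pickSearchMode("AI agent security best practices 2024")); // deep
```

The chosen mode would then be passed along in the tool arguments when invoking search_web.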
Example Prompts for Linkup (AI Search & RAG) in Vercel AI SDK
Ready-to-use prompts you can give your Vercel AI SDK agent to start working with Linkup (AI Search & RAG) immediately.
"Search for the latest NVIDIA earnings report summary"
"Extract the technical specifications from this documentation URL: [url]"
"Deep search for 'AI agent security best practices 2024'"
Troubleshooting Linkup (AI Search & RAG) MCP Server with Vercel AI SDK
Common issues when connecting Linkup (AI Search & RAG) to Vercel AI SDK through Vinkius, and how to resolve them.
createMCPClient is not a function
Run npm install @ai-sdk/mcp to install the MCP client package.
Linkup (AI Search & RAG) + Vercel AI SDK FAQ
Common questions about integrating Linkup (AI Search & RAG) MCP Server with Vercel AI SDK.
How does the Vercel AI SDK connect to MCP servers?
Call createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
Can I use MCP tools in Edge Functions?
Yes. The AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes.
Does it support streaming tool results?
Yes. The SDK provides useChat and streamText, which handle tool calls and display results progressively in the UI.
Connect Linkup (AI Search & RAG) with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect Linkup (AI Search & RAG) to Vercel AI SDK
Get your token, paste the configuration, and start using 2 tools in under 2 minutes. No API key management needed.
