Jina AI (Search Foundation & LLM Grounding) MCP Server for VS Code Copilot
6 tools. Connect in under 2 minutes.
GitHub Copilot in VS Code is the most widely adopted AI coding assistant, embedded directly into the world's most popular code editor. With MCP support in Agent mode, Copilot can access external data and APIs to generate context-aware code grounded in real-time information.
Vinkius supports streamable HTTP and SSE.
Vinkius Desktop App
The modern way to manage MCP Servers — no config files, no terminal commands. Install Jina AI (Search Foundation & LLM Grounding) and 2,500+ MCP Servers from a single visual interface.




{
  "mcpServers": {
    "jina-ai-search-foundation-llm-grounding": {
      "url": "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp"
    }
  }
}
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.
About Jina AI (Search Foundation & LLM Grounding) MCP Server
Connect your Jina AI account to any AI agent and take full control of state-of-the-art search infrastructure and LLM grounding through natural conversation.
GitHub Copilot Agent mode brings Jina AI (Search Foundation & LLM Grounding) data directly into your VS Code workflow. With a project-scoped config, the entire team shares access to 6 tools — Copilot queries live data, generates typed code, and writes tests from actual API responses, all without leaving the editor.
What you can do
- LLM Grounding & Reader — Extract clean, readable Markdown context from any web URL, stripping away noise and navigation to feed high-quality data to your agent
- Semantic Web Search — Perform context-rich web searches that return structured results specifically optimized for RAG pipelines and AI analysis
- Vector Embeddings — Generate high-quality embeddings using Jina's advanced models to power semantic search and document similarity workflows
- Precision Reranking — Improve search relevance by re-ordering candidate documents based on their semantic match to a specific query block
- Zero-Shot Classification — Categorize text inputs against custom labels with confidence scores without training specific models manually
- Intelligent Segmentation — Break down long documents into semantically cohesive chunks to optimize retrieval-augmented generation (RAG)
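Once embeddings are in hand (for example, from the generate_embeddings tool), similarity and reranking reduce to vector math. A minimal sketch with mock vectors — the vectors and document labels below are illustrative, not real model output:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Mock embeddings; real ones would come from generate_embeddings.
query = [0.9, 0.1, 0.0]
docs = {
    "doc_a": [0.8, 0.2, 0.1],
    "doc_b": [0.1, 0.9, 0.3],
}

# Rank documents by similarity to the query, highest first --
# conceptually what rerank_documents does with a learned model.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked)  # doc_a is closer to the query than doc_b
```

In practice the rerank_documents tool uses a cross-encoder rather than raw cosine over embeddings, so treat this as intuition for the workflow, not a reimplementation of Jina's models.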
The Jina AI (Search Foundation & LLM Grounding) MCP Server exposes 6 tools through the Vinkius platform. Connect it to VS Code Copilot in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect Jina AI (Search Foundation & LLM Grounding) to VS Code Copilot via MCP
Follow these steps to integrate the Jina AI (Search Foundation & LLM Grounding) MCP Server with VS Code Copilot.
Create MCP config
Create a .vscode/mcp.json file in your project root
Add the server config
Paste the JSON configuration above
Enable Agent mode
Open GitHub Copilot Chat and switch to Agent mode using the dropdown
Start using Jina AI (Search Foundation & LLM Grounding)
Ask Copilot: "Using Jina AI (Search Foundation & LLM Grounding), help me..." — 6 tools available
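The setup steps above can be scripted for team onboarding. A minimal sketch, assuming the config shape shown earlier on this page (replace the token placeholder with your own Vinkius token):

```shell
# Create the project-scoped MCP config that VS Code Copilot reads.
mkdir -p .vscode
cat > .vscode/mcp.json <<'EOF'
{
  "mcpServers": {
    "jina-ai-search-foundation-llm-grounding": {
      "url": "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp"
    }
  }
}
EOF
```

Committing this file to the repository gives every teammate the same tool access the moment they open the workspace.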
Why Use VS Code Copilot with the Jina AI (Search Foundation & LLM Grounding) MCP Server
GitHub Copilot for Visual Studio Code provides unique advantages when paired with Jina AI (Search Foundation & LLM Grounding) through the Model Context Protocol.
- VS Code is used by over 70% of developers — adding MCP tools to Copilot means your team can leverage external data without leaving their primary editor
- Project-scoped MCP configs (`.vscode/mcp.json`) let you commit server configurations to your repository, ensuring the entire team shares the same tool access
- Copilot's Agent mode integrates MCP tools seamlessly with file editing, terminal commands, and workspace search in a single agentic loop
- GitHub's enterprise compliance and audit features extend to MCP tool usage, providing visibility into how AI interacts with external services
Jina AI (Search Foundation & LLM Grounding) + VS Code Copilot Use Cases
Practical scenarios where VS Code Copilot combined with the Jina AI (Search Foundation & LLM Grounding) MCP Server delivers measurable value.
- Live API integration: Copilot can query an MCP server, inspect the response schema, and generate typed API client code in the same step
- DevSecOps workflows: security teams can give developers access to domain intelligence tools directly in their editor for real-time vulnerability assessment during code review
- Data pipeline development: Copilot fetches sample data via MCP and generates transformation scripts, validators, and test fixtures from actual API responses
- Documentation generation: Copilot queries available tools and auto-generates README sections, API reference docs, and usage examples
Jina AI (Search Foundation & LLM Grounding) MCP Tools for VS Code Copilot (6)
These 6 tools become available when you connect Jina AI (Search Foundation & LLM Grounding) to VS Code Copilot via MCP:
classify_texts
Perform zero-shot text classification
generate_embeddings
Generate vector embeddings from text. The input must be a JSON array of strings
read_url_content
Read and extract clean text from a URL. Excellent for grounding LLMs with live web content
rerank_documents
Rerank search documents against a query
search_web_jina
Perform a semantic web search. Returns context-rich, structured results suitable for RAG pipelines
segment_content
Semantically segment and chunk long text content
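When Copilot invokes one of these tools, the MCP client sends a JSON-RPC `tools/call` request over the connection. A sketch for read_url_content — the argument name `url` is assumed from the tool's description, not confirmed against the server's schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "read_url_content",
    "arguments": { "url": "https://jina.ai/embeddings" }
  }
}
```

The server's result contains the extracted content, which the agent folds back into its context before generating code or answers.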
Example Prompts for Jina AI (Search Foundation & LLM Grounding) in VS Code Copilot
Ready-to-use prompts you can give your VS Code Copilot agent to start working with Jina AI (Search Foundation & LLM Grounding) immediately.
"Extract the main content from 'https://jina.ai/embeddings' as Markdown"
"Search the web for the latest updates on 'DeepSeek-V3 architecture'"
"Segment this long text into semantically cohesive chunks: [text content]"
Troubleshooting Jina AI (Search Foundation & LLM Grounding) MCP Server with VS Code Copilot
Common issues when connecting Jina AI (Search Foundation & LLM Grounding) to VS Code Copilot through the Vinkius platform, and how to resolve them.
MCP tools not available
If the Jina AI tools don't appear in Copilot Chat, confirm you have switched to Agent mode, that `.vscode/mcp.json` is valid JSON, and that the server URL includes your token. Restarting the MCP server or reloading the VS Code window usually picks up config changes.
Jina AI (Search Foundation & LLM Grounding) + VS Code Copilot FAQ
Common questions about integrating Jina AI (Search Foundation & LLM Grounding) MCP Server with VS Code Copilot.
Which VS Code version supports MCP?
How do I switch to Agent mode?
Can I restrict which MCP tools Copilot can access?
Does MCP work in VS Code Remote or Codespaces?
`.vscode/mcp.json` configs work in Remote SSH, WSL, and GitHub Codespaces environments. The MCP connection is established from the remote host, so ensure the server URL is accessible from that environment.
Connect Jina AI (Search Foundation & LLM Grounding) with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect Jina AI (Search Foundation & LLM Grounding) to VS Code Copilot
Get your token, paste the configuration, and start using 6 tools in under 2 minutes. No API key management needed.
