LangSmith (LLM Observability & Hub) MCP Server for Windsurf
6 tools — connect in under 2 minutes
Windsurf brings agentic AI coding to a purpose-built IDE. Connect LangSmith (LLM Observability & Hub) through the Vinkius Edge, and Cascade will auto-discover every tool — ask questions, generate code, and act on live data without leaving your editor.
Vinkius supports both Streamable HTTP and SSE transports.
Vinkius Desktop App
The modern way to manage MCP Servers — no config files, no terminal commands. Install LangSmith (LLM Observability & Hub) and 2,500+ MCP Servers from a single visual interface.




{
  "mcpServers": {
    "langsmith-llm-observability-hub": {
      "url": "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp"
    }
  }
}
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.
About LangSmith (LLM Observability & Hub) MCP Server
Connect your LangSmith account to any AI agent and take full control of your LLM observability, tracing, and prompt management through natural conversation.
Windsurf's Cascade agent chains multiple LangSmith (LLM Observability & Hub) tool calls autonomously — query data, analyze results, and generate code in a single agentic session. Paste the Vinkius Edge URL, reload, and all 6 tools are immediately available. Real-time tool feedback appears inline, so you see API responses directly in your editor.
What you can do
- Trace Orchestration — List active tracing projects and retrieve detailed execution logs for specific LLM invocation runs directly from your agent
- Performance Telemetry — Extract precise metrics including token consumption, prompt latency, and exact error strings from your AI pipelines
- Prompt Hub Access — Navigate and retrieve managed prompt templates, variable definitions, and version histories hosted in the LangChain Hub
- Evaluation Datasets — Enumerate curated 'golden' datasets used for automated evaluation of prompt logic or few-shot example injection
- Human-in-the-Loop Audit — Monitor active annotation queues where human reviewers assess the alignment, accuracy, and safety of generated LLM traces
- Agentic Step Analysis — Deep-dive into multi-turn agentic workflows to understand nested tool calls and internal reasoning paths securely
The LangSmith (LLM Observability & Hub) MCP Server exposes 6 tools through the Vinkius Edge. Connect it to Windsurf in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect LangSmith (LLM Observability & Hub) to Windsurf via MCP
Follow these steps to integrate the LangSmith (LLM Observability & Hub) MCP Server with Windsurf.
Open MCP Settings
Go to Settings → MCP Configuration or press Cmd+Shift+P and search "MCP"
Add the server
Paste the JSON configuration above into mcp_config.json
Save and reload
Windsurf will detect the new server automatically
Start using LangSmith (LLM Observability & Hub)
Open Cascade and ask: "Using LangSmith (LLM Observability & Hub), help me..." — 6 tools available
Why Use Windsurf with the LangSmith (LLM Observability & Hub) MCP Server
Windsurf provides unique advantages when paired with LangSmith (LLM Observability & Hub) through the Model Context Protocol.
Windsurf's Cascade agent autonomously chains multiple tool calls in sequence, solving complex multi-step tasks without manual intervention
Purpose-built for agentic workflows — Cascade understands context across your entire codebase and integrates MCP tools natively
JSON-based configuration means zero code changes: paste a URL, reload, and all 6 tools are immediately available
Real-time tool feedback is displayed inline, so you see API responses directly in your editor without switching contexts
LangSmith (LLM Observability & Hub) + Windsurf Use Cases
Practical scenarios where Windsurf combined with the LangSmith (LLM Observability & Hub) MCP Server delivers measurable value.
Automated code generation: ask Cascade to fetch data from LangSmith (LLM Observability & Hub) and generate models, types, or handlers based on real API responses
Live debugging: query LangSmith (LLM Observability & Hub) tools mid-session to inspect production data while debugging without leaving the editor
Documentation generation: pull schema information from LangSmith (LLM Observability & Hub) and have Cascade generate comprehensive API docs automatically
Rapid prototyping: combine LangSmith (LLM Observability & Hub) data with Cascade's code generation to scaffold entire features in minutes
LangSmith (LLM Observability & Hub) MCP Tools for Windsurf (6)
These 6 tools become available when you connect LangSmith (LLM Observability & Hub) to Windsurf via MCP; a sample tool-call request is sketched after the list:
get_run
Get precise telemetry for a single LLM invocation run
list_annotation_queues
List active human-in-the-loop annotation queues
list_datasets
List all evaluation and fine-tuning datasets mapped in LangSmith
list_projects
List all active LangSmith tracing projects/sessions, the distinct AI pipelines currently monitored by LangSmith
list_prompts
Extract prompt templates hosted in the LangChain Hub
list_runs
List LLM invocation runs within a specific project: the raw interactions containing the prompts sent to and the responses received from your AI models
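Under the hood, each of these tools is invoked through a standard MCP tools/call request, which Cascade builds and sends for you. The sketch below shows what a call to list_runs might look like; the argument name project_name is illustrative only, so check the tool's input schema in Windsurf's MCP panel for the exact parameters.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_runs",
    "arguments": {
      "project_name": "Production-Bot-V2"
    }
  }
}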
Example Prompts for LangSmith (LLM Observability & Hub) in Windsurf
Ready-to-use prompts you can give your Windsurf agent to start working with LangSmith (LLM Observability & Hub) immediately.
"List all active tracing projects in LangSmith"
"Show me the telemetry for the last run in the 'Production-Bot-V2' project"
"List all prompts hosted in our Hub repository"
Troubleshooting LangSmith (LLM Observability & Hub) MCP Server with Windsurf
Common issues when connecting LangSmith (LLM Observability & Hub) to Windsurf through the Vinkius Edge, and how to resolve them.
Server not connecting
Double-check that the Vinkius Edge URL (including your token) is pasted correctly into mcp_config.json, then save and reload so Windsurf re-detects the server.
LangSmith (LLM Observability & Hub) + Windsurf FAQ
Common questions about integrating LangSmith (LLM Observability & Hub) MCP Server with Windsurf.
How does Windsurf discover MCP tools?
Windsurf reads the mcp_config.json file on startup and connects to each configured server via Streamable HTTP. Tools are listed in the MCP panel and available to Cascade automatically; the discovery request behind this is sketched below.
Can Cascade chain multiple MCP tool calls?
Yes. Cascade chains multiple tool calls autonomously, so it can, for example, list your tracing projects, pull the runs for one of them, and fetch the telemetry for a specific run in a single session.
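For reference, that discovery step is a single MCP tools/list request, to which the server replies with its tool catalog. The trimmed response below is a sketch of the expected shape per the MCP specification, not output captured from a live session, and only two of the six tools are shown.
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "list_projects", "description": "List all active LangSmith tracing projects/sessions" },
      { "name": "get_run", "description": "Get precise telemetry for a single LLM invocation run" }
    ]
  }
}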
Does Windsurf support multiple MCP servers?
Yes. Define as many servers as you need in mcp_config.json. Each server's tools appear in the MCP panel and Cascade can use tools from different servers in a single flow; an example configuration is sketched below.
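As an illustration, a mcp_config.json that registers this server alongside a second, hypothetical server could look like the sketch below; the second entry (another-server) and its URL are placeholders, not a real endpoint.
{
  "mcpServers": {
    "langsmith-llm-observability-hub": {
      "url": "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp"
    },
    "another-server": {
      "url": "https://example.com/mcp"
    }
  }
}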
Connect LangSmith (LLM Observability & Hub) with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect LangSmith (LLM Observability & Hub) to Windsurf
Get your token, paste the configuration, and start using 6 tools in under 2 minutes. No API key management needed.
