LangSmith MCP Server for Vercel AI SDK — 3 tools, connect in under 2 minutes

Built by Vinkius · GDPR · 3 Tools · SDK

The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect LangSmith through Vinkius and every tool is available as a typed function — ready for React Server Components, API routes, or any Node.js backend.

Vinkius supports streamable HTTP and SSE.

```typescript
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token — get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });

  try {
    const tools = await mcpClient.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using LangSmith, list all available capabilities.",
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main();
```
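Prefer SSE? Here is a minimal variant of the client setup above, assuming the same transport shape with type "sse". The /sse path is an assumption; use the SSE URL shown in your Vinkius dashboard.

```typescript
import { createMCPClient } from "@ai-sdk/mcp";

// Same client as the quick-start, but connecting over Server-Sent Events
// instead of streamable HTTP. The "/sse" path below is an assumption:
// use the SSE URL from your Vinkius dashboard.
const mcpClient = await createMCPClient({
  transport: {
    type: "sse",
    url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/sse",
  },
});
```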
LangSmith runs on fully managed Vinkius servers:

  • 60% token savings
  • High security: enterprise-grade
  • IAM: access control
  • EU AI Act: compliant
  • DLP: data protection
  • V8 isolate: sandboxed
  • Ed25519: audit chain
  • <40ms kill switch

Stream every event to Splunk, Datadog, or your own webhook in real-time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure.

About LangSmith MCP Server

Connect your AI agent to LangSmith — the observability platform from the LangChain team that gives you complete visibility into your LLM applications.

The Vercel AI SDK gives every LangSmith tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 3 tools through Vinkius and stream results progressively to React, Svelte, or Vue components. It works on Edge Functions, Cloudflare Workers, and any Node.js runtime.

What you can do

  • List Projects — View all tracing projects with aggregate metrics: total runs, median latency, feedback scores, and creation dates
  • List Runs — Browse recent traces in any project. See run names, types (LLM, chain, tool), status (success/error), token usage, and timing
  • Run Details — Deep-dive into any specific run to see its full execution trace, inputs, outputs, and associated feedback

The LangSmith MCP Server exposes 3 tools through Vinkius. Connect it to the Vercel AI SDK in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
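Because the model receives all three tools in one call, it can chain them on its own. Here is a sketch, assuming the AI SDK's multi-step maxSteps option so the agent can call langsmith_list_projects, langsmith_list_runs, and langsmith_get_run before producing its final answer:

```typescript
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const mcpClient = await createMCPClient({
  transport: { type: "http", url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp" },
});

try {
  const { text, steps } = await generateText({
    model: openai("gpt-4o"),
    tools: await mcpClient.tools(),
    maxSteps: 5, // room for list_projects -> list_runs -> get_run -> final answer
    prompt:
      "Find my most recently active project, list its latest runs, and summarize any failed run in detail.",
  });

  console.log(text);
  console.log(`Tool calls made: ${steps.flatMap((s) => s.toolCalls).length}`);
} finally {
  await mcpClient.close();
}
```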

How to Connect LangSmith to Vercel AI SDK via MCP

Follow these steps to integrate the LangSmith MCP Server with Vercel AI SDK.

01

Install dependencies

Run npm install @ai-sdk/mcp ai @ai-sdk/openai

02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token

03

Run the script

Save to agent.ts and run with npx tsx agent.ts

04

Explore tools

The SDK discovers 3 tools from LangSmith and passes them to the LLM
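To verify discovery before the tools reach the LLM, log the keys of the object returned by mcpClient.tools() inside the quick-start script from step 03 (a small fragment, not a standalone file):

```typescript
// Inside the try block of agent.ts, after connecting:
const tools = await mcpClient.tools();

// Expect the three LangSmith tool names documented below:
// langsmith_get_run, langsmith_list_projects, langsmith_list_runs
console.log(Object.keys(tools));
```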

Why Use Vercel AI SDK with the LangSmith MCP Server

Vercel AI SDK provides unique advantages when paired with LangSmith through the Model Context Protocol.

01

TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box

02

Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime — same LangSmith integration everywhere

03

Built-in streaming UI primitives let you display LangSmith tool results progressively in React, Svelte, or Vue components

04

Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
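To make the last two points concrete, here is a sketch of a Next.js App Router route handler that runs on the edge and streams LangSmith tool results to the browser. It assumes a recent AI SDK release where the streamText result exposes toDataStreamResponse(); the route path and the VINKIUS_MCP_URL environment variable are placeholders:

```typescript
// app/api/chat/route.ts (hypothetical route path)
import { createMCPClient } from "@ai-sdk/mcp";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export const runtime = "edge"; // run on Vercel Edge Functions

export async function POST(req: Request) {
  const { messages } = await req.json();

  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Placeholder env var holding your tokenized Vinkius URL
      url: process.env.VINKIUS_MCP_URL!,
    },
  });

  const result = streamText({
    model: openai("gpt-4o"),
    tools: await mcpClient.tools(),
    messages,
    onFinish: async () => {
      // Close the MCP connection once the stream completes
      await mcpClient.close();
    },
  });

  return result.toDataStreamResponse();
}
```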

LangSmith + Vercel AI SDK Use Cases

Practical scenarios where Vercel AI SDK combined with the LangSmith MCP Server delivers measurable value.

01

AI-powered web apps: build dashboards that query LangSmith in real-time and stream results to the UI with zero loading states

02

API backends: create serverless endpoints that orchestrate LangSmith tools and return structured JSON responses to any frontend

03

Chatbots with tool use: embed LangSmith capabilities into conversational interfaces with streaming responses and tool call visibility

04

Internal tools: build admin panels where team members interact with LangSmith through natural language queries
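For the API-backend use case, a non-streaming endpoint can return plain JSON instead. A sketch under the same assumptions as above; the route path is a placeholder:

```typescript
// app/api/projects/route.ts (hypothetical route path)
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function GET() {
  const mcpClient = await createMCPClient({
    transport: { type: "http", url: process.env.VINKIUS_MCP_URL! },
  });

  try {
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools: await mcpClient.tools(),
      maxSteps: 3,
      prompt:
        "List my LangSmith projects and summarize their run counts and median latency.",
    });

    // Return a structured payload any frontend can consume.
    return Response.json({ summary: text });
  } finally {
    await mcpClient.close();
  }
}
```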

LangSmith MCP Tools for Vercel AI SDK (3)

These 3 tools become available when you connect LangSmith to Vercel AI SDK via MCP:

01

langsmith_get_run

Get detailed information about a specific run or trace by its ID. Useful for debugging specific LLM calls or agent actions.

02

langsmith_list_projects

List all tracing projects in your LangSmith account with run counts, latency stats, and feedback metrics. Each project groups related traces together and shows aggregate metrics like total runs, median latency, and feedback counts.

03

langsmith_list_runs

List recent traces/runs in a specific LangSmith project, showing run names, types, status, token usage, and timing. Each run represents a single LLM call, chain execution, or agent action, with status (success/error), latency, and token consumption.
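If your application already knows which tool it needs, the AI SDK's toolChoice option can force a specific tool by name rather than letting the model decide. A sketch, reusing the connected mcpClient from the quick-start; the run ID is a placeholder supplied by your own code:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Force a call to langsmith_get_run for a run ID your app already has.
const runId = "a0b1c2"; // placeholder run ID
const { toolResults } = await generateText({
  model: openai("gpt-4o"),
  tools: await mcpClient.tools(),
  toolChoice: { type: "tool", toolName: "langsmith_get_run" },
  prompt: `Fetch the run with ID ${runId} and return its details.`,
});

console.log(toolResults);
```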

Example Prompts for LangSmith in Vercel AI SDK

Ready-to-use prompts you can give your Vercel AI SDK agent to start working with LangSmith immediately.

01

"List all my LangSmith projects and show their metrics."

02

"Show me the last 5 runs in my production-agent project."

03

"Get details on the failed run a0b1c2."

Troubleshooting LangSmith MCP Server with Vercel AI SDK

Common issues when connecting LangSmith to Vercel AI SDK through Vinkius, and how to resolve them.

01

createMCPClient is not a function

Run npm install @ai-sdk/mcp and make sure createMCPClient is imported from @ai-sdk/mcp.

LangSmith + Vercel AI SDK FAQ

Common questions about integrating LangSmith MCP Server with Vercel AI SDK.

01

How does the Vercel AI SDK connect to MCP servers?

Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.

02

Can I use MCP tools in Edge Functions?

Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.

03

Does it support streaming tool results?

Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.
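On the client side, here is a minimal React sketch that consumes a streaming /api/chat route like the one shown earlier, assuming the AI SDK 4-style useChat hook from @ai-sdk/react:

```tsx
"use client";

import { useChat } from "@ai-sdk/react";

export default function LangSmithChat() {
  // useChat posts to /api/chat by default and updates `messages` as tokens
  // and tool results stream back from the route handler.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <input
        value={input}
        onChange={handleInputChange}
        placeholder="Ask about your LangSmith traces…"
      />
    </form>
  );
}
```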

Connect LangSmith to Vercel AI SDK

Get your token, paste the configuration, and start using 3 tools in under 2 minutes. No API key management needed.