
LangSmith MCP Server for Mastra AI
3 tools — connect in under 2 minutes

Built by Vinkius · GDPR · 3 Tools · SDK

Mastra AI is a TypeScript-native agent framework built for modern web stacks. Connect LangSmith through Vinkius and Mastra agents discover all tools automatically: type-safe, streaming-ready, and deployable anywhere Node.js runs.

Vinkius supports streamable HTTP and SSE.

typescript
import { Agent } from "@mastra/core/agent";
import { createMCPClient } from "@mastra/mcp";
import { openai } from "@ai-sdk/openai";

async function main() {
  // Your Vinkius token — get it at cloud.vinkius.com
  const mcpClient = await createMCPClient({
    servers: {
      "langsmith": {
        url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
      },
    },
  });

  const tools = await mcpClient.getTools();
  const agent = new Agent({
    name: "LangSmith Agent",
    instructions:
      "You help users interact with LangSmith " +
      "using 3 tools.",
    model: openai("gpt-4o"),
    tools,
  });

  const result = await agent.generate(
    "What can I do with LangSmith?"
  );
  console.log(result.text);
}

main();
  • Fully managed Vinkius servers
  • 60% token savings
  • Enterprise-grade security
  • IAM access control
  • EU AI Act compliant
  • DLP data protection
  • V8 isolate sandboxing
  • Ed25519 audit chain
  • Kill switch in under 40ms
Stream every event to Splunk, Datadog, or your own webhook in real time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure.

About LangSmith MCP Server

Connect your AI agent to LangSmith — the observability platform from the LangChain team that gives you complete visibility into your LLM applications.

Mastra's agent abstraction provides a clean separation between LLM logic and LangSmith tool infrastructure. Connect 3 tools through Vinkius and use Mastra's built-in workflow engine to chain tool calls with conditional logic, retries, and parallel execution, then deploy to any Node.js host in one command.

What you can do

  • List Projects — View all tracing projects with aggregate metrics: total runs, median latency, feedback scores, and creation dates
  • List Runs — Browse recent traces in any project. See run names, types (LLM, chain, tool), status (success/error), token usage, and timing
  • Run Details — Deep-dive into any specific run to see its full execution trace, inputs, outputs, and associated feedback

The LangSmith MCP Server exposes 3 tools through Vinkius. Connect it to Mastra AI in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.

How to Connect LangSmith to Mastra AI via MCP

Follow these steps to integrate the LangSmith MCP Server with Mastra AI.

01

Install dependencies

Run npm install @mastra/core @mastra/mcp @ai-sdk/openai

02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token

03

Run the agent

Save to agent.ts and run with npx tsx agent.ts

04

Explore tools

Mastra discovers 3 tools from LangSmith via MCP
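
To confirm the connection, you can log the discovered tool names before wiring them into the agent. This sketch reuses the mcpClient from the connection example above; the expected names come from the tool list further down this page.

typescript
// Reusing mcpClient from the connection example above.
const tools = await mcpClient.getTools();

// Expect the three LangSmith tools documented below:
// langsmith_list_projects, langsmith_list_runs, langsmith_get_run
console.log(Object.keys(tools));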

Why Use Mastra AI with the LangSmith MCP Server

Mastra AI provides unique advantages when paired with LangSmith through the Model Context Protocol.

01

Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure — add LangSmith without touching business code

02

Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation

03

TypeScript-native: full type inference for every LangSmith tool response with IDE autocomplete and compile-time checks

04

One-command deployment to any Node.js host — Vercel, Railway, Fly.io, or your own infrastructure

LangSmith + Mastra AI Use Cases

Practical scenarios where Mastra AI combined with the LangSmith MCP Server delivers measurable value.

01

Automated workflows: build multi-step agents that query LangSmith, process results, and trigger downstream actions in a typed pipeline

02

SaaS integrations: embed LangSmith as a first-class tool in your product's AI features with Mastra's clean agent API

03

Background jobs: schedule Mastra agents to query LangSmith on a cron and store results in your database automatically (see the sketch after this list)

04

Multi-agent systems: create specialist agents that collaborate using LangSmith tools alongside other MCP servers
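
For the background-jobs scenario, here is a minimal sketch. It assumes the agent from the connection example above is already constructed, uses node-cron as one possible scheduler, and logs the result where you would write to your database; the schedule and prompt are placeholders.

typescript
import cron from "node-cron"; // one scheduling option; any cron library or platform scheduler works

// Reuses the agent built in the connection example above.
// Every day at 09:00, summarize recent LangSmith activity.
cron.schedule("0 9 * * *", async () => {
  const result = await agent.generate(
    "Show me the last 10 runs in my production-agent project and summarize any errors."
  );
  // Placeholder: replace with a write to your own database.
  console.log(new Date().toISOString(), result.text);
});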

LangSmith MCP Tools for Mastra AI (3)

These 3 tools become available when you connect LangSmith to Mastra AI via MCP:

01

langsmith_get_run

Get detailed information about a specific run/trace by its ID. Useful for debugging specific LLM calls or agent actions.

02

langsmith_list_projects

List all tracing projects in your LangSmith account with run counts, latency stats, and feedback metrics. Each project groups related traces together and shows aggregate metrics like total runs, median latency, and feedback counts.

03

langsmith_list_runs

List recent traces/runs in a specific LangSmith project. Each run represents a single LLM call, chain execution, or agent action. Shows run names, types, status (success/error), token usage, latency, and timing.

Example Prompts for LangSmith in Mastra AI

Ready-to-use prompts you can give your Mastra AI agent to start working with LangSmith immediately.

01

"List all my LangSmith projects and show their metrics."

02

"Show me the last 5 runs in my production-agent project."

03

"Get details on the failed run a0b1c2."

Troubleshooting LangSmith MCP Server with Mastra AI

Common issues when connecting LangSmith to Mastra AI through Vinkius, and how to resolve them.

01

createMCPClient not exported

Install: npm install @mastra/mcp

LangSmith + Mastra AI FAQ

Common questions about integrating LangSmith MCP Server with Mastra AI.

01

How does Mastra AI connect to MCP servers?

Create an MCP client with the server URL, call getTools(), and pass the tools to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
02

Can Mastra agents use tools from multiple servers?

Yes. Add every server to the MCP client configuration (or merge the tools from multiple clients) and pass the combined tools to the agent. Mastra merges all tool schemas and the agent can call any tool from any server.
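
As a sketch of that setup using the same createMCPClient configuration shown above (the second server name and URL below are placeholders, not a real Vinkius endpoint):

typescript
const mcpClient = await createMCPClient({
  servers: {
    "langsmith": {
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
    // Placeholder second server; substitute another Vinkius MCP URL.
    "another-server": {
      url: "https://edge.vinkius.com/[ANOTHER_TOKEN]/mcp",
    },
  },
});

// getTools() returns the merged tool set from every configured server.
const tools = await mcpClient.getTools();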
03

Does Mastra support workflow orchestration?

Yes. Mastra has a built-in workflow engine that lets you chain MCP tool calls with branching logic, error handling, and parallel execution.
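
Mastra's workflow API is not shown on this page, but the pattern it formalizes looks roughly like the following plain-TypeScript sketch: chain agent calls and wrap each one in a retry. Treat it as an approximation, not the workflow engine itself.

typescript
// Reusing the agent from the connection example above.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // retry on failure
    }
  }
  throw lastError;
}

const projects = await withRetry(() =>
  agent.generate("List all my LangSmith projects and show their metrics.")
);
const followUp = await withRetry(() =>
  agent.generate(
    "Pick the busiest project from this list and show its last 5 runs:\n" + projects.text
  )
);
console.log(followUp.text);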

Connect LangSmith to Mastra AI

Get your token, paste the configuration, and start using 3 tools in under 2 minutes. No API key management needed.