Bring ML Observability to Mastra AI
Learn how to connect Arize AI to Mastra AI and start using 6 AI agent tools in minutes. Fully managed, enterprise secure, and ready to use without writing a single line of code.
What is the Arize AI MCP Server?
Connect your Arize AI account to any AI agent and take full control of your machine learning observability and automated model monitoring workflows through natural conversation.
What you can do
- Project & Trace Orchestration — List and monitor active ML tracing projects programmatically, retrieving detailed high-fidelity execution spans and telemetry data in real-time
- Dataset Lifecycle Management — Programmatically create and manage datasets for model evaluation and validation to maintain a perfectly coordinated ML infrastructure
- Experiment Monitoring — Access and track ML experiments to understand high-fidelity model performance, drift, and data quality across different environments
- Model Intelligence Discovery — Retrieve detailed metadata for specific ML models to coordinate your organizational AI strategy directly through your agent
- Operational Monitoring — Access account-level settings and verify API connectivity directly through your agent for instant performance reporting
How it works
1. Subscribe to this server
2. Retrieve your API Key from your Arize dashboard (Settings > API)
3. Start orchestrating your ML observability pipeline from Claude, Cursor, or any MCP client
No more manually logging into observability portals to check model drift or trace spans. Your AI acts as your dedicated ML engineer and observability coordinator.
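As a rough sketch, wiring this up from Mastra looks like the following. The server URL is a placeholder for the endpoint shown on your Vinkius subscription page, the bearer-style Authorization header and the model provider are assumptions, and the MCPClient options follow the current @mastra/mcp API, which may differ in older versions.

```typescript
import { MCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// Placeholder URL: use the endpoint shown on your Vinkius subscription page.
// Passing the Arize API key as a bearer header is an assumption; follow the
// credential instructions Vinkius provides for this server.
const mcp = new MCPClient({
  servers: {
    arize: {
      url: new URL("https://mcp.vinkius.example/arize"),
      requestInit: {
        headers: { Authorization: `Bearer ${process.env.ARIZE_API_KEY ?? ""}` },
      },
    },
  },
});

// The agent sees every Arize tool the server exposes as a typed Mastra tool.
export const observabilityAgent = new Agent({
  name: "ml-observability-agent",
  instructions:
    "You monitor Arize projects, traces, datasets, and experiments on request.",
  model: openai("gpt-4o"),
  tools: await mcp.getTools(),
});
```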
Who is this for?
- ML Engineers — instantly retrieve span details and analyze model traces using natural language commands
- Data Scientists — monitor experiment results and manage datasets for validation without leaving your creative workspace
- AI Developers — automate the oversight of LLM and ML model health through simple AI queries
Built-in capabilities (6)
Create a dataset
Get model details
List datasets
List experiments
List projects
List spans
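To confirm what the agent can actually see, you can log the discovered tool keys from the client in the sketch above. The exact identifiers and the server-prefix convention shown in the comment are assumptions; they depend on how the server registers its tools.

```typescript
// List the tool identifiers discovered from the Arize MCP server.
// Names are illustrative; Mastra typically prefixes them with the server key
// from the MCPClient config (here "arize").
const toolMap = await mcp.getTools();
console.log(Object.keys(toolMap));
// e.g. ["arize_list_projects", "arize_list_spans", "arize_list_datasets",
//       "arize_list_experiments", "arize_create_dataset", "arize_get_model_details"]
```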
Why Mastra AI?
Mastra's agent abstraction provides a clean separation between LLM logic and Arize AI tool infrastructure. Connect the 6 tools through Vinkius and use Mastra's built-in workflow engine to chain tool calls with conditional logic, retries, and parallel execution, then deploy to any Node.js host in one command.
- Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure, so you can add Arize AI without touching business code
- Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation
- TypeScript-native: full type inference for every Arize AI tool response with IDE autocomplete and compile-time checks
- One-command deployment to any Node.js host: Vercel, Railway, Fly.io, or your own infrastructure
Arize AI in Mastra AI
Arize AI and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Arize AI to Mastra AI through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Arize AI in Mastra AI
The Arize AI MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 6 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Mastra AI only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Arize AI for Mastra AI
Every tool call from Mastra AI to the Arize AI MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
How do I find my Arize API Key?
Log in to your account, navigate to Settings > API, and generate or copy your unique secret key.
Can I track model drift via AI?
Yes! Use the list_experiments tool to retrieve data on active model evaluations and track performance variations programmatically.
How do I retrieve telemetry traces?
Use the list_spans tool to retrieve high-fidelity execution spans and traces for your ML projects directly from the platform.
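In practice, both of these come down to a plain-language request. A brief sketch, assuming the observabilityAgent wired up in the connection example above and that agent.generate returns a result with a text field; the project names in the prompts are hypothetical:

```typescript
// The agent picks the right tool (list_experiments or list_spans) from the request.
const drift = await observabilityAgent.generate(
  "List recent experiments for the fraud-detection model and call out any performance drift."
);
console.log(drift.text);

const traces = await observabilityAgent.generate(
  "Fetch the latest spans for the checkout-ranking project and summarize slow calls."
);
console.log(traces.text);
```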
How does Mastra AI connect to MCP servers?
Create an MCPClient with the server URL and pass it to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
Can Mastra agents use tools from multiple servers?
Yes. Pass multiple MCP clients to the agent constructor. Mastra merges all tool schemas and the agent can call any tool from any server.
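One common way to do this is a single MCPClient configured with several servers, which merges every server's tools into one map for the agent; separate clients can also be combined by merging their tool maps. The sketch below takes the first approach; both URLs and the second server are hypothetical placeholders.

```typescript
import { MCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// One MCPClient can register several servers; getTools() returns the merged,
// namespaced tool map so the agent can call any tool from any server.
const mcp = new MCPClient({
  servers: {
    arize: { url: new URL("https://mcp.vinkius.example/arize") },   // placeholder URL
    linear: { url: new URL("https://mcp.vinkius.example/linear") }, // hypothetical second server
  },
});

const agent = new Agent({
  name: "ops-agent",
  instructions: "Correlate Arize model health with open engineering issues.",
  model: openai("gpt-4o"),
  tools: await mcp.getTools(),
});
```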
Does Mastra support workflow orchestration?
Yes. Mastra has a built-in workflow engine that lets you chain MCP tool calls with branching logic, error handling, and parallel execution.
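The snippet below sketches the chaining idea in plain TypeScript rather than Mastra's workflow DSL, since the exact workflow API differs between versions: a retry wrapper around one Arize call, then a conditional follow-up. Mastra's workflow engine expresses the same chain declaratively and adds parallel steps. It assumes the observabilityAgent from the connection example; the model name and the drift keyword check are placeholders.

```typescript
// Retry helper: re-run an agent call a few times before giving up.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// Step 1: pull experiment results (retried on transient failures).
const report = await withRetry(() =>
  observabilityAgent.generate(
    "List experiments for the fraud-detection model and flag any drift."
  )
);

// Step 2 (conditional branch): only create a validation dataset when the
// report mentions drift. The keyword test stands in for real branching logic.
if (/drift/i.test(report.text)) {
  await observabilityAgent.generate(
    "Create a dataset named drift-validation from the flagged experiment's traces."
  );
}
```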
Getting "createMCPClient is not exported"?
Install the MCP package: npm install @mastra/mcp
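A minimal sketch of the fix, assuming the class-based MCPClient export described in the connection FAQ above; if your package version exposes a different entry point, follow its docs. The URL is a placeholder.

```typescript
// After `npm install @mastra/mcp`, import the client class from that package:
import { MCPClient } from "@mastra/mcp";

const mcp = new MCPClient({
  servers: { arize: { url: new URL("https://mcp.vinkius.example/arize") } },
});
```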
