Bring LLM Observability to Mastra AI
Learn how to connect Keywords AI to Mastra AI and start using 11 AI agent tools in minutes. Fully managed, enterprise secure, and ready to use without writing a single line of code.
What is the Keywords AI MCP Server?
Connect your Keywords AI account to any AI agent and monitor LLM performance.
What you can do
- Request Logs — List and filter all LLM API calls by model
- Cost Tracking — Monitor credit balance and usage statistics
- Analytics — View cost trends, latency metrics, and error rates
- Model Catalog — Browse available LLM models
- Team Management — List users and view activity
- Alerts — Review monitoring thresholds
Built-in capabilities (11)
- Verify API connectivity
- Get analytics dashboard
- Get credit balance
- Get request details
- Get usage statistics
- Get user details
- List monitoring alerts
- List available models
- List API request logs
- List requests by model
- List team users
Why Mastra AI?
Mastra's agent abstraction provides a clean separation between LLM logic and Keywords AI tool infrastructure. Connect all 11 tools through Vinkius, use Mastra's built-in workflow engine to chain tool calls with conditional logic, retries, and parallel execution, and deploy to any Node.js host in one command. A minimal connection sketch follows the list below.
- Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure: add Keywords AI without touching business code
- Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation
- TypeScript-native: full type inference for every Keywords AI tool response with IDE autocomplete and compile-time checks
- One-command deployment to any Node.js host: Vercel, Railway, Fly.io, or your own infrastructure
Keywords AI in Mastra AI
Keywords AI and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Keywords AI to Mastra AI through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Keywords AI in Mastra AI
The Keywords AI MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 11 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Mastra AI only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Keywords AI for Mastra AI
Every tool call from Mastra AI to the Keywords AI MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, a kill switch, and financial circuit breakers.
Frequently asked questions
Can my AI track LLM costs?
Yes. get_credits shows your balance, and get_usage_stats breaks down costs by model and time period.
Can I filter request logs by model?
Yes. list_requests_by_model returns only requests made to a specific LLM.
What analytics are available?
get_analytics provides cost trends, latency percentiles, error rates, and token usage over time.
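Putting those answers together, a minimal usage sketch (reusing the `observabilityAgent` from the earlier example) looks like this; the agent decides which Keywords AI tools to call based on the question.

```ts
// The agent selects the relevant Keywords AI tools (e.g. get_usage_stats,
// get_analytics) to answer the prompt.
const report = await observabilityAgent.generate(
  "How much did we spend on gpt-4o this week, and what was the p95 latency?"
);
console.log(report.text);
```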
How does Mastra AI connect to MCP servers?
Create an MCPClient with the server URL and pass it to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
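At the project level, registering the agent with a Mastra instance makes it available to the dev server and deploy targets. A minimal sketch, assuming the `observabilityAgent` from the earlier example is defined in the same module or imported:

```ts
import { Mastra } from "@mastra/core";

// Registering the agent exposes it to Mastra's dev server,
// API routes, and deployment targets.
export const mastra = new Mastra({
  agents: { observabilityAgent },
});
```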
Can Mastra agents use tools from multiple servers?
Yes. Configure multiple servers in a single MCPClient, or merge the tool sets from several clients into the agent's tools. Mastra merges all tool schemas and the agent can call any tool from any server.
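A minimal sketch of the multi-server setup; both URLs are placeholders and the second server is purely illustrative.

```ts
import { MCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// Tools from each configured server are merged into one tool map
// that the agent can call from.
const mcp = new MCPClient({
  servers: {
    keywordsai: { url: new URL("https://mcp.example.com/keywords-ai") },
    internal: { url: new URL("https://mcp.example.com/internal-tools") },
  },
});

const opsAgent = new Agent({
  name: "ops-agent",
  instructions: "Answer operational questions using any connected tool.",
  model: openai("gpt-4o-mini"),
  tools: await mcp.getTools(),
});
```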
Does Mastra support workflow orchestration?
Yes. Mastra has a built-in workflow engine that lets you chain MCP tool calls with branching logic, error handling, and parallel execution.
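As a hedged sketch of chaining Keywords AI data through a workflow: the step ids, schemas, and reporting logic below are illustrative, it assumes the `observabilityAgent` from the earlier example, and the exact workflow API may vary across Mastra versions.

```ts
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

// Illustrative step: ask the observability agent for a usage summary.
const fetchUsage = createStep({
  id: "fetch-usage",
  inputSchema: z.object({ period: z.string() }),
  outputSchema: z.object({ report: z.string() }),
  execute: async ({ inputData }) => {
    const res = await observabilityAgent.generate(
      `Summarize LLM spend and error rates for ${inputData.period}.`
    );
    return { report: res.text };
  },
});

// Illustrative step: deliver the summary (e.g. log it, or forward to chat/email).
const deliverReport = createStep({
  id: "deliver-report",
  inputSchema: z.object({ report: z.string() }),
  outputSchema: z.object({ delivered: z.boolean() }),
  execute: async ({ inputData }) => {
    console.log(inputData.report);
    return { delivered: true };
  },
});

// Chain the steps; .commit() finalizes the workflow definition.
export const usageReportWorkflow = createWorkflow({
  id: "weekly-usage-report",
  inputSchema: z.object({ period: z.string() }),
  outputSchema: z.object({ delivered: z.boolean() }),
})
  .then(fetchUsage)
  .then(deliverReport)
  .commit();
```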
Getting a "createMCPClient not exported" error?
Install the MCP package: npm install @mastra/mcp
