
New Relic AI (LLM Observability) MCP Server for Mastra AI: 10 tools — connect in under 2 minutes

Built by Vinkius · GDPR · 10 Tools · SDK

Mastra AI is a TypeScript-native agent framework built for modern web stacks. Connect New Relic AI (LLM Observability) through Vinkius and Mastra agents discover all tools automatically — type-safe, streaming-ready, and deployable anywhere Node.js runs.

Vinkius supports streamable HTTP and SSE.

```typescript
import { Agent } from "@mastra/core/agent";
import { createMCPClient } from "@mastra/mcp";
import { openai } from "@ai-sdk/openai";

async function main() {
  // Your Vinkius token — get it at cloud.vinkius.com
  const mcpClient = await createMCPClient({
    servers: {
      "new-relic-ai-llm-observability": {
        url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
      },
    },
  });

  const tools = await mcpClient.getTools();
  const agent = new Agent({
    name: "New Relic AI (LLM Observability) Agent",
    instructions:
      "You help users interact with New Relic AI (LLM Observability) " +
      "using 10 tools.",
    model: openai("gpt-4o"),
    tools,
  });

  const result = await agent.generate(
    "What can I do with New Relic AI (LLM Observability)?"
  );
  console.log(result.text);
}

main();
```
New Relic AI (LLM Observability)
  • Fully managed Vinkius servers
  • 60% token savings
  • Enterprise-grade high security
  • IAM access control
  • EU AI Act compliant
  • DLP data protection
  • V8 isolate sandboxing
  • Ed25519 audit chain
  • <40ms kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.

About New Relic AI (LLM Observability) MCP Server

Connect your New Relic AI account to any AI agent and take full control of your LLM observability, token cost tracking, and performance analytics through natural conversation.

Mastra's agent abstraction provides a clean separation between LLM logic and New Relic AI (LLM Observability) tool infrastructure. Connect 10 tools through Vinkius and use Mastra's built-in workflow engine to chain tool calls with conditional logic, retries, and parallel execution — deployable to any Node.js host in one command.

What you can do

  • LLM Telemetry Audit — Retrieve LLM chat completion messages and prompt inputs directly from your agent to understand model behavior in real time
  • Token Cost Tracking — Extract per-model costs to calculate exact USD token consumption across your entire AI infrastructure
  • Performance Monitoring — Extract p95 latency metrics and average response times to keep your LLM text generation performant and sub-second
  • User Feedback Loop — Retrieve chronological feedback messages and 1-5 rating scores recorded by human reviewers to identify quality regressions
  • Custom NRQL Execution — Run read-only New Relic Query Language (NRQL) queries to extract rich insights from your AI datasets (see the sketch after this list)
  • Custom Event Injection — Post custom telemetry events to track internal agent states and behavioral markers across your observability pipeline
  • Resource Discovery — List active APM apps, dashboards, and alert policies to audit your AI environment's health and PagerDuty-backed alert configurations
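
For example, the NRQL capability can be exercised straight from the agent. The sketch below is a minimal illustration: it assumes the agent built in the connection example above (run it inside that example's main() function), and the query text is illustrative, with the agent expected to route it to the custom_nrql tool.

```typescript
// A minimal sketch: ask the agent to run a read-only NRQL query.
// Assumes the `agent` from the connection example above (place inside main()).
// The query text is illustrative; the agent is expected to call custom_nrql.
const nrqlResult = await agent.generate(
  "Run NRQL: SELECT count(*) FROM LlmEvent WHERE duration > 2 SINCE 1 hour ago"
);
console.log(nrqlResult.text);
```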

The New Relic AI (LLM Observability) MCP Server exposes 10 tools through Vinkius. Connect it to Mastra AI in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.

How to Connect New Relic AI (LLM Observability) to Mastra AI via MCP

Follow these steps to integrate the New Relic AI (LLM Observability) MCP Server with Mastra AI.

01

Install dependencies

Run npm install @mastra/core @mastra/mcp @ai-sdk/openai

02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token

03

Run the agent

Save to agent.ts and run with npx tsx agent.ts

04

Explore tools

Mastra discovers 10 tools from New Relic AI (LLM Observability) via MCP
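
To confirm what was discovered, you can print the identifiers returned by the MCP client. A minimal sketch, assuming the mcpClient from the connection example above; the exact naming of the keys in the returned record may vary by Mastra version.

```typescript
// A minimal sketch: inspect the tools Mastra discovered over MCP.
// Assumes the `mcpClient` from the connection example above (place inside main()).
const discovered = await mcpClient.getTools();
console.log(`Discovered ${Object.keys(discovered).length} tools:`);
for (const name of Object.keys(discovered)) {
  console.log(` - ${name}`);
}
```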

Why Use Mastra AI with the New Relic AI (LLM Observability) MCP Server

Mastra AI provides unique advantages when paired with New Relic AI (LLM Observability) through the Model Context Protocol.

01

Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure — add New Relic AI (LLM Observability) without touching business code

02

Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation (a plain-TypeScript sketch of the chaining pattern follows this list)

03

TypeScript-native: full type inference for every New Relic AI (LLM Observability) tool response with IDE autocomplete and compile-time checks

04

One-command deployment to any Node.js host — Vercel, Railway, Fly.io, or your own infrastructure
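
The chaining-with-retries pattern referenced above can be approximated in plain TypeScript while you evaluate the full workflow engine. The sketch below is not Mastra's workflow API: withRetry is a hypothetical helper, the prompts are illustrative, and agent is the one built in the connection example.

```typescript
// A plain-TypeScript sketch of chaining tool-backed steps with retries.
// This is NOT Mastra's workflow engine API. `withRetry` is a hypothetical
// helper; `agent` is the one from the connection example (place inside main()).
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// Step 1: fetch latency figures; Step 2: ask for a verdict on the result.
const latency = await withRetry(() =>
  agent.generate("What is the p95 latency of my LLM calls over the last hour?")
);
const verdict = await withRetry(() =>
  agent.generate(`Does this latency report need attention?\n${latency.text}`)
);
console.log(verdict.text);
```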

New Relic AI (LLM Observability) + Mastra AI Use Cases

Practical scenarios where Mastra AI combined with the New Relic AI (LLM Observability) MCP Server delivers measurable value.

01

Automated workflows: build multi-step agents that query New Relic AI (LLM Observability), process results, and trigger downstream actions in a typed pipeline

02

SaaS integrations: embed New Relic AI (LLM Observability) as a first-class tool in your product's AI features with Mastra's clean agent API

03

Background jobs: schedule Mastra agents to query New Relic AI (LLM Observability) on a cron and store results in your database automatically (a minimal scheduling sketch follows this list)

04

Multi-agent systems: create specialist agents that collaborate using New Relic AI (LLM Observability) tools alongside other MCP servers
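
For the background-job scenario above, here is a minimal sketch using only Node's built-in timer and filesystem APIs. The hourly interval, the prompt, and the output file are illustrative; agent is the one from the connection example, and a production job would more likely persist results to your own database.

```typescript
import { appendFile } from "node:fs/promises";

// A minimal background-job sketch: query token costs on a schedule and append
// the answer to a local JSONL file. The interval, prompt, and file path are
// illustrative; `agent` is the one built in the connection example above.
const ONE_HOUR_MS = 60 * 60 * 1000;

setInterval(async () => {
  const report = await agent.generate(
    "What is my total LLM token cost for the last 24 hours?"
  );
  await appendFile(
    "llm-cost-log.jsonl",
    JSON.stringify({ at: new Date().toISOString(), answer: report.text }) + "\n"
  );
}, ONE_HOUR_MS);
```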

New Relic AI (LLM Observability) MCP Tools for Mastra AI (10)

These 10 tools become available when you connect New Relic AI (LLM Observability) to Mastra AI via MCP:

01

custom_nrql

Run read-only NRQL queries against your New Relic account to extract custom insights from your AI telemetry data

02

list_alert_policies

List the alert policies configured in your New Relic account

03

list_apm_apps

List the active APM applications reporting to your New Relic account

04

list_dashboards

List the dashboards available in your New Relic account

05

post_custom_event

Post generic `CustomAITelemetry` events to track internal agent state and custom behavioral markers

06

query_llm_costs

Query per-model token costs to calculate USD consumption across your LLM infrastructure

07

query_llm_errors

Query LLM error events recorded for your AI applications

08

query_llm_events

Retrieve LLM chat completion messages and prompt inputs recorded by your agents

09

query_llm_feedback

Retrieve chronological user feedback messages and 1-5 rating scores to identify quality regressions

10

query_llm_latency

Query p95 latency and average response times for your LLM text generation

Example Prompts for New Relic AI (LLM Observability) in Mastra AI

Ready-to-use prompts you can give your Mastra AI agent to start working with New Relic AI (LLM Observability) immediately.

01

"Show me the last 5 LLM events for the 'OpenAI' vendor"

02

"What is my total LLM token cost for the last 24 hours?"

03

"Run NRQL: SELECT count(*) FROM LlmEvent WHERE duration > 2 SINCE 1 hour ago"

Troubleshooting New Relic AI (LLM Observability) MCP Server with Mastra AI

Common issues when connecting New Relic AI (LLM Observability) to Mastra AI through Vinkius, and how to resolve them.

01

createMCPClient not exported

Install: npm install @mastra/mcp

New Relic AI (LLM Observability) + Mastra AI FAQ

Common questions about integrating New Relic AI (LLM Observability) MCP Server with Mastra AI.

01

How does Mastra AI connect to MCP servers?

Create an MCPClient with the server URL and pass it to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
02

Can Mastra agents use tools from multiple servers?

Yes. Pass multiple MCP clients to the agent constructor. Mastra merges all tool schemas and the agent can call any tool from any server (a multi-server configuration sketch follows this FAQ).
03

Does Mastra support workflow orchestration?

Yes. Mastra has a built-in workflow engine that lets you chain MCP tool calls with branching logic, error handling, and parallel execution.
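
One way to wire up multiple servers, following the shape of the servers map in the connection example above, is sketched below. The second server name and URL are placeholders, not a real endpoint.

```typescript
import { createMCPClient } from "@mastra/mcp";

// A minimal multi-server sketch: the `servers` map accepts several entries.
// "another-mcp-server" and its URL are placeholders, not real endpoints.
const multiClient = await createMCPClient({
  servers: {
    "new-relic-ai-llm-observability": {
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
    "another-mcp-server": {
      url: "https://example.com/another-server/mcp",
    },
  },
});

// All tools from every configured server are merged into one record.
const allTools = await multiClient.getTools();
console.log(Object.keys(allTools));
```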

Connect New Relic AI (LLM Observability) to Mastra AI

Get your token, paste the configuration, and start using 10 tools in under 2 minutes. No API key management needed.