Front MCP Server for Mastra AI: 12 tools, connect in under 2 minutes
Mastra AI is a TypeScript-native agent framework built for modern web stacks. Connect Front through Vinkius, and Mastra agents discover all tools automatically: type-safe, streaming-ready, and deployable anywhere Node.js runs.
Vinkius supports streamable HTTP and SSE.
```typescript
import { Agent } from "@mastra/core/agent";
import { createMCPClient } from "@mastra/mcp";
import { openai } from "@ai-sdk/openai";

async function main() {
  // Your Vinkius token: get it at cloud.vinkius.com
  const mcpClient = await createMCPClient({
    servers: {
      "front": {
        url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
      },
    },
  });

  // Discover all 12 Front tools exposed by the MCP server.
  const tools = await mcpClient.getTools();

  const agent = new Agent({
    name: "Front Agent",
    instructions:
      "You help users interact with Front " +
      "using 12 tools.",
    model: openai("gpt-4o"),
    tools,
  });

  const result = await agent.generate(
    "What can I do with Front?"
  );
  console.log(result.text);
}

main();
```
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.
About Front MCP Server
Connect your Front account to any AI agent to automate your customer communication and shared inbox workflows through the Model Context Protocol (MCP). Front is a customer operations platform that enables teams to manage shared emails, SMS, and chats collaboratively. This MCP server enables you to track active conversations, assign messages, and fetch thread histories directly through natural conversation.
Mastra's agent abstraction provides a clean separation between LLM logic and Front tool infrastructure. Connect 12 tools through Vinkius and use Mastra's built-in workflow engine to chain tool calls with conditional logic, retries, and parallel execution; deploy to any Node.js host in one command.
Key Features
- Shared Inbox Management — List all accessible shared inboxes and retrieve the specific conversations routed to them.
- Conversation Tracking — Search and list all customer conversations, checking their current status (open, archived) and assigned owners.
- Message Threading — Fetch the complete message history for any specific conversation to maintain context before replying.
- Collaborative Replies — Draft and send replies to active conversations directly from your chat interface on behalf of a teammate.
- Status Automation — Programmatically update conversation statuses (e.g., archiving resolved issues) to keep inboxes clean.
- Team & Contact Discovery — List all workspace teammates and customer contacts to ensure accurate routing and messaging.
The Front MCP Server exposes 12 tools through Vinkius. Connect it to Mastra AI in under two minutes: no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
How to Connect Front to Mastra AI via MCP
Follow these steps to integrate the Front MCP Server with Mastra AI.
1. Install dependencies — run `npm install @mastra/core @mastra/mcp @ai-sdk/openai`
2. Replace the token — replace [YOUR_TOKEN_HERE] with your Vinkius token
3. Run the agent — save the code above to `agent.ts` and run `npx tsx agent.ts`
4. Explore tools — Mastra discovers all 12 Front tools via MCP
Why Use Mastra AI with the Front MCP Server
Mastra AI provides unique advantages when paired with Front through the Model Context Protocol.
Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure; add Front without touching business code
Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation
TypeScript-native: full type inference for every Front tool response with IDE autocomplete and compile-time checks
One-command deployment to any Node.js host: Vercel, Railway, Fly.io, or your own infrastructure
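The retry and chaining behavior described above can be sketched in plain TypeScript. The `withRetry` helper below is illustrative only, not part of Mastra's API; you would wrap each agent or tool call (e.g. `() => agent.generate("...")`) in it before composing steps into a pipeline.

```typescript
// Illustrative retry wrapper (not a Mastra API): retries an async step
// with exponential backoff before giving up.
async function withRetry<T>(
  step: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await step();
    } catch (err) {
      lastError = err;
      // Back off 250ms, 500ms, 1000ms, ... between attempts.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

A step in a typed pipeline then becomes, for example, `const inboxes = await withRetry(() => agent.generate("List all shared inboxes"));`.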
Front + Mastra AI Use Cases
Practical scenarios where Mastra AI combined with the Front MCP Server delivers measurable value.
Automated workflows: build multi-step agents that query Front, process results, and trigger downstream actions in a typed pipeline
SaaS integrations: embed Front as a first-class tool in your product's AI features with Mastra's clean agent API
Background jobs: schedule Mastra agents to query Front on a cron and store results in your database automatically
Multi-agent systems: create specialist agents that collaborate using Front tools alongside other MCP servers
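The background-job use case above can be sketched as a simple polling loop. This is an illustrative pattern, not a Mastra API: the `run` callback stands in for something like `() => agent.generate("Search for open conversations")`, and `store` stands in for your database write; a real deployment would use a cron scheduler rather than a fixed iteration count.

```typescript
// Illustrative background-job loop: query Front via an agent callback on a
// fixed interval and persist each result.
async function pollFront(
  run: () => Promise<{ text: string }>,
  store: (text: string) => Promise<void>,
  iterations: number,
  intervalMs: number,
): Promise<void> {
  for (let i = 0; i < iterations; i++) {
    const result = await run();
    await store(result.text);
    // Wait before the next poll (skipped after the final iteration).
    if (i < iterations - 1) {
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}
```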
Front MCP Tools for Mastra AI (12)
These 12 tools become available when you connect Front to Mastra AI via MCP:
- `get_conversation_details` — Get conversation metadata
- `get_inbox_details` — Get inbox metadata
- `list_address_book` — List contacts
- `list_all_conversations` — List all conversations
- `list_conversation_messages` — List thread messages
- `list_inbox_teammates` — List Front teammates
- `list_inbox_threads` — List inbox conversations
- `list_shared_inboxes` — List shared inboxes
- `search_conversations` — Search all conversations with Front query syntax (e.g. "inbox:inb_123 is:open")
- `send_inbox_reply` — Send a reply
- `update_conversation_status` — Update the status (e.g., archived, open) or assignee of a conversation
- `verify_api_status` — Verify connection
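Since the quick-start's `getTools()` call returns the tool set as a plain record keyed by tool name (an assumption about the returned shape), you can pass a subset to a narrower agent. The `pickTools` helper below is a hypothetical utility, not a Mastra API; it selects, for instance, only read-only Front tools for an agent that can look but not send or update.

```typescript
// Illustrative helper (not a Mastra API): keep only the tools whose names
// contain one of the allowed fragments, e.g. the read-only Front tools.
function pickTools<T>(
  tools: Record<string, T>,
  allow: string[],
): Record<string, T> {
  return Object.fromEntries(
    Object.entries(tools).filter(([name]) =>
      allow.some((fragment) => name.includes(fragment)),
    ),
  );
}
```

Usage might look like `const readOnly = pickTools(tools, ["list_", "get_", "search_", "verify_"]);` before constructing the agent.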
Example Prompts for Front in Mastra AI
Ready-to-use prompts you can give your Mastra AI agent to start working with Front immediately.
"List all shared inboxes in my Front account."
"Search for open conversations in the Support inbox."
"Archive conversation 'cnv_987'."
Troubleshooting Front MCP Server with Mastra AI
Common issues when connecting Front to Mastra AI through Vinkius, and how to resolve them.
createMCPClient not exported
Make sure the MCP package is installed: `npm install @mastra/mcp`.

Front + Mastra AI FAQ
Common questions about integrating Front MCP Server with Mastra AI.
How does Mastra AI connect to MCP servers?
Create an MCP client with the server URL and pass its tools to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
Can Mastra agents use tools from multiple servers?
Yes. Add multiple entries to the servers map when creating the client, and the discovered tools from every server are merged into one tool set for the agent.
Does Mastra support workflow orchestration?
Yes. Mastra's built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution.
Connect Front with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Connect Front to Mastra AI
Get your token, paste the configuration, and start using 12 tools in under 2 minutes. No API key management needed.
