Bring Front's Shared Inbox to Mastra AI
Learn how to connect Front to Mastra AI and start using 12 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the Front MCP Server?
Connect your Front account to any AI agent and take full control of your team's customer communication and shared inbox workflows through natural conversation.
What you can do
- Conversation Orchestration — List and manage customer conversations programmatically, including updating statuses (open, archived, spam) and assigning teammates
- Message Intelligence — Retrieve complete message histories and metadata for any conversation to perform deep analysis and sentiment tracking
- Omnichannel Support — Monitor multiple communication streams including Email, Chat, and SMS from a single unified AI interface
- Team Collaboration — Manage team contacts and retrieve teammate profiles to coordinate internal routing and workload distribution
- Operational Visibility — Get a comprehensive overview of shared inboxes and active channels using natural language commands
How it works
1. Subscribe to this server
2. Retrieve your API Token from Front (Settings > Developers > API Tokens)
3. Start managing your unified inbox from Claude, Cursor, or any MCP client
No more manual toggling between different communication channels. Your AI acts as your dedicated support and communication coordinator.
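Once subscribed, wiring Front into a Mastra agent takes a few lines. A minimal sketch, assuming an ESM project with top-level await, the OpenAI model provider, and a placeholder server URL (use the endpoint from your Vinkius dashboard):

```typescript
import { MCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// Point the client at your Front MCP server endpoint
// (placeholder URL — substitute the one Vinkius gives you)
const mcp = new MCPClient({
  servers: {
    front: {
      url: new URL("https://mcp.example.com/front"),
    },
  },
});

// The agent discovers all 12 Front tools automatically
const frontAgent = new Agent({
  name: "Front Inbox Coordinator",
  instructions:
    "You triage, assign, and reply to conversations in the team's Front shared inbox.",
  model: openai("gpt-4o"),
  tools: await mcp.getTools(),
});
```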
Who is this for?
- Support Teams — instantly triage high-volume inboxes and assign urgent conversations using natural language
- Customer Success Managers — retrieve full interaction histories and update contact details without leaving your workspace
- Operations Leads — monitor channel activity and manage team assignments across shared mailboxes
Built-in capabilities (12)
Check connection
Get contact details
Get conversation info
Read message details
List communication channels
Get message history
List team conversations
List team inboxes
List your contacts
Send a message
Find conversations
Modify conversation
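Building on the client above, a quick way to confirm all 12 capabilities were discovered is to print the tool keys Mastra registers. The exact key names below are illustrative; inspect your own client's output:

```typescript
// List the tool names Mastra discovered from the Front server
const tools = await mcp.getTools();
console.log(Object.keys(tools));
// e.g. ["front_check_connection", "front_list_conversations", ...] (names illustrative)
```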
Why Mastra AI?
Mastra's agent abstraction provides a clean separation between LLM logic and Front tool infrastructure. Connect all 12 tools through Vinkius, chain tool calls with conditional logic, retries, and parallel execution using Mastra's built-in workflow engine, and deploy to any Node.js host in one command.
- Mastra's agent abstraction provides a clean separation between LLM logic and tool infrastructure, so you can add Front without touching business code
- Built-in workflow engine chains MCP tool calls with conditional logic, retries, and parallel execution for complex automation
- TypeScript-native: full type inference for every Front tool response, with IDE autocomplete and compile-time checks
- One-command deployment to any Node.js host: Vercel, Railway, Fly.io, or your own infrastructure
Front in Mastra AI
Front and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Front to Mastra AI through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| Capability | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Front in Mastra AI
The Front MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 12 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Mastra AI only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Front for Mastra AI
Every tool call from Mastra AI to the Front MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
How do I create an API Token in Front?
Log in to Front, go to Settings > Developers > API Tokens, and click Create Token. Ensure you select the appropriate scopes for your needs.
Can I search for conversations by status?
Yes! The list_conversations tool allows you to filter results by statuses like 'open', 'archived', 'deleted', or 'spam'.
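Once the agent has the Front tools (as in the `frontAgent` sketch above), you can drive this filter through plain natural language:

```typescript
// The agent selects the list_conversations tool and applies the status filter
const result = await frontAgent.generate(
  "Find all open conversations in the support inbox and summarize the three oldest."
);
console.log(result.text);
```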
How do I send a reply via AI?
Use the reply_to_conversation tool by providing the conversationId and your message body. You can also specify an author_id to send as a specific teammate.
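A sketch of the input described above; the IDs are hypothetical placeholders, so substitute real values from your Front account:

```typescript
// Hypothetical IDs — substitute real ones from your Front account
const replyInput = {
  conversationId: "cnv_12345", // which conversation to reply to
  body: "Thanks for flagging this — a fix ships in tomorrow's release.",
  author_id: "tea_678",        // optional: send as a specific teammate
};

await frontAgent.generate(
  `Reply to conversation ${replyInput.conversationId} as teammate ${replyInput.author_id}: ${replyInput.body}`
);
```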
How does Mastra AI connect to MCP servers?
Create an MCPClient with the server URL and pass it to your agent. Mastra discovers all tools and makes them available with full TypeScript types.
Can Mastra agents use tools from multiple servers?
Yes. Configure multiple servers in a single MCPClient (or merge tools from several clients into the agent's tools map). Mastra merges all tool schemas, and the agent can call any tool from any server.
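A sketch of the multi-server pattern, assuming a second (hypothetical) server alongside Front; both endpoints are placeholders:

```typescript
import { MCPClient } from "@mastra/mcp";

// Placeholder URLs — use your own server endpoints
const mcp = new MCPClient({
  servers: {
    front: { url: new URL("https://mcp.example.com/front") },
    crm: { url: new URL("https://mcp.example.com/crm") }, // hypothetical second server
  },
});

// Tools from both servers are merged and namespaced by server name
const tools = await mcp.getTools();
```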
Does Mastra support workflow orchestration?
Yes. Mastra has a built-in workflow engine that lets you chain MCP tool calls with branching logic, error handling, and parallel execution.
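A minimal workflow sketch using Mastra's createWorkflow/createStep API. The step logic is illustrative and reuses the `frontAgent` from the setup sketch above:

```typescript
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

// Illustrative step: ask the agent to triage, then pass the summary downstream
const triage = createStep({
  id: "triage-open-conversations",
  inputSchema: z.object({ inbox: z.string() }),
  outputSchema: z.object({ summary: z.string() }),
  execute: async ({ inputData }) => {
    const res = await frontAgent.generate(
      `Triage open conversations in the ${inputData.inbox} inbox and flag anything urgent.`
    );
    return { summary: res.text };
  },
});

const notify = createStep({
  id: "notify-team",
  inputSchema: z.object({ summary: z.string() }),
  outputSchema: z.object({ sent: z.boolean() }),
  execute: async ({ inputData }) => {
    // e.g. post the triage summary back to the team via Front
    await frontAgent.generate(
      `Send this triage summary to the team: ${inputData.summary}`
    );
    return { sent: true };
  },
});

export const triageWorkflow = createWorkflow({
  id: "front-triage",
  inputSchema: z.object({ inbox: z.string() }),
  outputSchema: z.object({ sent: z.boolean() }),
})
  .then(triage)
  .then(notify)
  .commit();
```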
Seeing "createMCPClient is not exported"?
Install the MCP package (npm install @mastra/mcp), then import MCPClient from it, as shown in the examples above.
