Bring Cloud Telephony
to LlamaIndex
Learn how to connect TextGrid to LlamaIndex and start using 12 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the TextGrid MCP Server?
Connect your TextGrid account to any AI agent and simplify how you manage your cloud telephony, global messaging, and communication logs through natural conversation.
What you can do
- Global Messaging — Send instant SMS and MMS messages to recipients worldwide with full status tracking.
- Voice Call Control — Initiate outbound voice calls and retrieve a complete history of inbound and outbound communication.
- Phone Number Management — List your active numbers and search for new available phone numbers across different countries.
- Usage & Cost Tracking — Retrieve detailed usage statistics and records to monitor your communication budget.
- Account Oversight — Fetch account metadata, API keys, and physical addresses for regulatory compliance.
- Operational Monitoring — Check API health and verify connectivity directly from the agent.
How it works
1. Subscribe to this server
2. Enter your TextGrid Account SID and Auth Token (found in your account dashboard)
3. Start managing your cloud communications from Claude, Cursor, or any MCP client
Who is this for?
- Developers & DevOps — quickly test SMS delivery and initiate voice calls via simple AI commands.
- Customer Support Teams — retrieve message histories and verify delivery statuses directly from the workspace.
- Operations Managers — monitor usage costs and manage phone number inventory via the AI assistant.
Built-in capabilities (12)
Verify TextGrid API status
Get account profile and balance
Get details for a specific message
Retrieve usage and cost records
Start a new voice call
List addresses for regulatory compliance
List your active phone numbers
List account API keys
List sent and received SMS messages
List voice call history
Search for new phone numbers
Send a new SMS or MMS
Why LlamaIndex?
Connect 12 TextGrid tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn, making it ideal for hybrid search, data enrichment, and analytical workflows.
- Data-first architecture — LlamaIndex agents combine TextGrid tool responses with indexed documents for comprehensive, grounded answers
- Query pipeline framework — chain TextGrid tool calls with transformations, filters, and re-rankers in a typed pipeline
- Multi-source reasoning — agents can query TextGrid, a vector store, and a SQL database in a single turn and synthesize results
- Observability integrations — see exactly which TextGrid tools were called, what data was returned, and how it influenced the final answer
TextGrid in LlamaIndex
TextGrid and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect TextGrid to LlamaIndex through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for TextGrid in LlamaIndex
The TextGrid MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 12 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in LlamaIndex only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures
TextGrid for LlamaIndex
Every tool call from LlamaIndex to the TextGrid MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
Can I check the delivery status of a specific message via AI?
Yes! Use the get_message_details tool and provide the Message SID. Your agent will retrieve the real-time delivery status (e.g., sent, delivered, failed) and other metadata.
How do I find a new available phone number in the US?
Run the search_available_numbers query and set the countryCode parameter to 'US'. The agent will return a list of phone numbers available for purchase in that region.
Is it possible to see the costs associated with my messaging usage?
Absolutely. Use the get_usage_statistics tool to retrieve detailed usage records and associated costs for your SMS and voice services directly from TextGrid.
How does LlamaIndex connect to MCP servers?
Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as query engine tools compatible with any LlamaIndex agent.
Can I combine MCP tools with vector stores?
Yes. LlamaIndex agents can query TextGrid tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
Does LlamaIndex support async MCP calls?
Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
Seeing a `BasicMCPClient not found` import error? Install the MCP adapter package: `pip install llama-index-tools-mcp`
