3,400+ MCP servers ready to use

Bring Conversational AI
to LlamaIndex

Learn how to connect Voiceflow to LlamaIndex and start using 12 AI agent tools in minutes. Fully managed, enterprise secure, and ready to use without writing a single line of code.

Delete State · Get Feedback · Get Project · Get State · Get Transcript · Interact · List KB Docs · List KB Tags · List Projects · List Transcripts · Query KB · Save State

What is the Voiceflow MCP Server?

Connect your Voiceflow account to any AI agent and simplify how you build, test, and monitor your conversational assistants through natural language conversation.

What you can do

  • Agent Interaction — Send messages and trigger actions in your Voiceflow agents to test responses and flows instantly.
  • Knowledge Base (RAG) Control — Query your agent's KB directly for answers and list uploaded documents and tags.
  • State Management — Retrieve, update, or reset user conversation states and variables to debug complex logic.
  • Transcript Analysis — List and fetch full conversation logs for any project to monitor user interactions.
  • Operational Monitoring — Retrieve user feedback (upvotes/downvotes) and monitor project configurations in real-time.
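
Under the hood, each capability above is invoked as a standard MCP tools/call request. Here is a minimal sketch of the JSON-RPC payload a client would send to the interact tool; the argument names are illustrative assumptions, and the server's published tool schema is authoritative:

```python
import json

# Illustrative MCP "tools/call" request for the Voiceflow `interact` tool.
# NOTE: the keys inside "arguments" are hypothetical placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "interact",
        "arguments": {
            "user_id": "demo-user",                       # hypothetical field
            "action": {"type": "text", "payload": "hi"},  # hypothetical field
        },
    },
}

# Serialize the request as it would go over the wire.
payload = json.dumps(request)
```

MCP-compatible clients such as Claude or Cursor build and send this envelope for you; the sketch only shows what "tool call" means at the protocol level.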

How it works

1. Subscribe to this server
2. Enter your Voiceflow API Key and Version ID
3. Start managing your conversational ecosystem from Claude, Cursor, or any MCP-compatible client

Who is this for?

  • Conversation Designers — quickly test agent responses and query the knowledge base via simple AI commands.
  • AI Developers — debug user states and inspect transcripts during the development and testing cycle.
  • Product Managers — monitor user feedback and conversation logs directly from the workspace.

Built-in capabilities (12)

delete_state

Reset user session

get_feedback

Get user feedback

get_project

Get project details

get_state

Get user conversation state

get_transcript

Get transcript details

interact

Send message to Voiceflow agent

list_kb_docs

List KB documents

list_kb_tags

List KB document tags

list_projects

List Voiceflow projects

list_transcripts

List conversation transcripts

query_kb

Ask the Knowledge Base

save_state

Update user state/variables

Why LlamaIndex?

LlamaIndex's data-first architecture is a natural fit for Voiceflow tooling: connect all 12 tools through Vinkius and query live conversational data alongside vector stores and SQL databases in a single turn, ideal for hybrid search, data enrichment, and analytical workflows.

  • Data-first architecture: LlamaIndex agents combine Voiceflow tool responses with indexed documents for comprehensive, grounded answers

  • Query pipeline framework lets you chain Voiceflow tool calls with transformations, filters, and re-rankers in a typed pipeline

  • Multi-source reasoning: agents can query Voiceflow, a vector store, and a SQL database in a single turn and synthesize results

  • Observability integrations show exactly what Voiceflow tools were called, what data was returned, and how it influenced the final answer
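
The multi-source pattern can be sketched as follows, assuming llama-index and llama-index-tools-mcp are installed; the server URL, the internal_docs tool name, and the system prompt are placeholders, not Vinkius-provided values:

```python
import asyncio

async def build_hybrid_agent(mcp_url: str, documents):
    # Local imports: this sketch needs llama-index and llama-index-tools-mcp
    # installed only when the function is actually called.
    from llama_index.core import VectorStoreIndex
    from llama_index.core.agent.workflow import FunctionAgent
    from llama_index.core.tools import QueryEngineTool
    from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

    # Live Voiceflow tools served through the Vinkius MCP endpoint.
    voiceflow_tools = await McpToolSpec(
        client=BasicMCPClient(mcp_url)
    ).to_tool_list_async()

    # Indexed documents exposed as one more tool on the same agent.
    index = VectorStoreIndex.from_documents(documents)
    docs_tool = QueryEngineTool.from_defaults(
        index.as_query_engine(),
        name="internal_docs",  # hypothetical tool name
        description="Search indexed internal documents",
    )

    # One agent, two data sources: live Voiceflow data plus embedded docs.
    return FunctionAgent(
        tools=voiceflow_tools + [docs_tool],
        system_prompt="Answer using Voiceflow data and indexed documents.",
    )
```

The agent can then decide per question whether to hit the live Voiceflow tools, the vector index, or both, and synthesize one grounded answer.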

See it in action

Voiceflow in LlamaIndex

High Security · Kill Switch · Plug and Play
Why Vinkius

Voiceflow and 3,400+ other MCP servers. One platform. One governance layer.

Teams that connect Voiceflow to LlamaIndex through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.

3,400+ MCP servers ready
<40ms cold start
60% token savings
Feature                      Raw MCP                  Vinkius
Server catalog               Find and host yourself   3,400+ managed
Infrastructure               Self-hosted              Sandboxed V8 isolates
Credential handling          Plaintext in config      Vault + runtime injection
Data loss prevention         None                     Configurable DLP policies
Kill switch                  None                     Global instant shutdown
Financial circuit breakers   None                     Per-server limits + alerts
Audit trail                  None                     Ed25519 signed logs
SIEM log streaming           None                     Splunk, Datadog, Webhook
Honeytokens                  None                     Canary alerts on leak
Custom domains               Not applicable           DNS challenge verified
GDPR compliance              Manual effort            Automated purge + export
Enterprise Security

Why teams choose Vinkius for Voiceflow in LlamaIndex

The Voiceflow MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 12 tools execute in hardened sandboxes optimized for native MCP execution.

Your AI agents in LlamaIndex only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

Voiceflow
Fully Managed · Vinkius Servers
60% · Token savings
High Security · Enterprise-grade
IAM · Access control
EU AI Act · Compliant
DLP · Data protection
V8 Isolate · Sandboxed
Ed25519 · Audit chain
<40ms · Kill switch
Stream every event to Splunk, Datadog, or your own webhook in real-time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS, a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure.

The Vinkius Advantage

How Vinkius secures Voiceflow for LlamaIndex

Every tool call from LlamaIndex to the Voiceflow MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.

<40ms · Cold start
Ed25519 · Signed audit chain
60% · Token savings
FAQ

Frequently asked questions

01

Can I query my Voiceflow Knowledge Base directly via AI?

Yes! Use the query_kb tool with your question. Your agent will trigger the Voiceflow RAG system and return the answer based on your uploaded documents.
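
As a sketch, the same lookup can also be made programmatically through the MCP client; the server URL is a placeholder, and the question argument name is an assumption, so check the tool's published schema:

```python
import asyncio

async def ask_kb(server_url: str, question: str):
    # Requires: pip install llama-index-tools-mcp
    from llama_index.tools.mcp import BasicMCPClient

    client = BasicMCPClient(server_url)
    # call_tool sends an MCP tools/call request for the named tool.
    # The "question" key is a hypothetical argument name.
    return await client.call_tool("query_kb", {"question": question})
```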

02

How do I see the transcripts for a specific project?

Run the list_transcripts query with your Project ID. The agent will return a list of past conversation logs, which you can then inspect using get_transcript.

03

Is it possible to reset a user's session via AI?

Absolutely. Use the delete_state tool and provide the User ID. This will permanently clear the conversation history and variables for that specific session.

04

How does LlamaIndex connect to MCP servers?

Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as query engine tools compatible with any LlamaIndex agent.
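
A minimal async sketch of that flow, assuming llama-index-tools-mcp is installed and using a placeholder server URL:

```python
import asyncio

async def load_mcp_tools(server_url: str):
    # Requires: pip install llama-index-tools-mcp
    from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

    client = BasicMCPClient(server_url)     # connect to the MCP endpoint
    spec = McpToolSpec(client=client)       # discover the server's tools
    return await spec.to_tool_list_async()  # LlamaIndex FunctionTool objects
```

The returned tools plug straight into any LlamaIndex agent's tools list.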

05

Can I combine MCP tools with vector stores?

Yes. LlamaIndex agents can query Voiceflow tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.

06

Does LlamaIndex support async MCP calls?

Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
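
The fan-out pattern can be sketched with stub coroutines standing in for real MCP tool calls:

```python
import asyncio

async def fetch(tool_name: str) -> str:
    # Stand-in for an awaitable MCP tool call (e.g. list_projects).
    await asyncio.sleep(0)
    return f"{tool_name}: ok"

async def main() -> list[str]:
    # Issue several tool calls concurrently and gather all results.
    return await asyncio.gather(
        fetch("list_projects"),
        fetch("get_feedback"),
        fetch("list_transcripts"),
    )

results = asyncio.run(main())
```

With real tools, each `fetch` would be an awaited MCP call; `asyncio.gather` keeps them in flight concurrently instead of serializing the round trips.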

07

I get a "BasicMCPClient not found" error. What should I do?

Install the MCP tools package: pip install llama-index-tools-mcp