Kong (AI API Gateway) MCP Server
Manage your API Gateway via Kong — orchestrate services, routes, and AI plugins directly from your agent.
Vinkius supports streamable HTTP and SSE.

* Every MCP server runs on Vinkius-managed infrastructure inside AWS - a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure
What is the Kong MCP Server?
The Kong MCP Server gives AI agents like Claude, ChatGPT, and Cursor direct access to Kong via 10 tools. Manage your API Gateway via Kong — orchestrate services, routes, and AI plugins directly from your agent. Powered by Vinkius: no API keys to manage, no infrastructure to run, and you can connect in under two minutes.
Built-in capabilities (10)
Tools for your AI Agents to operate Kong
Ask your AI agent "List all registered services in my Kong Gateway" and get the answer without opening a single dashboard. With 10 tools connected to real Kong data, your agents reason over live information, cross-reference it with other MCP servers, and deliver insights you would spend hours assembling manually.
Works with Claude, ChatGPT, Cursor, and any MCP-compatible client. Powered by Vinkius: your credentials never touch the AI model, and every request is auditable. Connect in under two minutes.
Why teams choose Vinkius
One subscription gives you access to thousands of MCP servers - and you can deploy your own to the Vinkius Edge. Your AI agents only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure and security, zero maintenance.
Build your own MCP Server with our secure development framework →
Vinkius works with every AI agent you already use
…and any MCP-compatible client
Kong (AI API Gateway) MCP Server capabilities
10 tools:
- Apply a new Plugin (like AI Proxy) to a specific Service. Frequently used for enabling the `ai-proxy` plugin for LLM routing and key encapsulation.
- Generate an API Key credential for a Kong Consumer.
- Create a new Route to expose a Service in Kong.
- Create a new backend Service in Kong. The payload must define the upstream URL, name, and protocol information.
- Delete and permanently remove a Plugin from the Kong Gateway.
- List all Consumer profiles registered in Kong.
- List all enabled Plugins on the Kong Gateway (e.g., Rate Limiting, AI Proxy, Key Auth), whether configured globally or scoped to specific Services/Routes.
- List all routing rules configured in the Kong API Gateway.
- List all Services registered in the Kong API Gateway.
- Update the configuration of an existing Kong Plugin. Useful for adjusting rate limits dynamically or swapping AI model providers under heavy load.
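Each of these tools ultimately maps onto a call to Kong's Admin API. As a rough sketch of the request shape for creating a backend Service (the helper function below is illustrative, not the server's actual implementation; the endpoint is Kong's standard `POST /services`):

```python
import json

# Illustrative helper: the hosted MCP server performs the real Admin API call;
# this only sketches the JSON body Kong's Admin API expects for POST /services.
def build_service_payload(name: str, url: str) -> str:
    return json.dumps({
        "name": name,  # unique Service name inside the gateway
        "url": url,    # shorthand for protocol, host, port, and path
    })

print(build_service_payload("billing-api", "https://billing.internal:8443/v1"))
```

The same pattern applies to the other create/update tools: the agent supplies the payload, and the server issues the corresponding Admin API request.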
What the Kong (AI API Gateway) MCP Server unlocks
Connect your Kong API Gateway instance to any AI agent and take full control of your API lifecycle and AI traffic management through natural conversation.
What you can do
- Service Orchestration — List backend services and create new upstream definitions, specifying URLs and protocols, directly from your agent
- Route Management — Configure inbound routing rules to map client requests to backend services based on specific paths or hostnames
- AI Plugin Control — Apply and configure the `ai-proxy` plugin to enable LLM routing, model providers, and key encapsulation securely
- Operational Patching — Update existing plugin configurations in real time, allowing you to adjust rate limits or swap AI models dynamically
- Consumer CRM — Manage consumer profiles and generate API keys for `key-auth` plugins to track specific user or tenant usage
- Infrastructure Audit — Discover enabled plugins across your gateway and remove unused modules instantly to maintain a clean proxy pipeline
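To make AI Plugin Control concrete, here is a hedged sketch of an `ai-proxy` plugin object as accepted by recent Kong Gateway releases. The helper function is hypothetical, and the exact config schema can vary between gateway versions, so check your release's plugin reference before relying on these field names:

```python
import json

def build_ai_proxy_plugin(provider: str, model: str, api_key: str) -> dict:
    # Plugin object for POST /services/{service}/plugins.
    # Field names follow the ai-proxy schema in recent Kong releases;
    # treat them as an illustration, not a guaranteed contract.
    return {
        "name": "ai-proxy",
        "config": {
            "route_type": "llm/v1/chat",  # proxy OpenAI-style chat completions
            "auth": {
                "header_name": "Authorization",
                "header_value": f"Bearer {api_key}",  # key stays in the gateway
            },
            "model": {"provider": provider, "name": model},
        },
    }

print(json.dumps(build_ai_proxy_plugin("openai", "gpt-4o", "sk-example"), indent=2))
```

Because the upstream API key lives in the plugin config, clients call the gateway without ever seeing the provider credential — the "key encapsulation" the list above refers to.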
How it works
1. Subscribe to this server
2. Enter your Kong Admin URL and Admin Token
3. Start managing your API connectivity from Claude, Cursor, or any MCP-compatible client
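Once subscribed, pointing a client at the server is typically a short config entry. The snippet below is illustrative only: the server URL is a placeholder, and the exact keys and schema vary from client to client, so follow the setup guide for your specific tool:

```json
{
  "mcpServers": {
    "kong": {
      "url": "https://mcp.vinkius.example/kong",
      "transport": "streamable-http"
    }
  }
}
```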
Who is this for?
- Platform Engineers — manage global gateway configurations and audit enabled plugins through natural conversation
- Backend Developers — create new services and routes without leaving your terminal or switching to the Kong Manager UI
- AI Ops Teams — monitor and adjust AI Proxy settings to optimize LLM usage and provider costs across the organization
Frequently asked questions about the Kong (AI API Gateway) MCP Server
Can I use this server to manage Kong's AI Proxy plugin?
Absolutely. The create_ai_plugin tool is specifically designed to inject the ai-proxy plugin onto Services. You can define providers like OpenAI or Anthropic and manage model routing directly through your agent.
How do I create a new API route through a conversation?
Use the create_route tool and provide a JSON payload defining the paths and the Service ID it should point to. Your agent will handle the Admin API call to provision the routing rule instantly.
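For example, a minimal Route payload might look like the following. The helper and the placeholder Service id are illustrative; the field names (`paths`, `service.id`) follow Kong's standard Admin API for `POST /routes`:

```python
import json

# Illustrative payload for the create_route tool: Kong's Admin API creates
# Routes via POST /routes, referencing the target Service by id.
def build_route_payload(name: str, paths: list[str], service_id: str) -> str:
    return json.dumps({
        "name": name,
        "paths": paths,                 # inbound paths to match, e.g. ["/billing"]
        "service": {"id": service_id},  # the Service this Route exposes
    })

print(build_route_payload("billing-route", ["/billing"], "<service-uuid>"))
```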
Can my agent generate new API keys for consumers?
Yes. The create_consumer_key tool allows you to provision new credentials for specific Consumers. This is perfect for onboarding new tenants or rotating keys for downstream applications securely.
Connect Kong (AI API Gateway) with your favorite client
Step-by-step setup guides for every MCP-compatible client and framework:
Anthropic's native desktop app for Claude with built-in MCP support.
AI-first code editor with integrated LLM-powered coding assistance.
GitHub Copilot in VS Code with Agent mode and MCP support.
Purpose-built IDE for agentic AI coding workflows.
Autonomous AI coding agent that runs inside VS Code.
Anthropic's agentic CLI for terminal-first development.
Python SDK for building production-grade OpenAI agent workflows.
Google's framework for building production AI agents.
Type-safe agent development for Python with first-class MCP support.
TypeScript toolkit for building AI-powered web applications.
TypeScript-native agent framework for modern web stacks.
Python framework for orchestrating collaborative AI agent crews.
Leading Python framework for composable LLM applications.
Data-aware AI agent framework for structured and unstructured sources.
Microsoft's framework for multi-agent collaborative conversations.
Give your AI agents the power of Kong MCP Server
Production-grade Kong (AI API Gateway) MCP Server. Verified, monitored, and maintained by Vinkius. Ready for your AI agents — connect and start using immediately.