Lingyi Wanwu MCP Server
Orchestrate Lingyi Wanwu AI models — manage chat completions, embeddings, and monitor Yi model performance directly from any AI agent.
Vinkius AI Gateway supports streamable HTTP and SSE.
Works with every AI agent you already use
…and any MCP-compatible client
What is the Lingyi Wanwu MCP Server?
The Lingyi Wanwu MCP Server gives AI agents like Claude, ChatGPT, and Cursor direct access to Lingyi Wanwu (01.AI). Send chat completions, generate embeddings, and monitor Yi model performance directly from any AI agent. Powered by the Vinkius AI Gateway: no glue code, no infrastructure, connect in under 2 minutes.
Lingyi Wanwu MCP Server: see your AI Agent in action
Built-in capabilities (5)
chat_completions
Send a message to a Yi model
check_moderation
Check content for policy violations
get_embeddings
Generate text embeddings
get_usage
Retrieve account usage statistics
list_models
List available Yi models
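As a sketch of how an agent invokes one of these capabilities: MCP clients send JSON-RPC 2.0 `tools/call` requests, so a `chat_completions` call might look like the following. The argument names `model` and `messages` are assumptions for illustration; the authoritative schema comes from the server's `tools/list` response.

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 tools/call request, the envelope MCP clients use."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical arguments -- check the server's tools/list response for the real schema.
request = build_tool_call(
    "chat_completions",
    {
        "model": "yi-large",
        "messages": [{"role": "user", "content": "Summarize MCP in one sentence."}],
    },
)
print(json.dumps(request, indent=2))
```

The same envelope works for every tool above; only `params.name` and `params.arguments` change (e.g. `list_models` typically takes an empty arguments object).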
What this connector unlocks
Connect your AI agents to Lingyi Wanwu (01.AI), the high-performance AI lab founded by Dr. Kai-Fu Lee. This MCP server provides 5 tools to automate interactions with the Yi series of large language models, including chat completions, semantic embeddings, content moderation, and account usage monitoring.
What you can do
- Yi Model Interaction — Trigger chat completions with Yi-34B, Yi-Large, and other optimized models using persistent context
- Vector Embeddings — Generate high-dimensional semantic embeddings to power advanced RAG and search workflows
- Model Intelligence — List all available models and retrieve granular technical specifications for each version
- Account Management — Monitor your token consumption and balance programmatically to optimize costs
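The embedding workflow above usually ends in a similarity-ranking step. A minimal sketch in Python, assuming the `get_embeddings` tool returns plain float vectors (the vectors below are toy data, not real Yi embeddings):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec: list[float], doc_vecs: list[list[float]], k: int = 2) -> list[int]:
    """Return the indices of the k document vectors most similar to the query."""
    scored = sorted(
        enumerate(doc_vecs),
        key=lambda pair: cosine_similarity(query_vec, pair[1]),
        reverse=True,
    )
    return [i for i, _ in scored[:k]]
```

In a real RAG pipeline you would embed the query and the document chunks with `get_embeddings`, rank chunks with `top_k`, and pass the winners to `chat_completions` as context.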
How it works
1. Subscribe to this server
2. Log in to the [Lingyi Wanwu Developer Platform](https://platform.lingyiwanwu.com/)
3. Navigate to API Keys and generate a new key
4. Identify the model you wish to use (e.g., yi-large or yi-34b-chat-0205)
5. Insert your API key into the field below to start managing your Yi model workflows
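Once you have a key, you can sanity-check it before pasting it into the gateway. The sketch below builds (but does not send) a chat request, on the assumption that Lingyi Wanwu's API is OpenAI-compatible at `https://api.lingyiwanwu.com/v1`; verify the base URL and model names on the developer platform.

```python
import json

# Assumed base URL -- confirm on the Lingyi Wanwu Developer Platform.
API_BASE = "https://api.lingyiwanwu.com/v1"

def build_chat_request(api_key: str, model: str, prompt: str):
    """Assemble URL, headers, and JSON body for an OpenAI-style chat request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{API_BASE}/chat/completions", headers, json.dumps(body)

url, headers, payload = build_chat_request("YOUR_API_KEY", "yi-large", "ping")
```

Send the result with any HTTP client; a `200` response confirms the key is live before you hand it to the gateway.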
Who is this for?
- AI Developers — automate the integration of high-performance bilingual (EN/CN) models into custom apps
- Knowledge Engineers — build RAG pipelines using Lingyi Wanwu's optimized embedding services
- System Integrators — bridge enterprise platforms with the efficient and powerful Yi foundation models
Give your AI agents the power of Lingyi Wanwu
Access Lingyi Wanwu and 2,500+ MCP servers, ready for your agents to use right now. No glue code. No custom integrations. Just plug in the Vinkius AI Gateway and let your agents work.
