Bring Project Management to LlamaIndex
Learn how to connect COR to LlamaIndex and start using 13 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the COR MCP Server?
Connect your COR account to any AI agent and take full control of your professional services project management and profitability orchestration through natural conversation.
What you can do
- Project Portfolio Orchestration — List all active projects, retrieve detailed status data, and access profitability metrics programmatically
- Task Pipeline Intelligence — Query tasks for any project, retrieve detailed task metadata, and stay on top of your team's delivery in real time
- Profitability Monitoring — Access financial insights and project health metrics directly through your agent to keep growth sustainable
- Time Tracking Discovery — Review recorded time entries to understand workload distribution and project efficiency across your organization
- Resource Architecture — List team members, teams, and user profiles to understand and manage your organizational structure programmatically
- Client Database Access — Query the complete directory of client organizations to keep every project in the right client context
How it works
1. Subscribe to this server
2. Retrieve your Personal API Token from your COR account (Settings > API Tokens)
3. Start managing your professional services growth from Claude, Cursor, or any MCP client
No more manual status updates or overlooked profitability gaps. Your AI acts as your dedicated project coordinator and profitability architect.
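For LlamaIndex, the connection itself is only a few lines of Python. The sketch below is illustrative: the server URL is a placeholder for the endpoint shown in your Vinkius dashboard (your Personal API Token is configured there, not in code), and it assumes the llama-index-tools-mcp package referenced in the FAQ below.

```python
# Minimal sketch: point LlamaIndex at the Vinkius-hosted COR MCP Server.
# The URL below is a placeholder; use the endpoint from your Vinkius dashboard.
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

mcp_client = BasicMCPClient("https://mcp.vinkius.example/cor")  # placeholder URL
tool_spec = McpToolSpec(client=mcp_client)

tools = tool_spec.to_tool_list()  # discovers and wraps the COR tools
print([tool.metadata.name for tool in tools])
```

The resulting tools plug directly into any LlamaIndex agent.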
Who is this for?
- Project Managers — instantly retrieve task lists and project statuses using natural language commands without leaving your creative workspace
- Agency Leads — monitor profitability metrics and team utilization to ensure healthy business operations
- Operations Managers — verify time logs and team assignments to optimize resource allocation through simple AI queries
Built-in capabilities (13)
Check API Status
Create a new project
Get current user details
Get details for a specific project
Get details for a specific task
List customer clients
List COR projects
List defined task types
List tasks (optionally filtered by project ID)
List team users
List users in a team
List organization teams
List recorded time entries
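When an agent only needs a few of these capabilities, the tool spec can be narrowed at discovery time. A minimal sketch, assuming the allowed_tools filter in llama-index-tools-mcp; only get_cor_project and list_cor_teams are confirmed tool names (see the FAQ below), list_cor_projects is an assumption, and the URL is a placeholder:

```python
# Sketch: expose only a read-only subset of the COR tools to an agent.
# Tool names other than get_cor_project and list_cor_teams are assumptions.
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

mcp_client = BasicMCPClient("https://mcp.vinkius.example/cor")  # placeholder URL
tool_spec = McpToolSpec(
    client=mcp_client,
    allowed_tools=["list_cor_projects", "get_cor_project", "list_cor_teams"],
)
tools = tool_spec.to_tool_list()
```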
Why LlamaIndex?
LlamaIndex agents combine COR tool responses with indexed documents for comprehensive, grounded answers. Connect 13 tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn, making it ideal for hybrid search, data enrichment, and analytical workflows.
- Data-first architecture: LlamaIndex agents combine COR tool responses with indexed documents for comprehensive, grounded answers
- Query pipeline framework lets you chain COR tool calls with transformations, filters, and re-rankers in a typed pipeline
- Multi-source reasoning: agents can query COR, a vector store, and a SQL database in a single turn and synthesize results (see the sketch after this list)
- Observability integrations show exactly which COR tools were called, what data was returned, and how it influenced the final answer
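A minimal sketch of that multi-source pattern, assuming the llama-index-tools-mcp and llama-index-llms-openai packages, a local folder of contract documents, and the classic ReActAgent interface (newer releases offer workflow-based agents that accept the same tool list); the endpoint URL, folder path, and model name are placeholders:

```python
# Sketch: one agent over the COR MCP tools plus a local vector index.
# URL, folder path, and model name are illustrative placeholders.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

cor_tools = McpToolSpec(
    client=BasicMCPClient("https://mcp.vinkius.example/cor")
).to_tool_list()

# Index internal documents so answers can be grounded in them as well.
contracts_index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./contracts").load_data()
)
contracts_tool = QueryEngineTool.from_defaults(
    query_engine=contracts_index.as_query_engine(),
    name="contract_docs",
    description="Search indexed client contracts and statements of work.",
)

agent = ReActAgent.from_tools(
    cor_tools + [contracts_tool], llm=OpenAI(model="gpt-4o-mini")
)
print(agent.chat("Which active projects look unprofitable versus their contracts?"))
```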
COR in LlamaIndex
COR and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect COR to LlamaIndex through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for COR in LlamaIndex
The COR MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 13 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in LlamaIndex access only the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures COR for LlamaIndex
Every tool call from LlamaIndex to the COR MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
How do I find my COR API Token?
Log in to your COR account, navigate to Personal Settings > API Tokens, and generate a new Personal API Token.
Can I check project profitability via AI?
Yes! The get_cor_project tool allows your agent to retrieve profitability metrics and financial health data for any specific project.
How do I list my organization's teams?
Use the list_cor_teams tool to retrieve the complete directory of teams along with their unique identifiers.
How does LlamaIndex connect to MCP servers?
Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as query engine tools compatible with any LlamaIndex agent.
Can I combine MCP tools with vector stores?
Yes. LlamaIndex agents can query COR tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
Does LlamaIndex support async MCP calls?
Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
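A minimal sketch of the async path, assuming the to_tool_list_async discovery method from llama-index-tools-mcp; list_cor_teams comes from the FAQ above, list_cor_projects is an assumed tool name, and the URL is a placeholder:

```python
# Sketch: discover COR tools asynchronously and fan out independent calls.
import asyncio

from llama_index.tools.mcp import BasicMCPClient, McpToolSpec


async def main():
    spec = McpToolSpec(client=BasicMCPClient("https://mcp.vinkius.example/cor"))
    tools = {t.metadata.name: t for t in await spec.to_tool_list_async()}

    # Independent read-only calls can run concurrently.
    projects, teams = await asyncio.gather(
        tools["list_cor_projects"].acall(),  # assumed tool name
        tools["list_cor_teams"].acall(),
    )
    print(projects, teams)


asyncio.run(main())
```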
What if I see a "BasicMCPClient not found" error?
Install the LlamaIndex MCP adapter package: pip install llama-index-tools-mcp
