
COR MCP Server for LlamaIndex

Give LlamaIndex instant access to 13 tools to Check Cor Status, Create Cor Project, Get Cor Me, and more

Built by Vinkius · GDPR · 13 Tools · Framework

LlamaIndex specializes in data-aware AI agents that connect LLMs to structured and unstructured sources. Add COR as an MCP tool provider through Vinkius and your agents can query, analyze, and act on live data alongside your existing indexes.


The COR app connector for LlamaIndex is a standout in the Productivity category — giving your AI agent 13 tools to work with, ready to go from day one.

Vinkius delivers Streamable HTTP and SSE to any MCP client

python
import asyncio
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

async def main():
    # Your Vinkius token; get it at cloud.vinkius.com
    mcp_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    agent = FunctionAgent(
        tools=tools,
        llm=OpenAI(model="gpt-4o"),
        system_prompt=(
            "You are an assistant with access to COR. "
            "You have 13 tools available."
        ),
    )

    response = await agent.run(
        "What tools are available in COR?"
    )
    print(response)

asyncio.run(main())

COR
  • Fully managed Vinkius servers
  • 60% token savings
  • Enterprise-grade security
  • IAM access control
  • EU AI Act compliant
  • DLP data protection
  • V8 isolate sandboxing
  • Ed25519 audit chain
  • <40ms kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS - a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure

About COR MCP Server

Connect your COR account to any AI agent and take full control of your professional services project management and profitability orchestration through natural conversation.

LlamaIndex agents combine COR tool responses with indexed documents for comprehensive, grounded answers. Connect 13 tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn, ideal for hybrid search, data enrichment, and analytical workflows.
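
A minimal sketch of that hybrid pattern, assuming you have local documents in a ./data folder, an OPENAI_API_KEY in the environment, the standard llama-index package with its default OpenAI embeddings, and the same Vinkius endpoint as the example above:

python
import asyncio

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.core.tools import QueryEngineTool
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

async def main():
    # Live COR tools from the Vinkius endpoint (same URL as the example above)
    mcp_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    cor_tools = await McpToolSpec(client=mcp_client).to_tool_list_async()

    # A local vector index over your own documents (./data is an assumption)
    index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
    docs_tool = QueryEngineTool.from_defaults(
        query_engine=index.as_query_engine(),
        name="internal_docs",
        description="Search indexed internal documents.",
    )

    # One agent, two sources: live COR data plus the embedded document index
    agent = FunctionAgent(
        tools=[*cor_tools, docs_tool],
        llm=OpenAI(model="gpt-4o"),
        system_prompt="Answer using COR data and the internal document index.",
    )

    print(await agent.run(
        "Summarize our active COR projects and cross-reference the internal docs."
    ))

asyncio.run(main())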

What you can do

  • Project Portfolio Orchestration — List all active projects, retrieve detailed high-fidelity status metadata, and access profitability metrics programmatically
  • Task Pipeline Intelligence — Query tasks for any project, retrieve detailed technical metadata, and stay on top of your team's operational delivery in real-time
  • Profitability Monitoring — Access high-fidelity financial insights and project health metrics to ensure sustainable growth directly through your agent
  • Time Tracking Discovery — Access recorded technical time entries to understand workload distribution and project efficiency across your organization
  • Resource Architecture — List team members, teams, and user profiles to understand and orchestrate your organizational structure programmatically
  • Client Database Access — Query the complete high-fidelity directory of client organizations to maintain perfect contextual alignment for every project

The COR MCP Server exposes 13 tools through the Vinkius gateway. Connect it to LlamaIndex in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.

All 13 COR tools available for LlamaIndex

When LlamaIndex connects to COR through Vinkius, your AI agent gets direct access to every tool listed below — spanning project-management, profitability, time-tracking, and more. Every call is secured with network, filesystem, subprocess, and code evaluation entitlements inside a sandboxed runtime. Beyond a simple connection, you get a full AI Gateway with real-time visibility into agent activity, enterprise governance, and optimized token usage.
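
If a particular agent should only see a subset of these tools, recent llama-index-tools-mcp releases let McpToolSpec take an allowed_tools filter (treat the parameter name as an assumption and check your installed version); a minimal sketch using tool names from the list below:

python
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

mcp_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")

# Assumption: allowed_tools restricts which COR tools get wrapped for the agent
read_only_spec = McpToolSpec(
    client=mcp_client,
    allowed_tools=["check_cor_status", "list_cor_projects", "list_cor_tasks"],
)
tools = read_only_spec.to_tool_list()  # sync variant; use to_tool_list_async() in async code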

check_cor_status

Check API Status

create_cor_project

Create a new project

get_cor_me

Get current user details

get_cor_project

Get details for a specific project

get_cor_task

Get details for a specific task

list_cor_clients

List customer clients

list_cor_projects

List COR projects

list_cor_task_types

List defined task types

list_cor_tasks

List tasks, optionally filtered by project ID to isolate specific technical pipelines

list_cor_team_members

List team users

list_cor_team_users

List users in a team

list_cor_teams

List organization teams

list_cor_time_entries

List recorded time entries
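
Any of these tools can also be invoked directly, without running a full agent loop. A minimal sketch, assuming BasicMCPClient exposes an async call_tool(name, arguments) method as in current llama-index-tools-mcp releases and that check_cor_status takes no arguments:

python
import asyncio
from llama_index.tools.mcp import BasicMCPClient

async def main():
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    # Assumption: check_cor_status takes no arguments; adjust per the tool schema
    result = await client.call_tool("check_cor_status", {})
    print(result)

asyncio.run(main())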

Connect COR to LlamaIndex via MCP

Follow these steps to wire COR into LlamaIndex. The entire setup takes under two minutes — your credentials stay safe behind the Vinkius gateway.

01

Install dependencies

Run pip install llama-index-tools-mcp llama-index-llms-openai
02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token
03

Run the agent

Save to agent.py and run: python agent.py
04

Explore tools

The agent discovers 13 tools from COR
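
To verify the discovery step, you can print each wrapped tool's name and description; a small sketch reusing the same Vinkius endpoint:

python
import asyncio
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

async def main():
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    tools = await McpToolSpec(client=client).to_tool_list_async()
    print(f"Discovered {len(tools)} COR tools:")
    for tool in tools:
        # Each MCP tool is wrapped as a LlamaIndex tool with standard metadata
        print(f"- {tool.metadata.name}: {tool.metadata.description}")

asyncio.run(main())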

Why Use LlamaIndex with the COR MCP Server

LlamaIndex provides unique advantages when paired with COR through the Model Context Protocol.

01

Data-first architecture: LlamaIndex agents combine COR tool responses with indexed documents for comprehensive, grounded answers

02

Query pipeline framework lets you chain COR tool calls with transformations, filters, and re-rankers in a typed pipeline

03

Multi-source reasoning: agents can query COR, a vector store, and a SQL database in a single turn and synthesize results

04

Observability integrations show exactly what COR tools were called, what data was returned, and how it influenced the final answer
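
As a concrete example of that visibility, the FunctionAgent workflow API lets you stream events and inspect every tool call as it happens; a sketch assuming your installed LlamaIndex version exports ToolCallResult from llama_index.core.agent.workflow:

python
from llama_index.core.agent.workflow import ToolCallResult

# `agent` is the FunctionAgent built earlier with the COR tools
async def run_with_trace(agent, prompt: str):
    handler = agent.run(prompt)
    async for event in handler.stream_events():
        if isinstance(event, ToolCallResult):
            # Shows which COR tool was called, with what arguments, and what came back
            print(f"tool={event.tool_name} args={event.tool_kwargs}")
            print(f"output={event.tool_output}")
    return await handler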

COR + LlamaIndex Use Cases

Practical scenarios where LlamaIndex combined with the COR MCP Server delivers measurable value.

01

Hybrid search: combine COR real-time data with embedded document indexes for answers that are both current and comprehensive

02

Data enrichment: query COR to augment indexed data with live information before generating user-facing responses

03

Knowledge base agents: build agents that maintain and update knowledge bases by periodically querying COR for fresh data

04

Analytical workflows: chain COR queries with LlamaIndex's data connectors to build multi-source analytical reports
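
For the knowledge-base and analytical patterns above, one simple approach is to pull fresh COR data on a schedule and write it into an index; a rough sketch, assuming list_cor_projects needs no arguments, default OpenAI embeddings are available, and plain-text tool output is acceptable as a document:

python
import asyncio

from llama_index.core import Document, VectorStoreIndex
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

async def refresh_index() -> VectorStoreIndex:
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    tools = await McpToolSpec(client=client).to_tool_list_async()
    list_projects = next(t for t in tools if t.metadata.name == "list_cor_projects")

    # Assumption: list_cor_projects works without arguments
    result = await list_projects.acall()
    doc = Document(text=str(result), metadata={"source": "cor", "tool": "list_cor_projects"})
    return VectorStoreIndex.from_documents([doc])

index = asyncio.run(refresh_index())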

Example Prompts for COR in LlamaIndex

Ready-to-use prompts you can give your LlamaIndex agent to start working with COR immediately.

01

"List all active projects and show their status."

02

"Show tasks assigned to project 'COR Integration'."

03

"Check the team members in the 'Development' team."

Troubleshooting COR MCP Server with LlamaIndex

Common issues when connecting COR to LlamaIndex through the Vinkius gateway, and how to resolve them.

01

BasicMCPClient not found

Install: pip install llama-index-tools-mcp

COR + LlamaIndex FAQ

Common questions about integrating COR MCP Server with LlamaIndex.

01

How does LlamaIndex connect to MCP servers?

Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as function tools compatible with any LlamaIndex agent.
02

Can I combine MCP tools with vector stores?

Yes. LlamaIndex agents can query COR tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
03

Does LlamaIndex support async MCP calls?

Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
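
For example, independent COR lookups can be issued concurrently with asyncio.gather; a sketch assuming check_cor_status, list_cor_projects, and list_cor_teams all work without arguments:

python
import asyncio
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

async def main():
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    tools = {t.metadata.name: t for t in await McpToolSpec(client=client).to_tool_list_async()}

    # Fire three independent COR calls concurrently (assumes no required arguments)
    status, projects, teams = await asyncio.gather(
        tools["check_cor_status"].acall(),
        tools["list_cor_projects"].acall(),
        tools["list_cor_teams"].acall(),
    )
    print(status, projects, teams, sep="\n")

asyncio.run(main())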