Bring Resource Planning to LlamaIndex
Learn how to connect Dime.Scheduler to LlamaIndex and start using 7 AI agent tools in minutes. Fully managed, enterprise secure, and ready to use without writing a single line of code.
What is the Dime.Scheduler MCP Server?
Connect your Dime.Scheduler account to any AI agent and take full control of your resource orchestration and project scheduling workflows through natural conversation.
What you can do
- Job Orchestration — List and manage planning jobs programmatically, retrieving detailed metadata about parent entities and project requirements
- Task Lifecycle Management — Access and track individual units of work (tasks) that need to be scheduled across your resources in real-time
- Appointment Monitoring — List and inspect all appointments on the graphical planning board to maintain a high-fidelity overview of scheduled activities
- Resource Optimization — Retrieve complete directories of planable resources (people, equipment, tools) to understand team availability and capacity
- Category & Marker Intelligence — Access planning categories and time markers directly through your agent to keep your scheduling board perfectly organized
How it works
1. Subscribe to this server
2. Retrieve your X-API-KEY from your Dime.Scheduler instance settings
3. Start managing your resource planning from Claude, Cursor, or any MCP client
No more manual toggling between complex planning boards or digging through task lists. Your AI acts as your dedicated resource coordinator and scheduling strategist.
Who is this for?
- Project Managers — instantly retrieve job statuses and check task planning across multiple resources using natural language commands
- Resource Coordinators — monitor team availability and appointment loads without leaving your communication tools
- Operations Leads — track scheduled equipment and maintain board organization through simple AI queries
Built-in capabilities (7)
Get job details
List all appointments on the planning board
List all planning categories
List all planning jobs
List all planable resources
List all planning tasks
List available time markers
Why LlamaIndex?
LlamaIndex agents combine Dime.Scheduler tool responses with indexed documents for comprehensive, grounded answers. Connect 7 tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn, ideal for hybrid search, data enrichment, and analytical workflows.
- Data-first architecture: LlamaIndex agents combine Dime.Scheduler tool responses with indexed documents for comprehensive, grounded answers
- Query pipeline framework lets you chain Dime.Scheduler tool calls with transformations, filters, and re-rankers in a typed pipeline
- Multi-source reasoning: agents can query Dime.Scheduler, a vector store, and a SQL database in a single turn and synthesize results
- Observability integrations show exactly which Dime.Scheduler tools were called, what data was returned, and how it influenced the final answer
Dime.Scheduler in LlamaIndex
Dime.Scheduler and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Dime.Scheduler to LlamaIndex through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Dime.Scheduler in LlamaIndex
The Dime.Scheduler MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 7 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in LlamaIndex only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Dime.Scheduler for LlamaIndex
Every tool call from LlamaIndex to the Dime.Scheduler MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
How do I find my Dime.Scheduler API Key?
Log in to your Dime.Scheduler instance and navigate to Settings > API to generate or copy your unique X-API-KEY.
What is the difference between a job and a task?
A job is the parent project or order, while a task is the specific unit of work that is scheduled on the planning board.
Can I see real-time team availability?
Yes! The list_resources and list_appointments tools allow your agent to identify open slots and currently scheduled work.
How does LlamaIndex connect to MCP servers?
Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as query engine tools compatible with any LlamaIndex agent.
Can I combine MCP tools with vector stores?
Yes. LlamaIndex agents can query Dime.Scheduler tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
Does LlamaIndex support async MCP calls?
Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
What if I get a `BasicMCPClient not found` error?
The MCP tool package is missing from your environment. Install it with `pip install llama-index-tools-mcp`.
