Bring Lead Management
to LlamaIndex
Learn how to connect Richards CRM to LlamaIndex and start using 11 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the Richards CRM MCP Server?
Connect your Richards Building Supply CRM account to any AI agent and take full control of your exterior contracting orchestration through natural conversation. Powered by the Construct CRM engine, this integration allows you to retrieve lead metadata, manage project lifecycles, and orchestrate material orders directly from your chat interface.
What you can do
- Lead & Prospect Orchestration — List all managed leads, retrieve detailed profile metadata, and create new contacts programmatically to keep your pipeline synchronized.
- Project & Job Lifecycle Management — Access and monitor active projects and retrieve detailed technical metadata directly from the AI interface to keep your exterior jobs on track.
- Estimating & Proposals Intelligence — Access project estimates and monitor proposal statuses via natural language to keep your bidding process optimized.
- Material Ordering & Invoice Control — List and retrieve order metadata and invoices to maintain a clear overview of your supply chain and business expenses.
- Operational Monitoring — Track employee assignments and manage company profile metadata using simple AI commands.
How it works
1. Subscribe to this server
2. Enter your Richards CRM API Key from your account settings
3. Start managing your contracting workflows from Claude, Cursor, or any MCP-compatible client
No more manual order tracking or switching between material-ordering and CRM apps. Your AI acts as a dedicated project coordinator or operations lead.
Who is this for?
- Exterior Contractors & Foremen — quickly retrieve project details and monitor lead status without switching apps.
- Operations Managers — automate the management of material orders and track project history via natural conversation.
- Sales Teams — streamline the retrieval of estimates and monitor proposal health directly within the chat.
Built-in capabilities (11)
Create a new lead
Get company profile details
Get details for a specific lead
Get details of a specific project
List all employees
List all estimates
List all invoices
List all leads
List all purchase orders
List all projects
List all proposals
Why LlamaIndex?
LlamaIndex agents combine Richards CRM tool responses with indexed documents for comprehensive, grounded answers. Connect all 11 tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn: ideal for hybrid search, data enrichment, and analytical workflows.
- Data-first architecture: LlamaIndex agents combine Richards CRM tool responses with indexed documents for comprehensive, grounded answers
- Query pipeline framework lets you chain Richards CRM tool calls with transformations, filters, and re-rankers in a typed pipeline
- Multi-source reasoning: agents can query Richards CRM, a vector store, and a SQL database in a single turn and synthesize results
- Observability integrations show exactly what Richards CRM tools were called, what data was returned, and how it influenced the final answer
Richards CRM in LlamaIndex
Richards CRM and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Richards CRM to LlamaIndex through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Richards CRM in LlamaIndex
The Richards CRM MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 11 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in LlamaIndex only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures
Richards CRM for LlamaIndex
Every tool call from LlamaIndex to the Richards CRM MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
Can my AI automatically find the details and status for a specific project just by providing its ID?
Yes! Use the get_project tool with the Project ID. Your agent will respond with complete metadata, including job status, linked leads, and material requirements in seconds.
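Under the hood, such a request travels as a JSON-RPC 2.0 `tools/call` message, as defined by the MCP specification. The sketch below builds that envelope with only the standard library; the `project_id` argument name is assumed for illustration, since the tool's actual parameter schema is discovered at connection time.

```python
import json

def mcp_tool_call(name: str, arguments: dict, req_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 envelope MCP uses for a tools/call request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# "project_id" is a hypothetical parameter name for illustration.
payload = mcp_tool_call("get_project", {"project_id": "PRJ-1042"})
print(payload)
```

Your agent framework sends this for you; seeing the wire format is mainly useful when debugging with audit logs.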
How do I find my Richards CRM API Key?
Log in to your Richards CRM account, navigate to the integration or API settings section, and you will find your unique secret API key there.
How does LlamaIndex connect to MCP servers?
Use the MCP client adapter to create a connection. LlamaIndex discovers all tools and wraps them as tools compatible with any LlamaIndex agent.
Can I combine MCP tools with vector stores?
Yes. LlamaIndex agents can query Richards CRM tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
Does LlamaIndex support async MCP calls?
Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
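The fan-out pattern an async agent uses can be sketched with the standard library alone. The coroutine below is a stand-in for an awaited MCP tool call, and the tool names are assumed for illustration; a real pipeline would await the wrapped LlamaIndex tools instead.

```python
import asyncio

# Stand-in for an async MCP tool call; names are hypothetical.
async def call_tool(name: str) -> str:
    await asyncio.sleep(0)  # simulate network I/O latency
    return f"{name}: ok"

async def main() -> list[str]:
    # Fan out several independent CRM reads concurrently, as an async agent would.
    return await asyncio.gather(
        call_tool("list_leads"),
        call_tool("list_projects"),
        call_tool("list_invoices"),
    )

results = asyncio.run(main())
print(results)
```

Because the three calls are independent reads, running them concurrently rather than sequentially cuts total latency to roughly that of the slowest call.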
Seeing a `BasicMCPClient not found` error? Install the LlamaIndex MCP adapter: `pip install llama-index-tools-mcp`
