
Cloudflare MCP Server for CrewAI — 25 tools, connect in under 2 minutes

Built by Vinkius · GDPR · 25 Tools · Framework

Connect your CrewAI agents to Cloudflare through the Vinkius Edge — pass the Edge URL in the `mcps` parameter and every Cloudflare tool is auto-discovered at runtime. No credentials to manage, no infrastructure to maintain.

Vinkius supports streamable HTTP and SSE.

```python
from crewai import Agent, Task, Crew

agent = Agent(
    role="Cloudflare Specialist",
    goal="Help users interact with Cloudflare effectively",
    backstory=(
        "You are an expert at leveraging Cloudflare tools "
        "for automation and data analysis."
    ),
    # Your Vinkius token — get it at cloud.vinkius.com
    mcps=["https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp"],
)

task = Task(
    description=(
        "Explore all available tools in Cloudflare "
        "and summarize their capabilities."
    ),
    agent=agent,
    expected_output=(
        "A detailed summary of 25 available tools "
        "and what they can do."
    ),
)

crew = Crew(agents=[agent], tasks=[task])
result = crew.kickoff()
print(result)
```
  • Fully managed Vinkius servers
  • 60% token savings
  • Enterprise-grade security
  • IAM access control
  • EU AI Act compliant
  • DLP data protection
  • V8 isolate sandboxing
  • Ed25519 audit chain
  • <40ms kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time.

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure.

About Cloudflare MCP Server

What you can do

Connect AI agents to Cloudflare's platform for comprehensive edge infrastructure management:

When paired with CrewAI, Cloudflare becomes a first-class tool in your multi-agent workflows. Each agent in the crew can call Cloudflare tools autonomously — one agent queries data, another analyzes results, a third compiles reports — all orchestrated through the Vinkius Edge with zero configuration overhead.

  • Manage Workers — list, inspect, delete serverless functions across your account
  • Control deployments — version history, immediate/gradual rollouts, rollback capabilities
  • Manage secrets — create, list, and delete encrypted environment secrets securely
  • Configure routes — URL patterns that trigger Workers at specific paths or domains
  • Query KV storage — read/write key-value pairs from Workers KV namespaces
  • Execute D1 queries — run SQL queries against Cloudflare's serverless SQLite databases
  • Inspect R2 buckets — list and manage object storage buckets
  • Monitor analytics — zone traffic, Worker invocations, CPU usage, and error rates
  • Tail Worker logs — create real-time logging sessions for debugging in production
  • Purge CDN cache — clear cached content to serve fresh origin data

The Cloudflare MCP Server exposes 25 tools through the Vinkius Edge. Connect it to CrewAI in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.

How to Connect Cloudflare to CrewAI via MCP

Follow these steps to integrate the Cloudflare MCP Server with CrewAI.

01

Install CrewAI

Run `pip install crewai`

02

Replace the token

Replace `[YOUR_TOKEN_HERE]` with your Vinkius token from cloud.vinkius.com

03

Customize the agent

Adjust the role, goal, and backstory to fit your use case

04

Run the crew

Run `python crew.py` — CrewAI auto-discovers 25 tools from Cloudflare
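Before running the crew, it can help to confirm that the Edge URL is reachable from your network. The helper below is an illustrative sketch using only the Python standard library; it is not part of CrewAI or Vinkius:

```python
import urllib.request
import urllib.error

def edge_url_reachable(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers any HTTP response within `timeout`."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server answered, just with an error status, so it is reachable.
        return True
    except (urllib.error.URLError, TimeoutError):
        return False

# e.g. edge_url_reachable("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
print(edge_url_reachable("http://127.0.0.1:9", timeout=2))  # closed port -> False
```

If this returns False for your real Edge URL, check firewalls and proxies before debugging the crew itself.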

Why Use CrewAI with the Cloudflare MCP Server

CrewAI Multi-Agent Orchestration Framework provides unique advantages when paired with Cloudflare through the Model Context Protocol.

01

Multi-agent collaboration lets you decompose complex workflows into specialized roles — one agent researches, another analyzes, a third generates reports — each with access to MCP tools

02

CrewAI's native MCP integration requires zero adapter code: pass the Vinkius Edge URL directly in the `mcps` parameter and agents auto-discover every available tool at runtime

03

Built-in task delegation and shared memory mean agents can pass context between steps without manual state management, enabling multi-hop reasoning across tool calls

04

Sequential and hierarchical crew patterns map naturally to real-world workflows: list Workers → inspect deployments → review analytics → compile findings into actionable reports

Cloudflare + CrewAI Use Cases

Practical scenarios where CrewAI combined with the Cloudflare MCP Server delivers measurable value.

01

Automated multi-step research: a reconnaissance agent queries Cloudflare for raw data, then a second analyst agent cross-references findings and flags anomalies — all without human handoff

02

Scheduled intelligence reports: set up a crew that periodically queries Cloudflare, analyzes trends over time, and generates executive briefings in markdown or PDF format

03

Multi-source enrichment pipelines: chain Cloudflare tools with other MCP servers in the same crew, letting agents correlate data across multiple providers in a single workflow

04

Compliance and audit automation: a compliance agent queries Cloudflare against predefined policy rules, generates deviation reports, and routes findings to the appropriate team

Cloudflare MCP Tools for CrewAI (25)

These 25 tools become available when you connect Cloudflare to CrewAI via MCP:

01

create_deployment

Deploy a specific Worker version to traffic. Strategy can be immediate (100% of traffic at once) or gradual (percentage-based rollout). Requires script name, version ID, and deployment strategy. Use this to roll out new features, roll back to previous versions, or perform canary deployments.

02

create_secret

Create or update a secret for a Cloudflare Worker. Secrets are encrypted at rest and injected at runtime. Requires script name, secret name, and secret value. Common uses: API keys, database passwords, OAuth tokens. The secret becomes available via env.VARIABLE_NAME in your Worker code.

03

create_tail_session

Create a tail logging session for a Cloudflare Worker. Streams console.log() output and exceptions. Returns a tail ID and WebSocket URL for streaming logs. Use this for debugging Workers in production or monitoring error output.

04

create_worker_route

Create a new route pattern for a Cloudflare Worker. Requires zone ID, URL pattern (e.g., "example.com/api/*"), and script name. Use this to expose your Worker at specific URL paths or domains.

05

delete_secret

Delete a secret from a Cloudflare Worker. Requires script name and secret name. After deletion, the Worker will no longer have access to the secret value. Use this to clean up unused secrets or rotate credentials.

06

delete_tail_session

Delete a tail logging session for a Cloudflare Worker. Requires script name and tail ID. Use this to clean up tail sessions when debugging is complete.

07

delete_worker

Delete a Cloudflare Worker script and all its associated resources. This action cannot be undone. Requires the script name. Confirm with the user before proceeding.

08

delete_worker_route

Delete a route pattern from a Cloudflare Worker. Requires zone ID and route ID. Use this to stop serving a Worker at specific URLs.

09

get_kv_key

Get the value of a specific key in a KV namespace. Returns the raw value as JSON. Use this to read configuration values, cached responses, or user data stored in KV.

10

get_worker

Get detailed information about a specific Cloudflare Worker. Requires the script name from list_workers results. Use this to review Worker configuration before making updates or debugging.

11

get_worker_analytics

Get analytics data for a specific Cloudflare Worker. Returns data for recent invocations. Use this to monitor Worker performance, identify errors, or track usage trends.

12

get_worker_version

Get detailed information about a specific Worker version. Requires script name and version ID from list_worker_versions results. Use this to audit version contents or prepare for a rollback deployment.

13

get_zone_analytics

Get analytics data for a specific Cloudflare zone. Returns aggregated data for the last 24 hours. Use this to monitor traffic patterns, identify spikes, or measure CDN performance.

14

list_d1_databases

List all D1 databases in your Cloudflare account. Returns database IDs, names, creation dates, and file sizes. Use this to identify available databases before querying.

15

list_deployments

List all deployments for a specific Cloudflare Worker. Returns deployment IDs, version IDs, strategies (immediate, gradual), creation dates, and traffic percentages. Use this to review current deployment state, monitor gradual rollouts, or identify which version is live.

16

list_kv_keys

List all keys in a specific KV namespace. Returns key names, expiration metadata, and sizes. Use this to audit stored data or find specific keys before reading values.

17

list_kv_namespaces

List all KV namespaces in your Cloudflare account. KV namespaces are key-value stores for Workers. Returns namespace IDs, titles, and creation dates. Use this to identify which namespaces exist before reading or writing data.

18

list_r2_buckets

List all R2 storage buckets in your Cloudflare account. Returns bucket names, creation dates, and storage locations. Use this to identify available storage buckets before managing objects.

19

list_secrets

List all secrets for a specific Cloudflare Worker. Returns secret names and types (secret_text, secret_key); secret values are never returned, for security. Use this to audit which secrets are configured before adding new ones or cleaning up unused secrets.

20

list_worker_routes

List all route patterns associated with a Cloudflare Worker. Returns route patterns, associated script names, and zone IDs. Use this to understand which URLs invoke your Worker before adding or removing routes.

21

list_worker_versions

List all versions of a specific Cloudflare Worker. Each version represents a deployed code snapshot with a unique ID, creation date, and metadata. Returns version IDs, timestamps, and author information. Use this to review deployment history, roll back to previous versions, or audit code changes.

22

list_workers

List all Cloudflare Worker scripts in your account. Returns script names, creation dates, modification dates, and deployment status. Use this as the first step to identify which Workers exist before managing versions, deployments, or secrets.

23

list_zones

List all DNS zones in your Cloudflare account. Returns zone IDs, domain names, status, plan, and name servers. Use this to identify zone IDs needed for Worker routes, DNS management, or cache operations.

24

purge_cache

Purge all cached content for a specific zone. Requires zone ID. Use this after deploying content changes or updating static assets.

25

query_d1

Execute a SQL query against a D1 database. Requires database ID and SQL query string. Supports SELECT, INSERT, UPDATE, and DELETE operations. Returns query results as JSON. Use this for data analysis, migrations, or ad-hoc queries.
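Because D1 is SQLite-compatible, you can preview the SQL you plan to send through query_d1 locally with Python's built-in sqlite3 module. The table and data below are made up for illustration:

```python
import sqlite3

# An in-memory database stands in for a D1 database during local testing.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, kind TEXT, count INTEGER)")
conn.executemany(
    "INSERT INTO events (kind, count) VALUES (?, ?)",
    [("error", 3), ("request", 120), ("error", 2)],
)

# The same SELECT could later be passed to query_d1 against the real database.
rows = conn.execute(
    "SELECT kind, SUM(count) FROM events GROUP BY kind ORDER BY kind"
).fetchall()
print(rows)  # [('error', 5), ('request', 120)]
conn.close()
```

Validating queries locally first keeps agents from burning tool calls on SQL syntax errors.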

Example Prompts for Cloudflare in CrewAI

Ready-to-use prompts you can give your CrewAI agent to start working with Cloudflare immediately.

01

"List all serverless Cloudflare Workers deployed natively bound to my account."

02

"Query the KV namespace assigned to 'production_keys' and extract the specific text mapping 'gateway_url'."

03

"Check error statistics on my main D1 SQLite database instance over the last 24 hours."

Troubleshooting Cloudflare MCP Server with CrewAI

Common issues when connecting Cloudflare to CrewAI through the Vinkius Edge, and how to resolve them.

01

MCP tools not discovered

Ensure the Edge URL is correct. CrewAI connects lazily when the crew starts — check console output.
02

Agent not using tools

Make the task description specific. Instead of "do something", say "Use the available tools to list contacts".
03

Timeout errors

CrewAI has a 10s connection timeout by default. Ensure your network can reach the Edge URL.
04

Rate limiting or 429 errors

The Vinkius Edge enforces per-token rate limits. Check your subscription tier and request quota in the dashboard. Upgrade if you need higher throughput.

Cloudflare + CrewAI FAQ

Common questions about integrating Cloudflare MCP Server with CrewAI.

01

How does CrewAI discover and connect to MCP tools?

CrewAI connects to MCP servers lazily — when the crew starts, each agent resolves its MCP URLs and fetches the tool catalog via the standard tools/list method. This means tools are always fresh and reflect the server's current capabilities. No tool schemas need to be hardcoded.
02

Can different agents in the same crew use different MCP servers?

Yes. Each agent has its own mcps list, so you can assign specific servers to specific roles. For example, a reconnaissance agent might use a domain intelligence server while an analysis agent uses a vulnerability database server.
03

What happens when an MCP tool call fails during a crew run?

CrewAI wraps tool failures as context for the agent. The LLM receives the error message and can decide to retry with different parameters, fall back to a different tool, or mark the task as partially complete. This resilience is critical for production workflows.
04

Can CrewAI agents call multiple MCP tools in parallel?

CrewAI agents execute tool calls sequentially within a single reasoning step. However, you can run multiple crews concurrently — for example via crew.kickoff_async() — each calling different MCP tools at the same time. This is ideal for workflows where separate data sources need to be queried simultaneously.
05

Can I run CrewAI crews on a schedule (cron)?

Yes. CrewAI crews are standard Python scripts, so you can invoke them via cron, Airflow, Celery, or any task scheduler. The crew.kickoff() method runs synchronously by default, making it straightforward to integrate into existing pipelines.
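The lazy discovery described in the first FAQ answer relies on MCP's standard tools/list method. A sketch of what that JSON-RPC 2.0 exchange looks like on the wire (the single tool entry in the response is illustrative, abbreviated from the 25 the server actually returns):

```python
import json

# Request the client sends to the MCP server after connecting.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Abbreviated response shape: each tool carries a name, a description,
# and a JSON Schema describing its input parameters.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "list_workers",
                "description": "List all Cloudflare Worker scripts in your account",
                "inputSchema": {"type": "object", "properties": {}},
            }
        ]
    },
}

tool_names = [t["name"] for t in response["result"]["tools"]]
print(json.dumps(request), tool_names)
```

Because the catalog is fetched at runtime, adding a tool on the server side requires no client-side code change at all.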

Connect Cloudflare to CrewAI

Get your token, paste the configuration, and start using 25 tools in under 2 minutes. No API key management needed.