
JokeAPI MCP Server for LlamaIndex: 3 tools, connect in under 2 minutes

Built by Vinkius · GDPR · 3 Tools · Framework

LlamaIndex specializes in data-aware AI agents that connect LLMs to structured and unstructured sources. Add JokeAPI as an MCP tool provider through Vinkius and your agents can query, analyze, and act on live data alongside your existing indexes.

Vinkius supports streamable HTTP and SSE.

python
import asyncio
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

async def main():
    # Your Vinkius token; get it at cloud.vinkius.com
    mcp_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    agent = FunctionAgent(
        tools=tools,
        llm=OpenAI(model="gpt-4o"),
        system_prompt=(
            "You are an assistant with access to JokeAPI. "
            "You have 3 tools available."
        ),
    )

    response = await agent.run(
        "What tools are available in JokeAPI?"
    )
    print(response)

asyncio.run(main())
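
The example above connects over streamable HTTP. Since Vinkius also supports SSE, the same client setup should work against an SSE endpoint; the /sse path below is an assumption, so check your Vinkius dashboard for the exact URL.

python
# Sketch, assuming a hypothetical SSE endpoint path on your Vinkius workspace;
# BasicMCPClient accepts a remote URL and the rest of the setup is unchanged.
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

sse_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/sse")
sse_tool_spec = McpToolSpec(client=sse_client)
# tools = await sse_tool_spec.to_tool_list_async()  # inside an async function, as above
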
JokeAPI on Vinkius:

  • Fully managed Vinkius servers
  • 60% token savings
  • Enterprise-grade security
  • IAM access control
  • EU AI Act compliance
  • DLP data protection
  • V8 isolate sandboxing
  • Ed25519 audit chain
  • <40ms kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time.

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure page for details.

About JokeAPI MCP Server

Equip your AI agent with a source of laughter via the JokeAPI MCP server. This integration provides access to a large database of jokes across various categories like Programming, Pun, and Misc. Your agent can retrieve random jokes, filter them by specific categories or languages, and ensure content safety using blacklist flags (nsfw, religious, political, etc.). Whether you're building a fun bot, looking for a quick icebreaker, or just want a laugh, your agent acts as a dedicated comedian through natural conversation.
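
If you want to see the raw tool output before wiring up an agent, you can call a tool directly through the MCP client. A minimal sketch follows; the argument names are assumptions about the JokeAPI tool schema, so confirm them by listing the tools first.

python
# Sketch: direct tool call via BasicMCPClient. The argument names are assumed,
# not confirmed against the JokeAPI tool schema.
import asyncio
from llama_index.tools.mcp import BasicMCPClient

async def fetch_joke():
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    result = await client.call_tool(
        "get_joke",
        {"category": "Programming", "blacklistFlags": "nsfw,religious,political"},
    )
    print(result)

asyncio.run(fetch_joke())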

LlamaIndex agents combine JokeAPI tool responses with indexed documents for comprehensive, grounded answers. Connect 3 tools through Vinkius and query live data alongside vector stores and SQL databases in a single turn, ideal for hybrid search, data enrichment, and analytical workflows.
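
Here is a rough sketch of that hybrid setup, pairing the JokeAPI MCP tools with a local vector index in a single agent; the placeholder document and tool description are illustrative only.

python
# Sketch: JokeAPI MCP tools plus a vector store index in one FunctionAgent.
# The Document contents and the query engine tool's name/description are placeholders.
import asyncio
from llama_index.core import VectorStoreIndex, Document
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

async def main():
    mcp_client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    joke_tools = await McpToolSpec(client=mcp_client).to_tool_list_async()

    # Any documents you already index; a single placeholder doc here.
    index = VectorStoreIndex.from_documents([Document(text="Internal style guide ...")])
    docs_tool = QueryEngineTool.from_defaults(
        query_engine=index.as_query_engine(),
        name="style_guide",
        description="Searches the internal style guide.",
    )

    agent = FunctionAgent(
        tools=joke_tools + [docs_tool],
        llm=OpenAI(model="gpt-4o"),
        system_prompt="Answer using the style guide and the JokeAPI tools as needed.",
    )
    print(await agent.run("Find a programming joke that fits our style guide."))

asyncio.run(main())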

What you can do

  • Random Jokes — Get a single random joke or a list of multiple jokes.
  • Category Filtering — Choose from Programming, Misc, Pun, Spooky, or Christmas categories.
  • Content Safety — Use flags to filter out jokes that are NSFW, religious, political, or offensive.
  • Language Support — Access jokes in multiple languages including English, German, and Portuguese.
  • Humor Auditing — Summarize multiple jokes to identify popular themes and comedic structures.

The JokeAPI MCP Server exposes 3 tools through Vinkius. Connect it to LlamaIndex in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.

How to Connect JokeAPI to LlamaIndex via MCP

Follow these steps to integrate the JokeAPI MCP Server with LlamaIndex.

01

Install dependencies

Run pip install llama-index-tools-mcp llama-index-llms-openai

02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token

03

Run the agent

Save to agent.py and run: python agent.py

04

Explore tools

The agent discovers 3 tools from JokeAPI
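
To inspect what was discovered, print the metadata of the tools returned by to_tool_list_async(); a small addition to the script above:

python
# After: tools = await mcp_tool_spec.to_tool_list_async()
# Print the name and description of each JokeAPI tool the agent can use.
for tool in tools:
    print(tool.metadata.name, "-", tool.metadata.description)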

Why Use LlamaIndex with the JokeAPI MCP Server

LlamaIndex provides unique advantages when paired with JokeAPI through the Model Context Protocol.

01

Data-first architecture: LlamaIndex agents combine JokeAPI tool responses with indexed documents for comprehensive, grounded answers

02

Query pipeline framework lets you chain JokeAPI tool calls with transformations, filters, and re-rankers in a typed pipeline

03

Multi-source reasoning: agents can query JokeAPI, a vector store, and a SQL database in a single turn and synthesize results

04

Observability integrations show exactly what JokeAPI tools were called, what data was returned, and how it influenced the final answer
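
One way to get that visibility is LlamaIndex's global handler. A minimal sketch using the built-in "simple" tracer, which prints each LLM and tool call as the agent runs:

python
# Sketch: enable LlamaIndex's built-in "simple" trace handler so each LLM call
# and tool call (including the JokeAPI MCP tools) is printed as the agent runs.
from llama_index.core import set_global_handler

set_global_handler("simple")
# ...then build and run the FunctionAgent exactly as in the example above.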

JokeAPI + LlamaIndex Use Cases

Practical scenarios where LlamaIndex combined with the JokeAPI MCP Server delivers measurable value.

01

Hybrid search: combine JokeAPI real-time data with embedded document indexes for answers that are both current and comprehensive

02

Data enrichment: query JokeAPI to augment indexed data with live information before generating user-facing responses

03

Knowledge base agents: build agents that maintain and update knowledge bases by periodically querying JokeAPI for fresh data

04

Analytical workflows: chain JokeAPI queries with LlamaIndex's data connectors to build multi-source analytical reports

JokeAPI MCP Tools for LlamaIndex (3)

These 3 tools become available when you connect JokeAPI to LlamaIndex via MCP:

01

get_joke

Get a random joke

02

list_joke_categories

List joke categories

03

list_jokes

Get multiple jokes

Example Prompts for JokeAPI in LlamaIndex

Ready-to-use prompts you can give your LlamaIndex agent to start working with JokeAPI immediately.

01

"Tell me a programming joke."

02

"Give me 3 safe jokes from the 'Misc' category."

03

"Do you have any 'Spooky' jokes for Halloween?"

Troubleshooting JokeAPI MCP Server with LlamaIndex

Common issues when connecting JokeAPI to LlamaIndex through Vinkius, and how to resolve them.

01

BasicMCPClient not found

Install: pip install llama-index-tools-mcp

JokeAPI + LlamaIndex FAQ

Common questions about integrating JokeAPI MCP Server with LlamaIndex.

01

How does LlamaIndex connect to MCP servers?

Use the MCP client adapter to create a connection. LlamaIndex discovers all tools on the server and wraps them as function tools compatible with any LlamaIndex agent.
02

Can I combine MCP tools with vector stores?

Yes. LlamaIndex agents can query JokeAPI tools and vector store indexes in the same turn, combining real-time and embedded data for grounded responses.
03

Does LlamaIndex support async MCP calls?

Yes. LlamaIndex's async agent framework supports concurrent MCP tool calls for high-throughput data processing pipelines.
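
A minimal sketch of concurrent calls using asyncio.gather; as before, the tool argument names are assumptions about the JokeAPI schema.

python
# Sketch: fire several MCP tool calls concurrently with asyncio.gather.
# Tool argument names are assumptions about the JokeAPI tool schema.
import asyncio
from llama_index.tools.mcp import BasicMCPClient

async def main():
    client = BasicMCPClient("https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp")
    results = await asyncio.gather(
        client.call_tool("get_joke", {"category": "Programming"}),
        client.call_tool("get_joke", {"category": "Pun"}),
        client.call_tool("list_joke_categories", {}),
    )
    for result in results:
        print(result)

asyncio.run(main())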

Connect JokeAPI to LlamaIndex

Get your token, paste the configuration, and start using 3 tools in under 2 minutes. No API key management needed.