
Cortex XSIAM MCP Server for Vercel AI SDK: 9 tools, connected in under 2 minutes

Built by Vinkius · GDPR · 9 Tools · SDK

The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect Cortex XSIAM through Vinkius and every tool is available as a typed function, ready for React Server Components, API routes, or any Node.js backend.

Vinkius supports streamable HTTP and SSE.

```typescript
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token: get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });

  try {
    // Discover the Cortex XSIAM tools exposed by the server
    const tools = await mcpClient.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using Cortex XSIAM, list all available capabilities.",
    });
    console.log(text);
  } finally {
    // Always release the MCP connection
    await mcpClient.close();
  }
}

main();
```
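If your environment needs SSE instead of streamable HTTP, only the transport block changes. As a sketch, a small helper can build either variant; `vinkiusTransport` and the URL shape are assumptions based on the example above, not an official Vinkius API:

```typescript
// Hypothetical helper: build the transport config for either protocol.
// The edge.vinkius.com/<token>/mcp URL shape mirrors the example above.
type Transport = { type: "http" | "sse"; url: string };

function vinkiusTransport(token: string, kind: "http" | "sse" = "http"): Transport {
  if (!token) throw new Error("A Vinkius token is required");
  return { type: kind, url: `https://edge.vinkius.com/${token}/mcp` };
}

// Usage: await createMCPClient({ transport: vinkiusTransport("MY_TOKEN", "sse") })
```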
Cortex XSIAM

- Fully managed Vinkius servers
- 60% token savings
- Enterprise-grade security
- IAM access control
- EU AI Act compliant
- DLP data protection
- V8 isolate sandboxing
- Ed25519 audit chain
- <40ms kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure

About Cortex XSIAM MCP Server

Connect Cortex XSIAM to any AI agent via MCP.

How to Connect Cortex XSIAM to Vercel AI SDK via MCP

Follow these steps to integrate the Cortex XSIAM MCP Server with Vercel AI SDK.

01

Install dependencies

Run npm install @ai-sdk/mcp ai @ai-sdk/openai

02

Replace the token

Replace [YOUR_TOKEN_HERE] with your Vinkius token

03

Run the script

Save to agent.ts and run with npx tsx agent.ts

04

Explore tools

The SDK discovers 9 tools from Cortex XSIAM and passes them to the LLM
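Because all 9 tools are handed to the LLM by default, you may want to withhold destructive actions (such as isolate_endpoint or execute_playbook) in read-only contexts. A minimal sketch of an allow-list filter over the discovered tools record; `pickTools` is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical allow-list filter: keep only the named tools from the
// record returned by mcpClient.tools() before passing it to generateText.
function pickTools<T>(tools: Record<string, T>, allow: string[]): Record<string, T> {
  return Object.fromEntries(
    Object.entries(tools).filter(([name]) => allow.includes(name))
  );
}

// Usage: tools: pickTools(await mcpClient.tools(), ["get_alerts", "get_incidents"])
```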

Why Use Vercel AI SDK with the Cortex XSIAM MCP Server

Vercel AI SDK provides unique advantages when paired with Cortex XSIAM through the Model Context Protocol.

01

TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box

02

Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime; the same Cortex XSIAM integration works everywhere

03

Built-in streaming UI primitives let you display Cortex XSIAM tool results progressively in React, Svelte, or Vue components

04

Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
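The streaming point above can be illustrated without any SDK: edge runtimes and Node 18+ expose the Web-standard ReadableStream that streaming helpers build on. A minimal sketch of reading chunks progressively, illustrative only and not AI SDK code:

```typescript
// Read a stream chunk by chunk, the way a UI would render partial tool output.
async function collectChunks(chunks: string[]): Promise<string[]> {
  const stream = new ReadableStream<string>({
    start(controller) {
      for (const c of chunks) controller.enqueue(c); // each chunk is delivered as it arrives
      controller.close();
    },
  });
  const seen: string[] = [];
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    seen.push(value); // a real UI would append this to the page immediately
  }
  return seen;
}
```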

Cortex XSIAM + Vercel AI SDK Use Cases

Practical scenarios where Vercel AI SDK combined with the Cortex XSIAM MCP Server delivers measurable value.

01

AI-powered web apps: build dashboards that query Cortex XSIAM in real-time and stream results to the UI with zero loading states

02

API backends: create serverless endpoints that orchestrate Cortex XSIAM tools and return structured JSON responses to any frontend

03

Chatbots with tool use: embed Cortex XSIAM capabilities into conversational interfaces with streaming responses and tool call visibility

04

Internal tools: build admin panels where team members interact with Cortex XSIAM through natural language queries

Cortex XSIAM MCP Tools for Vercel AI SDK (9)

These 9 tools become available when you connect Cortex XSIAM to Vercel AI SDK via MCP:

01

execute_playbook

Execute an automated incident response playbook in Cortex XSIAM (e.g., enrich IOCs, block an IP, reset a password). Requires the playbook name and optional input arguments. Use this to speed up response times and ensure consistent handling of incidents.

02

get_alerts

List security alerts detected by Cortex XSIAM. Use this to review detection rules firing or analyze threat patterns.

03

get_endpoints

List managed endpoints (hosts/devices) in Cortex XSIAM. Use this to audit endpoint coverage, identify disconnected hosts, or target remediation actions.

04

get_incident_details

Get detailed information about a specific security incident. Requires the incident ID. Use this for deep investigation or context before taking action.

05

get_incidents

List security incidents in Cortex XSIAM. Supports sorting and limiting results. Use this to monitor the SOC queue, identify high-severity incidents, or track analyst workload.

06

get_indicators

List indicators of compromise (IOCs) tracked in Cortex XSIAM. Use this to review threat intelligence or check if specific artifacts are known malicious.

07

isolate_endpoint

Isolate a compromised endpoint from the network. Requires the endpoint ID. Use this immediately upon confirming a severe compromise to prevent lateral movement.

08

run_xql_query

Execute an XQL (Cortex Query Language) query for advanced threat hunting. XQL allows searching logs, endpoints, network data, and more. Requires a valid XQL query string. Returns the results of the query. Use this for custom threat hunting, compliance reporting, or data analysis.

09

scan_endpoint

Trigger a malware scan on a specific endpoint. Requires the endpoint ID. Supports "quick" or "deep" scan types. Use this to verify if a host is infected or after cleaning a threat.
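Several of the tools above have required parameters, so a small pre-flight validator can fail fast before the model call. A sketch for scan_endpoint; the snake_case argument names (endpoint_id, scan_type) are illustrative assumptions, not the server's documented schema:

```typescript
// Hypothetical pre-flight check mirroring the scan_endpoint contract above:
// an endpoint ID is required, and the scan type must be "quick" or "deep".
function scanEndpointArgs(endpointId: string, scanType: string = "quick") {
  if (!endpointId) throw new Error("scan_endpoint requires an endpoint ID");
  if (scanType !== "quick" && scanType !== "deep") {
    throw new Error(`Unsupported scan type: ${scanType}`);
  }
  return { endpoint_id: endpointId, scan_type: scanType };
}
```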

Troubleshooting Cortex XSIAM MCP Server with Vercel AI SDK

Common issues when connecting Cortex XSIAM to Vercel AI SDK through Vinkius, and how to resolve them.

01

createMCPClient is not a function

Install: npm install @ai-sdk/mcp

Cortex XSIAM + Vercel AI SDK FAQ

Common questions about integrating Cortex XSIAM MCP Server with Vercel AI SDK.

01

How does the Vercel AI SDK connect to MCP servers?

Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.

02

Can I use MCP tools in Edge Functions?

Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.
03

Does it support streaming tool results?

Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.

Connect Cortex XSIAM to Vercel AI SDK

Get your token, paste the configuration, and start using 9 tools in under 2 minutes. No API key management needed.