3,400+ MCP servers ready to use
Vinkius

Bring LLM Observability
to Vercel AI SDK

Learn how to connect Keywords AI to the Vercel AI SDK and start using 11 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.


What is the Keywords AI MCP Server?

Connect your Keywords AI account to any AI agent and monitor LLM performance.

What you can do

  • Request Logs — List and filter all LLM API calls by model
  • Cost Tracking — Monitor credit balance and usage statistics
  • Analytics — View cost trends, latency metrics, and error rates
  • Model Catalog — Browse available LLM models
  • Team Management — List users and view activity
  • Alerts — Review monitoring thresholds

Built-in capabilities (11)

  • check_keywordsai_status — Verify API connectivity
  • get_analytics — Get analytics dashboard
  • get_credits — Get credit balance
  • get_request — Get request details
  • get_usage_stats — Get usage statistics
  • get_user — Get user details
  • list_alerts — List monitoring alerts
  • list_models — List available models
  • list_requests — List API request logs
  • list_requests_by_model — List requests by model
  • list_users — List team users

Why Vercel AI SDK?

The Vercel AI SDK gives every Keywords AI tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect all 11 tools through Vinkius and stream results progressively to React, Svelte, or Vue components. The SDK runs on Edge Functions, Cloudflare Workers, and any Node.js runtime; a connection sketch follows the list below.

  • TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box

  • Framework-agnostic core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime, so the same Keywords AI integration works everywhere

  • Built-in streaming UI primitives let you display Keywords AI tool results progressively in React, Svelte, or Vue components

  • Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
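
A minimal sketch of that flow, assuming an AI SDK release that exposes the MCP client as experimental_createMCPClient from the ai package (the FAQ below imports createMCPClient from @ai-sdk/mcp; use whichever your installed version ships). The Vinkius endpoint URL and the model are placeholders.

```ts
import { experimental_createMCPClient, generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Placeholder URL for the Vinkius-hosted Keywords AI MCP server.
const KEYWORDS_AI_MCP_URL = 'https://your-vinkius-endpoint.example/mcp/keywords-ai';

async function main() {
  // Connect over SSE; the client discovers all 11 Keywords AI tools at runtime.
  const mcp = await experimental_createMCPClient({
    transport: { type: 'sse', url: KEYWORDS_AI_MCP_URL },
  });

  try {
    const tools = await mcp.tools();

    // The model decides which tools to call (e.g. get_usage_stats, get_credits).
    const { text } = await generateText({
      model: openai('gpt-4o'),
      tools,
      maxSteps: 5, // allow multi-step tool use (AI SDK 4.x-style option)
      prompt: 'Summarize my LLM spend for the last 7 days and flag any error-rate spikes.',
    });

    console.log(text);
  } finally {
    await mcp.close();
  }
}

main().catch(console.error);
```

The same discovered tool set covers the cost-tracking and request-log questions answered in the FAQ further down.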

See it in action

Keywords AI in Vercel AI SDK

Why Vinkius

Keywords AI and 3,400+ other MCP servers. One platform. One governance layer.

Teams that connect Keywords AI to Vercel AI SDK through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.

3,400+ MCP servers ready · <40ms cold start · 60% token savings
| Feature | Raw MCP | Vinkius |
| --- | --- | --- |
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Enterprise Security

Why teams choose Vinkius for Keywords AI in Vercel AI SDK

The Keywords AI MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 11 tools execute in hardened sandboxes optimized for native MCP execution.

Your AI agents in Vercel AI SDK only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

Keywords AI

  • Fully Managed — Vinkius servers
  • 60% — Token savings
  • High Security — Enterprise-grade
  • IAM — Access control
  • EU AI Act — Compliant
  • DLP — Data protection
  • V8 Isolate — Sandboxed
  • Ed25519 — Audit chain
  • <40ms — Kill switch
Stream every event to Splunk, Datadog, or your own webhook in real time

* Every MCP server runs on Vinkius-managed infrastructure inside AWS - a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure

The Vinkius Advantage

How Vinkius secures Keywords AI for Vercel AI SDK

Every tool call from Vercel AI SDK to the Keywords AI MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.

<40ms cold start · Ed25519 signed audit chain · 60% token savings
FAQ

Frequently asked questions

01

Can my AI track LLM costs?

Yes. get_credits shows your balance, and get_usage_stats breaks down costs by model and time period.

02

Can I filter request logs by model?

Yes. list_requests_by_model returns only requests made to a specific LLM.

03

What analytics are available?

get_analytics provides cost trends, latency percentiles, error rates, and token usage over time.

04

How does the Vercel AI SDK connect to MCP servers?

Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
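
A minimal connection sketch, with the same caveat as the example above that recent AI SDK releases export this client as experimental_createMCPClient from the ai package; the endpoint URL is a placeholder and the snippet assumes an ESM module (top-level await).

```ts
import { experimental_createMCPClient } from 'ai';

// Placeholder Vinkius endpoint for the Keywords AI MCP server.
const mcp = await experimental_createMCPClient({
  transport: { type: 'sse', url: 'https://your-vinkius-endpoint.example/mcp/keywords-ai' },
});

// Discovers all 11 Keywords AI tools as typed tool definitions.
const tools = await mcp.tools();
console.log(Object.keys(tools)); // ['check_keywordsai_status', 'get_analytics', ...]
```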

05

Can I use MCP tools in Edge Functions?

Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.
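
A sketch of a Next.js App Router route handler pinned to the Edge runtime, under the same package assumptions as the earlier examples; the route path, endpoint URL, and model are placeholders.

```ts
// app/api/chat/route.ts
import { experimental_createMCPClient, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export const runtime = 'edge'; // run this handler on the Edge runtime

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Placeholder Vinkius endpoint for the Keywords AI MCP server.
  const mcp = await experimental_createMCPClient({
    transport: { type: 'sse', url: 'https://your-vinkius-endpoint.example/mcp/keywords-ai' },
  });

  const result = streamText({
    model: openai('gpt-4o'),
    tools: await mcp.tools(),
    messages,
    onFinish: () => mcp.close(), // release the MCP connection when the stream ends
  });

  // AI SDK 4.x-style streaming response for useChat on the client.
  return result.toDataStreamResponse();
}
```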

06

Does it support streaming tool results?

Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.
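
On the client, a sketch of a React component using the AI SDK 4.x-style useChat hook (exported from @ai-sdk/react; older releases ship it from ai/react), pointed at a route handler like the one above:

```tsx
'use client';

import { useChat } from '@ai-sdk/react';

export default function ObservabilityChat() {
  // Streams assistant text and tool results from /api/chat as they arrive.
  const { messages, input, handleInputChange, handleSubmit } = useChat({ api: '/api/chat' });

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Ask about LLM costs…" />
      </form>
    </div>
  );
}
```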

07

createMCPClient is not a function

Install: npm install @ai-sdk/mcp