Lamha MCP Server for Vercel AI SDK
Give Vercel AI SDK instant access to 8 tools: Cancel Order, Check City Coverage, Create Order, and more.
The Vercel AI SDK is the TypeScript toolkit for building AI-powered applications. Connect Lamha through Vinkius and every tool is available as a typed function, ready for React Server Components, API routes, or any Node.js backend.
The Lamha app connector for Vercel AI SDK is a standout in the Productivity category — giving your AI agent 8 tools to work with, ready to go from day one.
Vinkius delivers Streamable HTTP and SSE to any MCP client
```ts
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      // Your Vinkius token: get it at cloud.vinkius.com
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });

  try {
    // Discover every Lamha tool exposed by the server, fully typed.
    const tools = await mcpClient.tools();

    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "Using Lamha, list all available capabilities.",
    });

    console.log(text);
  } finally {
    // Always release the MCP connection.
    await mcpClient.close();
  }
}

main();
```
* Every MCP server runs on Vinkius-managed infrastructure inside AWS: a purpose-built runtime with per-request V8 isolates, Ed25519-signed audit chains, and sub-40ms cold starts optimized for native MCP execution. See our infrastructure.
About Lamha MCP Server
Connect your Lamha account to any AI agent and manage HR operations through natural conversation.
The Vercel AI SDK gives every Lamha tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 8 tools through Vinkius and stream results progressively to React, Svelte, or Vue components; the same integration works on Edge Functions, Cloudflare Workers, and any Node.js runtime.
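For instance, swapping generateText for streamText yields progressive output with the same typed tools. A minimal sketch, assuming the same Vinkius endpoint as in the example above; the maxSteps option is AI SDK 4-style (newer releases expose an equivalent stop condition):

```ts
import { createMCPClient } from "@ai-sdk/mcp";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

const mcpClient = await createMCPClient({
  transport: {
    type: "http",
    url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
  },
});

const result = streamText({
  model: openai("gpt-4o"),
  tools: await mcpClient.tools(),
  // AI SDK 4-style option: allow tool round-trips before the final answer.
  maxSteps: 3,
  prompt: "Summarize today's attendance records.",
});

// textStream is an async iterable of text deltas, so the UI (or terminal)
// can render partial output long before the full answer is ready.
for await (const delta of result.textStream) {
  process.stdout.write(delta);
}

await mcpClient.close();
```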
What you can do
- Employee Management — List employees, inspect profiles, and track status
- Attendance Tracking — Monitor check-in/out times and attendance records
- Department Browsing — Navigate organizational structure and departments
- Leave Management — Track leave requests, balances, and approvals
- Payroll Access — View payroll data and compensation details
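A single conversational request can span several of these capabilities in one agent loop. A sketch, assuming the same client setup as above and an AI SDK 4-style maxSteps option for multi-step tool use:

```ts
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const mcpClient = await createMCPClient({
  transport: {
    type: "http",
    url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
  },
});

const { text } = await generateText({
  model: openai("gpt-4o"),
  tools: await mcpClient.tools(),
  // Allow the model several tool round-trips (e.g. list employees, then
  // fetch leave balances) before composing its final answer.
  maxSteps: 5,
  prompt: "Show pending leave requests and employee leave balances.",
});

console.log(text);
await mcpClient.close();
```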
The Lamha MCP Server exposes 8 tools through Vinkius. Connect it to Vercel AI SDK in under two minutes — no API keys to rotate, no infrastructure to provision, no vendor lock-in. Your configuration, your data, your control.
All 8 Lamha tools available for Vercel AI SDK
When Vercel AI SDK connects to Lamha through Vinkius, your AI agent gets direct access to every tool listed below — spanning attendance tracking, leave management, payroll management, and more. Every call is secured with network, filesystem, subprocess, and code evaluation entitlements inside a sandboxed runtime. Beyond a simple connection, you get a full AI Gateway with real-time visibility into agent activity, enterprise governance, and optimized token usage. A sketch after the list shows how to inspect the calls your agent makes.
- Cancel an existing order
- Check delivery coverage for a city
- Create a new logistics order
- Get details for a specific order
- List delivery carriers
- List product inventory
- List Lamha orders
- List warehouses
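To audit which of these tools the model actually invoked, inspect the generateText result. A sketch, assuming the AI SDK's standard result shape (toolCalls and toolResults on the returned object) and a hypothetical coverage query:

```ts
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const mcpClient = await createMCPClient({
  transport: {
    type: "http",
    url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
  },
});

const result = await generateText({
  model: openai("gpt-4o"),
  tools: await mcpClient.tools(),
  prompt: "Check delivery coverage for Riyadh, then list my open Lamha orders.",
});

// toolCalls / toolResults expose each MCP invocation the model made,
// useful for logging and for the gateway-level visibility described above.
console.log(result.toolCalls);
console.log(result.toolResults);

await mcpClient.close();
```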
Connect Lamha to Vercel AI SDK via MCP
Follow these steps to wire Lamha into Vercel AI SDK. The entire setup takes under two minutes, and your credentials stay safe behind Vinkius.
1. Install dependencies: npm install @ai-sdk/mcp ai @ai-sdk/openai
2. Replace the token: swap [YOUR_TOKEN_HERE] for your Vinkius token
3. Run the script: save the example as agent.ts and run it with npx tsx agent.ts
4. Explore tools: ask the agent what it can do (the snippet below shows one way to list what discovery returned)
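For the last step, a quick way to explore what discovery returned is to print the tool names. A snippet, assuming the mcpClient from the script above:

```ts
// Each key is a callable, fully typed Lamha tool: cancel order,
// check city coverage, create order, and so on.
const tools = await mcpClient.tools();
console.log(Object.keys(tools));
```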
Why Use Vercel AI SDK with the Lamha MCP Server
Vercel AI SDK provides unique advantages when paired with Lamha through the Model Context Protocol.
- TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
- Framework-agnostic: the core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime, so the same Lamha integration runs everywhere
- Built-in streaming UI primitives let you display Lamha tool results progressively in React, Svelte, or Vue components
- Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency (sketched below)
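To illustrate the streaming and edge points, here is a minimal route-handler sketch. The file path and request shape are placeholders (Next.js-style), and it assumes the AI SDK's toTextStreamResponse helper on the streamText result:

```ts
// app/api/lamha/route.ts (hypothetical path)
import { createMCPClient } from "@ai-sdk/mcp";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export const runtime = "edge"; // run on Vercel Edge Functions

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });

  const result = streamText({
    model: openai("gpt-4o"),
    tools: await mcpClient.tools(),
    // AI SDK 4-style option: allow tool round-trips before the final answer.
    maxSteps: 3,
    prompt,
    // Close the MCP connection once the stream finishes.
    onFinish: async () => {
      await mcpClient.close();
    },
  });

  // Streams text deltas to the client as they are generated.
  return result.toTextStreamResponse();
}
```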
Lamha + Vercel AI SDK Use Cases
Practical scenarios where Vercel AI SDK combined with the Lamha MCP Server delivers measurable value.
- AI-powered web apps: build dashboards that query Lamha in real time and stream results to the UI with zero loading states
- API backends: create serverless endpoints that orchestrate Lamha tools and return structured JSON responses to any frontend (see the sketch after this list)
- Chatbots with tool use: embed Lamha capabilities into conversational interfaces with streaming responses and tool call visibility
- Internal tools: build admin panels where team members interact with Lamha through natural language queries
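For the API-backend scenario, a sketch of a serverless endpoint that returns structured JSON; the route path and prompt are hypothetical:

```ts
// app/api/hr-report/route.ts (hypothetical path)
import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function GET() {
  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      url: "https://edge.vinkius.com/[YOUR_TOKEN_HERE]/mcp",
    },
  });

  try {
    const { text, toolResults } = await generateText({
      model: openai("gpt-4o"),
      tools: await mcpClient.tools(),
      prompt: "List warehouses and current product inventory.",
    });

    // Any frontend can consume this: the model's summary plus the raw
    // Lamha tool output for rendering tables or charts.
    return Response.json({ summary: text, data: toolResults });
  } finally {
    await mcpClient.close();
  }
}
```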
Example Prompts for Lamha in Vercel AI SDK
Ready-to-use prompts you can give your Vercel AI SDK agent to start working with Lamha immediately.
"Show all departments and today's attendance."
"Show pending leave requests and employee leave balances."
"Show payroll summary and employee details for the Engineering team."
Troubleshooting Lamha MCP Server with Vercel AI SDK
Common issues when connecting Lamha to Vercel AI SDK through Vinkius, and how to resolve them.
createMCPClient is not a function
Make sure the MCP client package is installed: npm install @ai-sdk/mcp
Lamha + Vercel AI SDK FAQ
Common questions about integrating Lamha MCP Server with Vercel AI SDK.
How does the Vercel AI SDK connect to MCP servers?
Call createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
Can I use MCP tools in Edge Functions?
Yes. The AI SDK core is edge-compatible, so the same Lamha integration runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes.
Does it support streaming tool results?
Yes. The SDK ships streaming primitives such as useChat and streamText that handle tool calls and display results progressively in the UI.
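As a sketch of that pattern, a minimal chat component, assuming an AI SDK 4-style useChat hook from ai/react and a hypothetical /api/chat route that runs streamText with the Lamha MCP tools:

```tsx
"use client";

import { useChat } from "ai/react";

export default function LamhaChat() {
  // useChat streams assistant tokens into `messages` as they
  // arrive from the /api/chat route.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat", // hypothetical route wiring up the Vinkius MCP client
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <input
        value={input}
        onChange={handleInputChange}
        placeholder="Ask Lamha about leave balances..."
      />
    </form>
  );
}
```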