Bring Attendance Tracking to Vercel AI SDK
Learn how to connect Lamha to the Vercel AI SDK and start using 8 AI agent tools in minutes. Fully managed, enterprise-secure, and ready to use without writing a single line of code.
What is the Lamha MCP Server?
Connect your Lamha account to any AI agent and manage HR operations through natural conversation.
What you can do
- Employee Management — List employees, inspect profiles, and track status
- Attendance Tracking — Monitor check-in/out times and attendance records
- Department Browsing — Navigate organizational structure and departments
- Leave Management — Track leave requests, balances, and approvals
- Payroll Access — View payroll data and compensation details
How it works
1. Subscribe to this server
2. Enter your Lamha API Token
3. Start managing HR from Claude, Cursor, or any MCP-compatible client
Who is this for?
- HR Teams — manage employee records and attendance
- Managers — track leave requests and team attendance
- Payroll — access compensation data and reports
Built-in capabilities (8)
Cancel an existing order
Check delivery coverage for a city
Create a new logistics order
Get details for a specific order
List delivery carriers
List product inventory
List Lamha orders
List warehouses
Why Vercel AI SDK?
The Vercel AI SDK gives every Lamha tool full TypeScript type inference, IDE autocomplete, and compile-time error checking. Connect 8 tools through Vinkius and stream results progressively to React, Svelte, or Vue components. It runs on Vercel Edge Functions, Cloudflare Workers, and any Node.js runtime.
- TypeScript-first: every MCP tool gets full type inference, IDE autocomplete, and compile-time error checking out of the box
- Framework-agnostic: the core works with Next.js, Nuxt, SvelteKit, or any Node.js runtime, so the same Lamha integration runs everywhere
- Built-in streaming UI primitives let you display Lamha tool results progressively in React, Svelte, or Vue components
- Edge-compatible: the AI SDK runs on Vercel Edge Functions, Cloudflare Workers, and other edge runtimes for minimal latency
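Putting those pieces together, a rough sketch of a chat route that connects the hosted Lamha server and streams results might look like this. The @ai-sdk/mcp package and createMCPClient export follow the FAQ below; the server URL, model, and response helper are placeholders, and method names vary between AI SDK versions:

```ts
// app/api/chat/route.ts — illustrative sketch, not a canonical setup.
import { createMCPClient } from '@ai-sdk/mcp'; // export name per this page's FAQ
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Connect to the hosted Lamha MCP server; tool schemas are discovered
  // at runtime and exposed as typed tool definitions.
  const lamha = await createMCPClient({
    transport: { type: 'sse', url: process.env.LAMHA_MCP_URL! }, // placeholder env var
  });
  const tools = await lamha.tools();

  const result = streamText({
    model: openai('gpt-4o'),
    tools,
    messages,
    onFinish: () => lamha.close(), // release the MCP connection when the stream ends
  });

  // Stream text and Lamha tool results progressively to the UI
  // (response helper name differs between AI SDK versions)
  return result.toDataStreamResponse();
}
```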
Lamha in Vercel AI SDK
Lamha and 3,400+ other MCP servers. One platform. One governance layer.
Teams that connect Lamha to Vercel AI SDK through Vinkius don't need to source, host, or maintain individual MCP servers. Every tool call runs inside a hardened runtime with credential isolation, DLP, and a signed audit chain.
| Capability | Raw MCP | Vinkius |
|---|---|---|
| Server catalog | Find and host yourself | 3,400+ managed |
| Infrastructure | Self-hosted | Sandboxed V8 isolates |
| Credential handling | Plaintext in config | Vault + runtime injection |
| Data loss prevention | None | Configurable DLP policies |
| Kill switch | None | Global instant shutdown |
| Financial circuit breakers | None | Per-server limits + alerts |
| Audit trail | None | Ed25519 signed logs |
| SIEM log streaming | None | Splunk, Datadog, Webhook |
| Honeytokens | None | Canary alerts on leak |
| Custom domains | Not applicable | DNS challenge verified |
| GDPR compliance | Manual effort | Automated purge + export |
Why teams choose Vinkius for Lamha in Vercel AI SDK
The Lamha MCP Server runs on Vinkius-managed infrastructure inside AWS — a purpose-built runtime with per-request V8 isolates, Ed25519 signed audit chains, and sub-40ms cold starts. All 8 tools execute in hardened sandboxes optimized for native MCP execution.
Your AI agents in Vercel AI SDK only access the data you authorize, with DLP that blocks sensitive information from ever reaching the model, a kill switch for instant shutdown, and up to 60% token savings. Enterprise-grade infrastructure, zero maintenance.

How Vinkius secures Lamha for Vercel AI SDK
Every tool call from Vercel AI SDK to the Lamha MCP Server is protected by DLP redaction, cryptographic audit chains, V8 sandbox isolation, kill switch, and financial circuit breakers.
Frequently asked questions
Can I track employee attendance and leave?
Yes. Monitor check-in/out records, view attendance summaries, and track leave balances, requests, and approvals for any employee.
How does Lamha authentication work?
Lamha uses a Token header (not Bearer) for authentication against app.lamha.sa/api/v2. This is a custom token format.
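As a hedged illustration of that header scheme (the endpoint path and response handling below are assumptions; only the Token header and base URL come from this answer):

```ts
// Illustrative only: /employees is an assumed path, not a documented endpoint.
const response = await fetch('https://app.lamha.sa/api/v2/employees', {
  headers: {
    Token: process.env.LAMHA_API_TOKEN ?? '', // custom Token header, not "Authorization: Bearer"
    'Content-Type': 'application/json',
  },
});

if (!response.ok) {
  throw new Error(`Lamha API error: ${response.status}`);
}

console.log(await response.json());
```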
Can I browse the organizational structure?
Yes. Navigate departments, teams, and reporting hierarchies within the organization.
How does the Vercel AI SDK connect to MCP servers?
Import createMCPClient from @ai-sdk/mcp and pass the server URL. The SDK discovers all tools and provides typed TypeScript interfaces for each one.
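A minimal sketch of that flow (the package and export name follow this answer; the server URL is a placeholder):

```ts
import { createMCPClient } from '@ai-sdk/mcp';

const client = await createMCPClient({
  transport: { type: 'sse', url: 'https://example.com/mcp/lamha' }, // placeholder URL
});

// Each discovered tool arrives as a typed definition the SDK can call
const tools = await client.tools();
console.log(Object.keys(tools));

await client.close();
```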
Can I use MCP tools in Edge Functions?
Yes. The AI SDK is fully edge-compatible. MCP connections work on Vercel Edge Functions, Cloudflare Workers, and similar runtimes.
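In Next.js, for example, a route handler that uses the MCP client can opt into the edge runtime with a single export (sketch; the file path is illustrative):

```ts
// app/api/chat/route.ts
export const runtime = 'edge'; // run this route on the Edge runtime
```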
Does it support streaming tool results?
Yes. The SDK provides streaming primitives like useChat and streamText that handle tool calls and display results progressively in the UI.
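For instance, a React component built on the useChat hook can render the streamed conversation as it arrives. This is a sketch against the hook's classic API shape, which varies between AI SDK versions, and it assumes a /api/chat route like the one sketched earlier:

```tsx
'use client';
import { useChat } from '@ai-sdk/react';

export function AttendanceChat() {
  // useChat streams assistant text and tool results progressively
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Ask about attendance…" />
      </form>
    </div>
  );
}
```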
Why do I get "createMCPClient is not a function"?
The package is likely not installed. Run npm install @ai-sdk/mcp, then import createMCPClient from @ai-sdk/mcp.
