#llm-inference
llm-inference connectors — deploy one and your agent is up and running instantly.
3 apps
Groq
8 tools
Empower LLM applications via Groq — perform ultra-fast LPU-accelerated chat completions, handle audio transcription and translation, and use JSON mode directly from any AI agent.
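As an illustration of the JSON-mode capability above, a Groq chat completion can be sketched as a plain OpenAI-compatible request body. This is a minimal sketch: the endpoint URL and model name are assumptions, and no API key or network call is needed just to build the payload.

```python
import json

# Assumed OpenAI-compatible Groq endpoint (illustrative only; no request is sent).
GROQ_ENDPOINT = "https://api.groq.com/openai/v1/chat/completions"

def build_json_mode_payload(prompt: str, model: str = "llama-3.1-8b-instant") -> dict:
    """Build a chat-completions request body with JSON mode enabled."""
    return {
        "model": model,  # placeholder model name
        "messages": [
            # JSON mode expects the prompt context to mention JSON explicitly.
            {"role": "system", "content": "Reply only with a JSON object."},
            {"role": "user", "content": prompt},
        ],
        # This flag switches the response format to a guaranteed JSON object.
        "response_format": {"type": "json_object"},
    }

payload = build_json_mode_payload("List three facts about LPUs as JSON.")
body = json.dumps(payload)  # ready to POST to GROQ_ENDPOINT with an Authorization header
```

An agent connector would POST this body with a bearer token and parse the JSON content out of the first choice in the response.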

Anyscale
7 tools
Orchestrate your Anyscale infrastructure — manage LLM queries, vectors, services, and cluster batch jobs directly from your AI agent.

