Groq MCP Server
By Freeby AI Labs · v1.0.0 · Added Jan 30, 2025

Ultra-fast LLM inference on Groq's LPU (Language Processing Unit) chips.
Features
- Chat completions
- Low latency
- Llama models
- Mixtral
- Streaming
Installation
Quick install
npx -y mcp-groq
Add to claude_desktop_config.json
{
  "mcpServers": {
    "groq": {
      "command": "npx",
      "args": ["-y", "mcp-groq"]
    }
  }
}
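Groq's API requires an API key, and MCP servers conventionally receive credentials through an "env" block in the same config entry. A minimal sketch of that pattern, assuming mcp-groq reads a GROQ_API_KEY environment variable (the variable name is an assumption, not confirmed by this page):

{
  "mcpServers": {
    "groq": {
      "command": "npx",
      "args": ["-y", "mcp-groq"],
      "env": {
        "GROQ_API_KEY": "your-api-key-here"
      }
    }
  }
}

Restart Claude Desktop after editing the config so the server is picked up.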