Ollama

Free · by AI Labs

Local LLM management with model pull/push/list, multimodal support, and private inference.

v1.0.0 · Added Jan 18, 2025
Tags: local-llm, privacy, inference
Works with: Claude, GPT, Gemini, Copilot

Ollama MCP Server

Features

  • Local LLMs
  • Model management
  • Multimodal
  • Privacy-first
  • Custom models
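To illustrate the kind of local, private inference the server wraps, the sketch below talks directly to Ollama's standard HTTP API on its default port (11434). This is a minimal sketch, not the MCP server's own implementation; the model name `llama3` and the helper names are illustrative assumptions.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API endpoint


def build_generate_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's POST /api/generate (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def list_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Names of locally installed models, via GET /api/tags."""
    with request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


def generate(prompt: str, model: str = "llama3",
             base_url: str = OLLAMA_URL) -> str:
    """One-shot completion via POST /api/generate; returns the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = request.Request(
        f"{base_url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Because everything runs against localhost, prompts and completions never leave the machine, which is what the privacy-first feature refers to.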

Installation

Add the following block to your MCP client's configuration file (for Claude Desktop, this is claude_desktop_config.json):

{
  "mcpServers": {
    "ollama-mcp": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
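The server assumes a local Ollama daemon is installed and has at least one model available. A typical setup might look like the following; the model name `llama3` is only an example:

```shell
# Start the Ollama daemon (often already running as a background service)
ollama serve &

# Download a model to run locally
ollama pull llama3

# Confirm the model is installed
ollama list
```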
