Ollama

Run local LLMs with Ollama integration

Features

ai, local, llm

Quick Install

npx -y @modelcontextprotocol/server-ollama
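
The npx command above downloads and runs the server package on demand; it assumes Node.js is installed and that a local Ollama instance is reachable on its default port (11434). A minimal pre-flight check, with llama3.2 used purely as an example model:

# Start the Ollama daemon if it is not already running
ollama serve &

# Pull an example model (any locally available model works)
ollama pull llama3.2

# Confirm the Ollama API answers on the default port
curl http://localhost:11434/api/tags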

Configuration

Config file location:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama"
      ]
    }
  }
}
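
This entry tells Claude Desktop to launch the Ollama MCP server by running the npx command above each time the app starts, so no separate installation step is required.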

Step 1: Copy the configuration above

Step 2: Open or create the config file at the path shown above

Step 3: Merge with your existing config (add the entry to the mcpServers object; see the example after these steps) or use the snippet as-is if the file is new

Step 4: Restart Claude Desktop so it picks up the new server
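
As an example of the merge in step 3: if claude_desktop_config.json already defines other servers, the ollama entry is added as one more key inside the existing mcpServers object. The filesystem server below is only a placeholder for whatever your config already contains:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    },
    "ollama": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama"
      ]
    }
  }
}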

Details

Author: ollama
Type: Community