MCP Directory

ollama-mcp

by rawveg · TypeScript · ★ 155

An MCP Server for Ollama

#claude #claude-code #cline #cursor #llm #mcp #mcp-server #mcp-servers #mcp-tools #ollama #ollama-api #ollama-client #ollama-cloud #open-source #windsurf #windsurf-ai

Install

npx -y github:rawveg/ollama-mcp

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "github:rawveg/ollama-mcp"
      ]
    }
  }
}
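If your Ollama instance is not running on the default local port, the standard Claude Desktop config schema also accepts an `env` map per server entry. A minimal sketch, assuming the server honours Ollama's usual `OLLAMA_HOST` environment variable (not confirmed by this listing — check the project README):

```json
{
  "mcpServers": {
    "ollama-mcp": {
      "command": "npx",
      "args": ["-y", "github:rawveg/ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

`http://localhost:11434` is Ollama's default listen address; replace it with your own host and port if Ollama runs elsewhere.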

From the README

**Supercharge your AI assistant with local LLM access**

An MCP (Model Context Protocol) server that exposes the complete Ollama SDK as MCP tools, enabling seamless integration between your local LLM models and MCP-compatible applications like Claude Desktop and Cline.

[Features](#-features) • [Installation](#-installation) • [Available Tools](#-available-tools) • [Configuration](#-configuration) • [Retry Behavior](#-r…
Read full README on GitHub →

