MCP Directory

mcp-client-for-ollama

by jonigl · Python · ★ 678

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmations, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.

#agentic-ai #ai #command-line-tool #generative-ai #linux #llm #local-llm #macos #mcp #mcp-client #mcp-server #model-context-protocol #ollama #open-source #pypi-package #sse #stdio #streamable-http #tool-management #windows

Install

pip install git+https://github.com/jonigl/mcp-client-for-ollama.git
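Once installed, the client runs from the terminal. A minimal launch sketch, assuming the package installs an `ollmcp` entry point (the flag names below are illustrative assumptions and should be verified against `ollmcp --help`):

```shell
# Launch the TUI client (entry-point name assumed from the project)
ollmcp

# Assumed flags: choose a local Ollama model and load a servers config
ollmcp --model qwen2.5 --servers-json ./mcp-servers.json
```

This requires a running Ollama instance with at least one model pulled locally.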

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "uvx",
      "args": [
        "git+https://github.com/jonigl/mcp-client-for-ollama.git"
      ]
    }
  }
}
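The block above registers this tool inside Claude Desktop. When used standalone, the client is the one connecting out to MCP servers, and clients in this ecosystem typically accept the same Claude-style `mcpServers` JSON. A hedged example of such a servers file (the server name and the filesystem server shown are illustrative, not taken from this listing):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

Each entry names a server and gives the command used to spawn it over stdio; SSE and streamable-HTTP transports are also listed among the project's tags.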

From the README

A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools. The README's keyboard-shortcut table did not survive extraction; the recoverable actions are: abort the current response generation while the model is generating, clear conversation history and context, clear the terminal screen, toggle context retention, and display context…
Read full README on GitHub →

