MCP Directory

omega-memory

by omega-memory · Python · ★ 109

Persistent memory for AI coding agents

#ai-agent #ai-memory #claude #claude-code #coding-agent #context-engineering #cursor #knowledge-graph #llm #local-first #mcp #mcp-server #memory #model-context-protocol #persistent-memory #semantic-search #windsurf

Install

pip install git+https://github.com/omega-memory/omega-memory.git

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "omega-memory": {
      "command": "uvx",
      "args": [
        "git+https://github.com/omega-memory/omega-memory.git"
      ]
    }
  }
}
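If you manage `claude_desktop_config.json` from a script, the entry above can be merged without clobbering servers you already have. A minimal Python sketch; the config path shown is the macOS default and is an assumption (it differs on Windows and Linux):

```python
import json
from pathlib import Path

# macOS default location (assumption; on Windows it lives under %APPDATA%\Claude)
CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"


def add_omega_memory(config: dict) -> dict:
    """Merge the omega-memory server entry, preserving any existing servers."""
    servers = config.setdefault("mcpServers", {})
    servers["omega-memory"] = {
        "command": "uvx",
        "args": ["git+https://github.com/omega-memory/omega-memory.git"],
    }
    return config


def update_config(path: Path = CONFIG_PATH) -> None:
    """Read the config if present, merge the entry, and write it back."""
    config = json.loads(path.read_text()) if path.exists() else {}
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(add_omega_memory(config), indent=2))
```

Using `setdefault` on `"mcpServers"` means the merge works whether or not the key already exists, so other configured servers are left untouched.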

From the README

**Cross-model memory for AI agents. Local-first. Works with Claude, GPT, Gemini, Cursor, Claude Code, and any MCP client.** Your agent's brain shouldn't live on someone else's server, or be locked to one provider.

The README includes a comparison table (column headers truncated in this excerpt):

| Works with any LLM/agent | **Yes** | Claude only | Yes | Yes |
| Your data stays on your machine | **Yes** | Partial* | No | No |
| No cloud dependency | **Yes** | No (needs API) | No | No |
| Semantic search + knowledge graph | **Yes** | …
Read full README on GitHub →

