MCP Directory

headroom

by chopratejas · Python · ★ 1,615

The Context Optimization Layer for LLM Applications

#agent #ai #anthropic #compression #context-engineering #context-window #fastapi #langchain #llm #mcp #openai #proxy #python #rag #token-optimization

Install

pip install git+https://github.com/chopratejas/headroom.git

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "headroom": {
      "command": "uvx",
      "args": [
        "git+https://github.com/chopratejas/headroom.git"
      ]
    }
  }
}
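With this config, Claude Desktop starts the server by invoking `uvx` with the Git URL. You can sketch the same launch manually in a terminal to check that the package resolves and starts (assumes `uvx` is on your PATH; it ships with the `uv` tool):

```shell
# Launch the headroom MCP server by hand, exactly as the
# Claude Desktop config above does: uvx fetches the package
# from the Git URL and runs its entry point.
uvx git+https://github.com/chopratejas/headroom.git
```

The server then waits for an MCP client on its standard transport, so a successful start with no immediate error output is the expected result when run outside Claude Desktop.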

From the README

**Compress everything your AI agent reads. Same answers, fraction of the tokens.** Works with Anthropic, OpenAI, Google, Bedrock, Vertex, Azure, OpenRouter, and 100+ models via LiteLLM. **Wrap your coding agent with one command.** **Drop it into…**
Read full README on GitHub →

