MCP Directory

dario

by askalf · JavaScript · ★ 121

Local LLM router. One endpoint for Claude Max/Pro, OpenAI, OpenRouter, Groq, Ollama, LiteLLM, any OpenAI-compat URL — your tools don't need to change. OAuth for Claude subscriptions, multi-account pool, MCP server. Zero runtime deps.

#ai #anthropic #api #claude #claude-max #claude-pro #cli #developer-tools #groq #litellm #llm #llm-router #multi-provider #oauth #ollama #openai #openai-compat #openrouter #proxy

Install

npx -y github:askalf/dario

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "dario": {
      "command": "npx",
      "args": [
        "-y",
        "github:askalf/dario"
      ]
    }
  }
}

From the README

dario

A local LLM router. One endpoint, every provider. Runs on your machine. Unifies OpenAI, Groq, OpenRouter, Ollama, vLLM, LiteLLM, any OpenAI-compat URL, and your Claude Max subscription (via OAuth) behind one endpoint at http://localhost:3456. Speaks both the Anthropic Messages API and the OpenAI Chat Completions API, so your tools stop caring which vendor is upstream. Drops in under the Claude Agent SDK as an API-key-compatible backend. Zero runtime dependencies. SLSA-attested on every release. Nothing phones home. Independent, unofficial, third-party — see DISCLAIMER.md. …
Read full README on GitHub →

