MCP Directory

llm-search

by snexus · Jupyter Notebook · ★ 652

Querying local documents, powered by LLM

#chatbot #chroma #hyde #langchain-python #large-language-models #llm #mcp #openai-chatgpt #rag #reranking #retrieval-augmented-generation #splade #streamlit

Install

git clone https://github.com/snexus/llm-search.git
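Cloning only fetches the source. llm-search is a Python project, so a typical follow-up (an assumed workflow, not taken from this page; check the project README for the exact steps) is to install it into a virtual environment:

```shell
cd llm-search
# create an isolated environment (assumed convention; see the README for specifics)
python -m venv .venv
source .venv/bin/activate
# install the package and its dependencies from the cloned source
pip install .
```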

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "llm-search": {
      "command": "npx",
      "args": [
        "-y",
        "github:snexus/llm-search"
      ]
    }
  }
}
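With an entry like this in place, Claude Desktop launches the configured command and speaks JSON-RPC 2.0 to it over stdio, starting with an `initialize` request. A minimal sketch of that handshake message (field names follow the MCP specification; the `clientInfo` values and protocol revision shown are illustrative):

```python
import json

def make_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 `initialize` request an MCP client
    sends to a freshly launched server over stdio."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # an MCP protocol revision
            "capabilities": {},
            # clientInfo values are made up for illustration
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(request)

print(make_initialize_request())
```

The server replies with its own capabilities, after which the client can list and call the tools the server exposes.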

From the README

[Open the demo notebook in Google Colab](https://githubtocolab.com/snexus/llm-search/blob/main/notebooks/llmsearch_google_colab_demo.ipynb) · [Documentation](https://llm-search.readthedocs.io/en/latest/)

The purpose of this package is to offer an advanced question-answering (RAG) system with a simple YAML-based configuration that enables interaction with a collection of local documents. Special attention is given to improvements in various components of the system **in addition to basic LLM-based RAGs** - better document parsing, hybrid search, HyDE, chat history, deep linking, re-ranking, the ability to customize embeddings, and…
Read full README on GitHub →
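The "hybrid search" mentioned above refers to combining a sparse retriever (e.g. SPLADE) with a dense embedding retriever. One common way to merge two ranked result lists, shown here as a generic sketch rather than llm-search's actual implementation, is reciprocal rank fusion:

```python
def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Merge several ranked lists of document ids.
    Each document scores sum(1 / (k + rank)) over the lists it appears in;
    k=60 is the constant suggested in the original RRF paper."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

sparse = ["doc_a", "doc_b", "doc_c"]  # e.g. SPLADE results
dense = ["doc_b", "doc_c", "doc_a"]   # e.g. embedding results
print(reciprocal_rank_fusion([sparse, dense]))  # → ['doc_b', 'doc_a', 'doc_c']
```

Documents ranked consistently high in both lists win out, without needing to normalize the two retrievers' incompatible score scales.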

