MCP Directory

LLaMa-MCP-Streamlit

by Nikunj2003 · Python · ★ 43

AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).

#llama #llm #mcp #mcp-client #mcp-llama #mcp-server #model-context-protocol #nvidia-nim-api #ollama #python #streamlit

Install

pip install git+https://github.com/Nikunj2003/LLaMa-MCP-Streamlit.git

Claude Desktop config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "llama-mcp-streamlit": {
      "command": "uvx",
      "args": [
        "git+https://github.com/Nikunj2003/LLaMa-MCP-Streamlit.git"
      ]
    }
  }
}
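
After saving the file, restart Claude Desktop so the new server entry is picked up; MCP servers listed in this config are launched when the app starts.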

From the README

This project is an interactive AI assistant built with **Streamlit**, **NVIDIA NIM's API (LLaMa 3.3:70b) / Ollama**, and the **Model Context Protocol (MCP)**. It provides a conversational interface where you can interact with an LLM to execute real-time external tools via MCP, retrieve data, and perform actions seamlessly. The assistant supports:

- **Custom model selection** (NVIDIA NIM / Ollama)
- **API configuration** for different backends
- **Tool integration via MCP** to enhance usability and real-time data processing
- **A user-friendly chat-based experience** with Streamlit

Before running…
Read full README on GitHub →
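
To make the architecture concrete, here is a minimal editorial sketch (not code from the repository) of a Streamlit chat loop that can target either backend through their OpenAI-compatible endpoints. The endpoint URLs, model identifiers, the NVIDIA_API_KEY variable, and the file name are assumptions for illustration; see the README for the project's actual configuration and MCP tool handling.

# sketch.py -- hypothetical, simplified illustration; not the repository's module layout
import os

import streamlit as st
from openai import OpenAI  # NVIDIA NIM and Ollama both expose OpenAI-compatible endpoints

# Assumed backend presets; adjust endpoints and model ids to your deployment
BACKENDS = {
    "NVIDIA NIM": {
        "base_url": "https://integrate.api.nvidia.com/v1",
        "model": "meta/llama-3.3-70b-instruct",
        "api_key": os.getenv("NVIDIA_API_KEY", ""),
    },
    "Ollama": {
        "base_url": "http://localhost:11434/v1",
        "model": "llama3.3",
        "api_key": "ollama",  # Ollama ignores the key, but the SDK requires a value
    },
}

choice = st.sidebar.selectbox("Model backend", list(BACKENDS))
cfg = BACKENDS[choice]
client = OpenAI(base_url=cfg["base_url"], api_key=cfg["api_key"])

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # The project layers MCP tool discovery and execution around this call;
    # that part is omitted from the sketch for brevity.
    response = client.chat.completions.create(
        model=cfg["model"],
        messages=st.session_state.messages,
    )
    answer = response.choices[0].message.content
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)

Launch the sketch with streamlit run sketch.py; the actual project adds backend API configuration and MCP tool execution on top of a loop like this.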
