# mcp
Facilitates executing system commands and retrieving web data using the Brave Search API by interpreting user intents via a Large Language Model (LLM).
This project implements a client-server architecture using MCP (Model Context Protocol) to handle user prompts, determine their intent using a Large Language Model (LLM), and route them to the appropriate service for execution. The system consists of two main components:
The LLM determines whether the user request requires command execution or a web search. If the prompt is unclear, the LLM asks follow-up questions before generating a structured JSON response specifying the tool name (`command_execution` or `fetch_web_data`) and its required arguments.
- `command_execution`: Executes system commands using Python's `subprocess` module.
- `fetch_web_data`: Retrieves web data using the Brave Search API.
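The routing described above can be sketched as follows. The JSON field names (`tool`, `arguments`, `command`, `query`) are assumptions for illustration, not the project's exact schema:

```python
import json
import subprocess

def dispatch(llm_response: str) -> str:
    """Route a structured LLM tool call to the matching service (illustrative schema)."""
    call = json.loads(llm_response)
    if call["tool"] == "command_execution":
        # Run the requested shell command via subprocess and return its output.
        result = subprocess.run(
            call["arguments"]["command"],
            shell=True, capture_output=True, text=True,
        )
        return result.stdout
    elif call["tool"] == "fetch_web_data":
        # A real implementation would call the Brave Search API here.
        return f"searching the web for: {call['arguments']['query']}"
    raise ValueError(f"unknown tool: {call['tool']}")

# Example: the LLM decided the prompt needs a command execution.
print(dispatch('{"tool": "command_execution", "arguments": {"command": "echo hello"}}'))
```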
```bash
git clone https://github.com/mjunaid46/mcp
cd mcp

# Create a virtual environment
uv venv

# Activate the virtual environment
# On Unix or macOS:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
python -m ensurepip
```
1. Install the Ollama CLI tool by following the instructions in the Ollama Installation Guide.
2. Check the installed Ollama models:

```bash
ollama list
```

3. Specify the model in the client command (`llama3` or `llama2`):

```bash
uv run client/client.py server/command_server.py server/web_server.py ollama llama3
```
Create a `.env` file to store Groq's API key:

```bash
touch .env
```

Add your Groq API key to the `.env` file:

```
GROQ_API_KEY=<your_groq_api_key_here>
```

Add your Brave Search API key to the `.env` file:

```
BRAVE_SEARCH_API_KEY=<your_brave_search_api_key_here>
```
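At startup the servers need these keys available in the environment. A minimal sketch of a `.env` loader is below; this is an illustration, and the project may instead rely on a library such as `python-dotenv`:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: copy KEY=value lines into os.environ.
    (A sketch; the project may use python-dotenv instead.)"""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            # Do not overwrite variables already set in the environment.
            os.environ.setdefault(key.strip(), value.strip())

if Path(".env").exists():
    load_env()
```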
Run the client with Ollama:

```bash
uv run client/client.py server/command_server.py server/web_server.py ollama llama3
```

Or with Groq:

```bash
uv run client/client.py server/command_server.py server/web_server.py groq
```
Give a query to the client (e.g., `touch test.txt`, `create text file with test`, `rm test.txt file`, etc.)
```
# Try the prompts below one by one to test.
What is the capital of Pakistan?
What is MCP?
Create a file in my present working directory
```
```bash
git clone https://github.com/mjunaid46/mcp/
cd mcp
```
Modify the `config.ini` file to specify the model type and name:

```ini
[settings]
model_type = ollama  # Change to "groq" if using Groq
model_name = llama3  # Update model name if needed
```
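One subtlety with this file: Python's standard `configparser` treats the `#` comments after the values as part of the value unless inline comments are explicitly enabled. A sketch of how the client might read the settings (the section and key names follow the sample above; the parsing approach is an assumption):

```python
import configparser

# Enable "# ..." inline comments so the trailing notes in config.ini
# are stripped from the parsed values.
config = configparser.ConfigParser(inline_comment_prefixes=("#",))
config.read_string("""
[settings]
model_type = ollama  # Change to "groq" if using Groq
model_name = llama3  # Update model name if needed
""")

model_type = config["settings"]["model_type"]
model_name = config["settings"]["model_name"]
print(model_type, model_name)  # → ollama llama3
```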
```bash
docker-compose build
docker-compose run pull-model-client
```