Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
To install Ollama MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
Install globally via npm:
npm install -g @rawveg/ollama-mcp
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
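Because the configured command is npx with the -y flag, the host application downloads and runs @rawveg/ollama-mcp on demand, so the global npm install above is optional when using this configuration.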
The settings file location varies by application:
- Claude Desktop: claude_desktop_config.json in the Claude app data directory
- Cline: cline_mcp_settings.json in the VS Code global storage
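For reference, the Claude Desktop file is typically found at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS and %APPDATA%\Claude\claude_desktop_config.json on Windows (exact paths may vary by version). Restart the application after editing the file so it picks up the new server.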
Simply run:
ollama-mcp
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
PORT=3457 ollama-mcp
- PORT: Server port (default: 3456). It can be set both when running directly and during Smithery installation:
# When running directly
PORT=3457 ollama-mcp
# When installing via Smithery
PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
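When running directly with an overridden port, a quick way to confirm the server is up is to query it on that port (this assumes the endpoints listed below are served over plain HTTP):
# Should respond if the server is listening on the overridden port
curl http://localhost:3457/models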
- OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)
The server exposes the following endpoints:
- GET /models - List available models
- POST /models/pull - Pull a new model
- POST /chat - Chat with a model
- GET /models/:name - Get model details
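As a quick smoke test, the two read-only endpoints can be exercised with curl. This is a sketch assuming the server is running on the default port and that these endpoints are served as plain HTTP; "llama3" is only a placeholder for a model you have pulled in Ollama:
# List the models available to the server
curl http://localhost:3456/models
# Get details for one model ("llama3" is a placeholder name)
curl http://localhost:3456/models/llama3
To build and run the server from source, first clone the repository: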
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
Install dependencies:
npm install
Build the project:
npm run build
Start the server:
npm start
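The environment variables described above apply when running from source as well; as a sketch, assuming npm start launches the built server directly:
# Run the local build on a custom port against a remote Ollama instance
# (the host below is a placeholder; substitute your own)
PORT=3457 OLLAMA_API=http://192.168.0.10:11434 npm start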
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License.