A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.
Quick Install Guide
- Install the package globally:
  npm install -g @qpd-v/mcp-server-ragdocs
- Start Qdrant (using Docker):
  docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
- Ensure Ollama is running with the default embedding model:
  ollama pull nomic-embed-text
- Add to your configuration file:
  - For Cline: %AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json
  - For Roo-Code: %AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json
  - For Claude Desktop: %AppData%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
- Verify installation:
  # Check Qdrant is running
  curl http://localhost:6333/collections
  # Check Ollama has the model
  ollama list | grep nomic-embed-text
Version
Current version: 0.1.6
Features
- Add documentation from URLs or local files
- Store documentation in a vector database for semantic search
- Search through documentation using natural language
- List all documentation sources
Installation
Install globally using npm:
npm install -g @qpd-v/mcp-server-ragdocs
This installs the server in your global npm directory; you'll need that install path for the configuration steps below.
Requirements
- Node.js 16 or higher
- Qdrant (either local or cloud)
- One of the following for embeddings:
- Ollama running locally (default, free)
- OpenAI API key (optional, paid)
Qdrant Setup Options
Option 1: Local Qdrant
- Using Docker (recommended; a persistent-storage variant is sketched below):
  docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
- Or download from Qdrant's website
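Note that the plain docker run command keeps the vectors inside the container, so indexed documentation disappears when the container is removed. If you want the data to persist, one option is to mount a host directory at Qdrant's default storage path (the qdrant_storage directory name here is only an example):
docker run -p 6333:6333 -p 6334:6334 \
  -v $(pwd)/qdrant_storage:/qdrant/storage \
  qdrant/qdrant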
Option 2: Qdrant Cloud
- Create an account at Qdrant Cloud
- Create a new cluster
- Get your cluster URL and API key from the dashboard
- Use these in your configuration (see Configuration section below)
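Before adding the cluster to your configuration, it can help to confirm the URL and API key work. Qdrant accepts the key in an api-key header, so a quick check (with your own values substituted for the placeholders) looks like this:
curl -H "api-key: your-qdrant-api-key" https://your-cluster-url.qdrant.tech/collections
A JSON response listing collections (possibly empty) means the credentials are accepted.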
Configuration
The server can be used with both Cline/Roo and Claude Desktop. Configuration differs slightly between them:
Cline Configuration
Add to your Cline settings file (%AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json)
AND/OR
Add to your Roo-Code settings file (%AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json):
- Using npm global install (recommended):
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
For OpenAI instead of Ollama:
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
- Using local development setup:
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["PATH_TO_PROJECT/mcp-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
Claude Desktop Configuration
Add to your Claude Desktop config file:
- Windows: %AppData%\Claude\claude_desktop_config.json
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows Setup with Ollama (using full paths):
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
- Windows Setup with OpenAI:
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
- macOS Setup with Ollama:
{
  "mcpServers": {
    "ragdocs": {
      "command": "/usr/local/bin/node",
      "args": [
        "/usr/local/lib/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
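If Node.js or the global npm directory is installed somewhere other than the paths shown above, you can look up the values to use for "command" and "args" (assuming node and npm are on your PATH):
# print the node binary location (macOS/Linux; use "where node" on Windows)
which node
# print the global node_modules directory that contains @qpd-v/mcp-server-ragdocs
npm root -g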
Qdrant Cloud Configuration
For either Cline or Claude Desktop, when using Qdrant Cloud, modify the env section:
With Ollama:
{
  "env": {
    "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
    "QDRANT_API_KEY": "your-qdrant-api-key",
    "EMBEDDING_PROVIDER": "ollama",
    "OLLAMA_URL": "http://localhost:11434"
  }
}
With OpenAI:
{
  "env": {
    "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
    "QDRANT_API_KEY": "your-qdrant-api-key",
    "EMBEDDING_PROVIDER": "openai",
    "OPENAI_API_KEY": "your-openai-api-key"
  }
}
Environment Variables
Qdrant Configuration
- QDRANT_URL (required): URL of your Qdrant instance
  - For local: http://localhost:6333
  - For cloud: https://your-cluster-url.qdrant.tech
- QDRANT_API_KEY (required for cloud): Your Qdrant Cloud API key
Embeddings Configuration
- EMBEDDING_PROVIDER (optional): Choose between 'ollama' (default) or 'openai'
- EMBEDDING_MODEL (optional):
  - For Ollama: defaults to 'nomic-embed-text'
  - For OpenAI: defaults to 'text-embedding-3-small'
- OLLAMA_URL (optional): URL of your Ollama instance (defaults to http://localhost:11434)
- OPENAI_API_KEY (required if using OpenAI): Your OpenAI API key
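To check the embedding settings before wiring them into a client, you can ask Ollama for an embedding directly. This assumes the default model and OLLAMA_URL and uses Ollama's /api/embeddings endpoint:
curl http://localhost:11434/api/embeddings -d '{"model": "nomic-embed-text", "prompt": "test sentence"}'
A JSON response containing an embedding array means the provider is ready; an error usually means the model has not been pulled yet.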
Tools
add_documentation
- Add documentation from a URL to the RAG database
- Parameters:
  - url: URL of the documentation to fetch
search_documentation
- Search through stored documentation
- Parameters:
  - query: Search query
  - limit (optional): Maximum number of results to return (default: 5)
list_sources
- List all documentation sources currently stored
- No parameters required
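To exercise these tools outside of a chat client, one option is the MCP Inspector (@modelcontextprotocol/inspector), which launches the server over stdio and lets you call each tool by hand. A rough sketch, assuming the macOS global install path shown above and a local Qdrant/Ollama setup:
# environment the server expects (same values as in the config file)
export QDRANT_URL=http://127.0.0.1:6333
export EMBEDDING_PROVIDER=ollama
export OLLAMA_URL=http://localhost:11434
# start the server under the Inspector (adjust the path to your install)
npx @modelcontextprotocol/inspector node /usr/local/lib/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js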
Example Usage
In Claude Desktop or any other MCP-compatible client:
- Add documentation:
  Add this documentation: https://docs.example.com/api
- Search documentation:
  Search the documentation for information about authentication
- List sources:
  What documentation sources are available?
Development
- Clone the repository:
  git clone https://github.com/qpd-v/mcp-server-ragdocs.git
  cd mcp-ragdocs
- Install dependencies:
  npm install
- Build the project:
  npm run build
- Run locally:
  npm start
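Since the server reads its settings from environment variables, a quick local test outside of any MCP client can supply them inline (values here assume the default local Qdrant and Ollama setup described earlier):
QDRANT_URL=http://127.0.0.1:6333 EMBEDDING_PROVIDER=ollama OLLAMA_URL=http://localhost:11434 npm start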
License
MIT
Troubleshooting
Common Issues
- Qdrant Connection Error
  Error: Failed to connect to Qdrant at http://localhost:6333
  - Check if Docker is running
  - Verify the Qdrant container is running: docker ps | grep qdrant
  - Try restarting the container
- Ollama Model Missing
  Error: Model nomic-embed-text not found
  - Run: ollama pull nomic-embed-text
  - Verify the model is installed: ollama list
- Configuration Path Issues
  - Windows: Replace YOUR_USERNAME with your actual Windows username
  - Check file permissions
  - Verify the paths exist
- npm Global Install Issues
  - Try installing with admin privileges
  - Check npm is in PATH: npm -v
  - Verify global installation: npm list -g @qpd-v/mcp-server-ragdocs
For other issues, please check:
- Docker logs: docker logs $(docker ps -q --filter ancestor=qdrant/qdrant)
- Ollama status: ollama list
- Node.js version: node -v (should be 16 or higher)
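If none of the above pinpoints the problem, running the checks end to end can narrow it down (ports and package name assume the default setup described in this README):
# Qdrant reachable?
curl -s http://localhost:6333/collections
# Ollama running and the embedding model pulled?
curl -s http://localhost:11434/api/tags | grep nomic-embed-text
# Node.js 16+ and the server installed globally?
node -v
npm list -g @qpd-v/mcp-server-ragdocs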
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.