Databricks MCP Server
A server that implements the Model Context Protocol (MCP) to allow LLMs to interact with Databricks resources, including clusters, jobs, notebooks, and SQL execution, through natural language.
This allows LLM-powered tools to work with Databricks clusters, jobs, notebooks, and more. The Databricks MCP Server exposes MCP tools for managing clusters, jobs, and notebooks, and for executing SQL statements against a Databricks workspace.
Install the uv package manager (recommended for MCP servers) if you do not have it already:
# MacOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows (in PowerShell)
irm https://astral.sh/uv/install.ps1 | iex
Restart your terminal after installation.
Clone the repository:
git clone https://github.com/JustTryAI/databricks-mcp-server.git
cd databricks-mcp-server
Set up the project with uv:
# Create and activate virtual environment
uv venv
# On Windows
.\.venv\Scripts\activate
# On Linux/Mac
source .venv/bin/activate
# Install dependencies in development mode
uv pip install -e .
# Install development dependencies
uv pip install -e ".[dev]"
Set up environment variables:
# Windows
set DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
set DATABRICKS_TOKEN=your-personal-access-token
# Linux/Mac
export DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
export DATABRICKS_TOKEN=your-personal-access-token
You can also create a .env file based on the .env.example template.
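For reference, a minimal .env file might look like the following (placeholder values; check .env.example for the exact keys the server reads):
DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
DATABRICKS_TOKEN=your-personal-access-token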
To start the MCP server, run:
# Windows
.\start_mcp_server.ps1
# Linux/Mac
./start_mcp_server.sh
These wrapper scripts will execute the actual server scripts located in the scripts
directory. The server will start and be ready to accept MCP protocol connections.
You can also directly run the server scripts from the scripts directory:
# Windows
.\scripts\start_mcp_server.ps1
# Linux/Mac
./scripts/start_mcp_server.sh
The repository includes utility scripts to quickly view Databricks resources:
# View all clusters
uv run scripts/show_clusters.py
# View all notebooks
uv run scripts/show_notebooks.py
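As an illustration of what these helpers do, a cluster listing can be fetched from the standard Databricks REST API. The sketch below only assumes the DATABRICKS_HOST and DATABRICKS_TOKEN variables set earlier; it is not the repository's actual show_clusters.py implementation:

# list_clusters_example.py - minimal sketch, not the repository's show_clusters.py
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
token = os.environ["DATABRICKS_TOKEN"]

# The Clusters API (GET /api/2.0/clusters/list) returns the clusters in the workspace.
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])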
databricks-mcp-server/
├── src/ # Source code
│ ├── __init__.py # Makes src a package
│ ├── __main__.py # Main entry point for the package
│ ├── main.py # Entry point for the MCP server
│ ├── api/ # Databricks API clients
│ ├── core/ # Core functionality
│ ├── server/ # Server implementation
│ │ ├── databricks_mcp_server.py # Main MCP server
│ │ └── app.py # FastAPI app for tests
│ └── cli/ # Command-line interface
├── tests/ # Test directory
├── scripts/ # Helper scripts
│ ├── start_mcp_server.ps1 # Server startup script (Windows)
│ ├── run_tests.ps1 # Test runner script
│ ├── show_clusters.py # Script to show clusters
│ └── show_notebooks.py # Script to show notebooks
├── examples/ # Example usage
├── docs/ # Documentation
└── pyproject.toml # Project configuration
See project_structure.md for a more detailed view of the project structure.
The project uses the following linting tools:
# Run all linters
uv run pylint src/ tests/
uv run flake8 src/ tests/
uv run mypy src/
The project uses pytest for testing. To run the tests:
# Run all tests with our convenient script
.\scripts\run_tests.ps1
# Run with coverage report
.\scripts\run_tests.ps1 -Coverage
# Run specific tests with verbose output
.\scripts\run_tests.ps1 -Verbose -Coverage tests/test_clusters.py
You can also run the tests directly with pytest:
# Run all tests
uv run pytest tests/
# Run with coverage report
uv run pytest --cov=src tests/ --cov-report=term-missing
The project targets a minimum code coverage of 80%.
API documentation is available in the docs/api directory, and the examples/ directory contains usage examples. To run the examples:
# Run example scripts with uv
uv run examples/direct_usage.py
uv run examples/mcp_client_usage.py
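If you want to experiment outside the bundled examples, the server can also be exercised with the official MCP Python SDK (pip install mcp). The sketch below assumes the server is launched via the start script and speaks MCP over stdio; it only lists the available tools rather than naming any specific ones, and it is not the repository's mcp_client_usage.py:

# mcp_client_sketch.py - minimal sketch using the MCP Python SDK
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server as a subprocess speaking MCP over stdio
    # (Linux/Mac start script assumed; adjust the command for Windows).
    params = StdioServerParameters(command="./start_mcp_server.sh")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())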
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.