supabase mcp server


This server enables interaction with Supabase PostgreSQL databases through the MCP protocol, allowing seamless integration with Cursor and Windsurf IDEs for secure and validated database management.



Enable your favorite IDE to safely execute SQL queries, manage your database end-to-end, access Management API, and handle user authentication with built-in safety controls.

Control Supabase with natural language


The Future of Supabase MCP Server → Query MCP

I'm thrilled to announce that Supabase MCP Server is evolving into thequery.dev!

While I have big plans for the future, I want to make these commitments super clear:

  • The core tool will stay free forever - free & open-source software is how I got into coding
  • Premium features will be added on top - enhancing capabilities without limiting existing functionality
  • First 2,000 early adopters will get special perks - join early for an exclusive treat!

BIG v4 Launch Coming Soon!

Join Early Access at thequery.dev

Table of contents

  • Getting started
  • Feature overview
  • Troubleshooting
  • Changelog

✨ Key features

  • Compatible with Cursor, Windsurf, Cline and other MCP clients supporting stdio protocol
  • Control read-only and read-write modes of SQL query execution
  • Runtime SQL query validation with risk level assessment
  • Three-tier safety system for SQL operations: safe, write, and destructive
  • Robust transaction handling for both direct and pooled database connections
  • Automatic versioning of database schema changes
  • Manage your Supabase projects with the Supabase Management API
  • Manage users with Supabase Auth Admin methods via the Python SDK
  • Pre-built tools to help Cursor & Windsurf work with MCP more effectively
  • Dead-simple install & setup via package manager (uv, pipx, etc.)

Getting Started

Prerequisites

Installing the server requires the following on your system:

  • Python 3.12+

If you plan to install via uv, ensure it's installed.

PostgreSQL Installation

PostgreSQL installation is no longer required for the MCP server itself, as it now uses asyncpg which doesn't depend on PostgreSQL development libraries.

However, you'll still need PostgreSQL if you're running a local Supabase instance:

MacOS

brew install postgresql@16

Windows

  • Download and install PostgreSQL 16+ from https://www.postgresql.org/download/windows/
  • Ensure "PostgreSQL Server" and "Command Line Tools" are selected during installation

Step 1. Installation

Since v0.2.0, the server supports package installation. You can use your favorite Python package manager to install it via:

# if pipx is installed (recommended)
pipx install supabase-mcp-server

# if uv is installed
uv pip install supabase-mcp-server

pipx is recommended because it creates isolated environments for each package.

You can also install the server manually by cloning the repository and running pipx install -e . from the root directory.

Installing from source

If you would like to install from source, for example for local development:

uv venv
# On Mac
source .venv/bin/activate
# On Windows
.venv\Scripts\activate
# Install package in editable mode
uv pip install -e .

Installing via Smithery.ai

You can find the full instructions on how to use Smithery.ai to connect to this MCP server here.

Step 2. Configuration

The Supabase MCP server requires configuration to connect to your Supabase database, access the Management API, and use the Auth Admin SDK. This section explains all available configuration options and how to set them up.

Environment Variables

The server uses the following environment variables:

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| SUPABASE_PROJECT_REF | Yes | 127.0.0.1:54322 | Your Supabase project reference ID (or local host:port) |
| SUPABASE_DB_PASSWORD | Yes | postgres | Your database password |
| SUPABASE_REGION | Yes* | us-east-1 | AWS region where your Supabase project is hosted |
| SUPABASE_ACCESS_TOKEN | No | None | Personal access token for Supabase Management API |
| SUPABASE_SERVICE_ROLE_KEY | No | None | Service role key for Auth Admin SDK |

Note: The default values are configured for local Supabase development. For remote Supabase projects, you must provide your own values for SUPABASE_PROJECT_REF and SUPABASE_DB_PASSWORD.

CRITICAL CONFIGURATION NOTE: For remote Supabase projects, you MUST specify the correct region where your project is hosted using SUPABASE_REGION. If you encounter a "Tenant or user not found" error, this is almost certainly because your region setting doesn't match your project's actual region. You can find your project's region in the Supabase dashboard under Project Settings.

Connection Types

Database Connection
  • The server connects to your Supabase PostgreSQL database using the transaction pooler endpoint
  • Local development uses a direct connection to 127.0.0.1:54322
  • Remote projects use the format: postgresql://postgres.[project_ref]:[password]@aws-0-[region].pooler.supabase.com:6543/postgres

⚠️ Important: Session pooling connections are not supported. The server exclusively uses transaction pooling for better compatibility with the MCP server architecture.

Management API Connection
  • Requires SUPABASE_ACCESS_TOKEN to be set
  • Connects to the Supabase Management API at https://api.supabase.com
  • Only works with remote Supabase projects (not local development)
Auth Admin SDK Connection
  • Requires SUPABASE_SERVICE_ROLE_KEY to be set
  • For local development, connects to http://127.0.0.1:54321
  • For remote projects, connects to https://[project_ref].supabase.co

Configuration Methods

The server looks for configuration in this order (highest to lowest priority):

  1. Environment Variables: Values set directly in your environment
  2. Local .env File: A .env file in your current working directory (only works when running from source)
  3. Global Config File:
     • Windows: %APPDATA%\supabase-mcp\.env
     • macOS/Linux: ~/.config/supabase-mcp/.env
  4. Default Settings: Local development defaults (if no other config is found)

⚠️ Important: When using the package installed via pipx or uv, local .env files in your project directory are not detected. You must use either environment variables or the global config file.

Setting Up Configuration

Option 1: Client-Specific Configuration

Set environment variables directly in your MCP client configuration (see client-specific setup instructions in Step 3). Most MCP clients support this approach, which keeps your configuration with your client settings.

Option 2: Global Configuration

Create a global .env configuration file that will be used for all MCP server instances:

# Create config directory
# On macOS/Linux
mkdir -p ~/.config/supabase-mcp
# On Windows (PowerShell)
mkdir -Force "$env:APPDATA\supabase-mcp"

# Create and edit .env file
# On macOS/Linux
nano ~/.config/supabase-mcp/.env
# On Windows (PowerShell)
notepad "$env:APPDATA\supabase-mcp\.env"

Add your configuration values to the file:

SUPABASE_PROJECT_REF=your-project-ref
SUPABASE_DB_PASSWORD=your-db-password
SUPABASE_REGION=us-east-1
SUPABASE_ACCESS_TOKEN=your-access-token
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key

Option 3: Project-Specific Configuration (Source Installation Only)

If you're running the server from source (not via package), you can create a .env file in your project directory with the same format as above.

Finding Your Supabase Project Information

  • Project Reference: Found in your Supabase project URL: https://supabase.com/dashboard/project/<project-ref>
  • Database Password: Set during project creation or found in Project Settings → Database
  • Access Token: Generate at https://supabase.com/dashboard/account/tokens
  • Service Role Key: Found in Project Settings → API → Project API keys

Supported Regions

The server supports all Supabase regions:

  • us-west-1 - West US (North California)
  • us-east-1 - East US (North Virginia) - default
  • us-east-2 - East US (Ohio)
  • ca-central-1 - Canada (Central)
  • eu-west-1 - West EU (Ireland)
  • eu-west-2 - West Europe (London)
  • eu-west-3 - West EU (Paris)
  • eu-central-1 - Central EU (Frankfurt)
  • eu-central-2 - Central Europe (Zurich)
  • eu-north-1 - North EU (Stockholm)
  • ap-south-1 - South Asia (Mumbai)
  • ap-southeast-1 - Southeast Asia (Singapore)
  • ap-northeast-1 - Northeast Asia (Tokyo)
  • ap-northeast-2 - Northeast Asia (Seoul)
  • ap-southeast-2 - Oceania (Sydney)
  • sa-east-1 - South America (São Paulo)

Limitations

  • No Self-Hosted Support: The server only supports official Supabase.com hosted projects and local development
  • No Connection String Support: Custom connection strings are not supported
  • No Session Pooling: Only transaction pooling is supported for database connections
  • API and SDK Features: Management API and Auth Admin SDK features only work with remote Supabase projects, not local development

Step 3. Usage

In general, any MCP client that supports stdio protocol should work with this MCP server. This server was explicitly tested to work with: - Cursor - Windsurf - Cline - Claude Desktop

Additionally, you can also use smithery.ai to install this server for a number of clients, including the ones above.

Follow the guides below to install this MCP server in your client.

Cursor

Go to Settings -> Features -> MCP Servers and add a new server with this configuration:

# can be set to any name
name: supabase
type: command
# if you installed with pipx
command: supabase-mcp-server
# if you installed with uv
command: uv run supabase-mcp-server
# if the above doesn't work, use the full path (recommended)
command: /full/path/to/supabase-mcp-server  # Find with 'which supabase-mcp-server' (macOS/Linux) or 'where supabase-mcp-server' (Windows)

If the configuration is correct, you should see a green dot indicator and the number of tools exposed by the server. What a successful Cursor configuration looks like

Windsurf

Go to Cascade -> Click on the hammer icon -> Configure -> Fill in the configuration:

{
    "mcpServers": {
      "supabase": {
        "command": "/Users/username/.local/bin/supabase-mcp-server",  // update path
        "env": {
          "SUPABASE_PROJECT_REF": "your-project-ref",
          "SUPABASE_DB_PASSWORD": "your-db-password",
          "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
          "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
          "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
        }
      }
    }
}
If the configuration is correct, you should see a green dot indicator and a clickable supabase server in the list of available servers.

What a successful Windsurf configuration looks like

Claude Desktop

Claude Desktop also supports MCP servers through a JSON configuration. Follow these steps to set up the Supabase MCP server:

  1. Find the full path to the executable (this step is critical):

    # On macOS/Linux
    which supabase-mcp-server
    
    # On Windows
    where supabase-mcp-server
    Copy the full path that is returned (e.g., /Users/username/.local/bin/supabase-mcp-server).

  2. Configure the MCP server in Claude Desktop:

     • Open Claude Desktop
     • Go to Settings → Developer → Edit Config
     • Add a new configuration with the following JSON:
{
  "mcpServers": {
    "supabase": {
      "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
      "env": {
        "SUPABASE_PROJECT_REF": "your-project-ref",
        "SUPABASE_DB_PASSWORD": "your-db-password",
        "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
        "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
        "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
      }
    }
  }
}

⚠️ Important: Unlike Windsurf and Cursor, Claude Desktop requires the full absolute path to the executable. Using just the command name (supabase-mcp-server) will result in a "spawn ENOENT" error.

If configuration is correct, you should see the Supabase MCP server listed as available in Claude Desktop.

What a successful Claude Desktop configuration looks like

Cline

Cline also supports MCP servers through a similar JSON configuration. Follow these steps to set up the Supabase MCP server:

  1. Find the full path to the executable (this step is critical):

    # On macOS/Linux
    which supabase-mcp-server
    
    # On Windows
    where supabase-mcp-server
    Copy the full path that is returned (e.g., /Users/username/.local/bin/supabase-mcp-server).

  2. Configure the MCP server in Cline:

     • Open Cline in VS Code
     • Click on the "MCP Servers" tab in the Cline sidebar
     • Click "Configure MCP Servers" to open the cline_mcp_settings.json file
     • Add the following configuration:
{
  "mcpServers": {
    "supabase": {
      "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
      "env": {
        "SUPABASE_PROJECT_REF": "your-project-ref",
        "SUPABASE_DB_PASSWORD": "your-db-password",
        "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
        "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
        "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
      }
    }
  }
}

If configuration is correct, you should see a green indicator next to the Supabase MCP server in the Cline MCP Servers list, and a message confirming "supabase MCP server connected" at the bottom of the panel.

What a successful Cline configuration looks like

Troubleshooting

Here are some tips & tricks that might help you:

  • Debug installation - run supabase-mcp-server directly from the terminal to see if it works. If it doesn't, there might be an issue with the installation.
  • MCP Server configuration - if the above step works, the server is installed and configured correctly. As long as you provide the right command, the IDE should be able to connect. Make sure to provide the right path to the server executable.
  • "No tools found" error - if you see "Client closed - no tools available" in Cursor despite the package being installed:
    • Find the full path to the executable by running which supabase-mcp-server (macOS/Linux) or where supabase-mcp-server (Windows)
    • Use the full path in your MCP server configuration instead of just supabase-mcp-server
    • For example: /Users/username/.local/bin/supabase-mcp-server or C:\Users\username\.local\bin\supabase-mcp-server.exe
  • Environment variables - to connect to the right database, make sure you either set env variables in mcp_config.json or in a .env file placed in the global config directory (~/.config/supabase-mcp/.env on macOS/Linux or %APPDATA%\supabase-mcp\.env on Windows).
  • Accessing logs - the MCP server writes detailed logs to a file:
    • macOS/Linux: ~/.local/share/supabase-mcp/mcp_server.log
    • Windows: %USERPROFILE%\.local\share\supabase-mcp\mcp_server.log
    • Logs include connection status, configuration details, and operation results
    • View logs using any text editor or terminal commands:

# On macOS/Linux
cat ~/.local/share/supabase-mcp/mcp_server.log

# On Windows (PowerShell)
Get-Content "$env:USERPROFILE\.local\share\supabase-mcp\mcp_server.log"

If you are stuck or any of the instructions above are incorrect, please raise an issue.

MCP Inspector

A super useful tool to help debug MCP server issues is MCP Inspector. If you installed from source, you can run supabase-mcp-inspector from the project repo and it will run the inspector instance. Coupled with logs, this gives you a complete overview of what's happening in the server.

Running supabase-mcp-inspector, if installed from package, doesn't work properly - I will validate and fix in the coming release.

Feature Overview

Database query tools

Since v0.3, the server provides comprehensive database management capabilities with built-in safety controls:

  • SQL Query Execution: Execute PostgreSQL queries with risk assessment
  • Three-tier safety system:

    • safe: Read-only operations (SELECT) - always allowed
    • write: Data modifications (INSERT, UPDATE, DELETE) - require unsafe mode
    • destructive: Schema changes (DROP, CREATE) - require unsafe mode + confirmation
  • SQL Parsing and Validation:

  • Uses PostgreSQL's parser (pglast) for accurate analysis and provides clear feedback on safety requirements

  • Automatic Migration Versioning:

  • Database-altering operations are automatically versioned
  • Generates descriptive names based on operation type and target

  • Safety Controls:

  • Default SAFE mode allows only read-only operations
  • All statements run in transaction mode via asyncpg
  • 2-step confirmation for high-risk operations

  • Available Tools:

  • get_schemas: Lists schemas with sizes and table counts
  • get_tables: Lists tables, foreign tables, and views with metadata
  • get_table_schema: Gets detailed table structure (columns, keys, relationships)
  • execute_postgresql: Executes SQL statements against your database
  • confirm_destructive_operation: Executes high-risk operations after confirmation
  • retrieve_migrations: Gets migrations with filtering and pagination options
  • live_dangerously: Toggles between safe and unsafe modes
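To make the three-tier idea concrete, here is a deliberately simplified sketch of risk classification. The real server parses SQL with pglast for accurate analysis; this first-keyword check is only an illustration and would misclassify many real statements (CTEs, comments, multi-statement batches):

```python
# Simplified illustration of the safe / write / destructive tiers
# described above. NOT the server's actual validator, which uses
# PostgreSQL's parser (pglast) rather than keyword matching.
def classify_sql(statement: str) -> str:
    first = statement.strip().split(None, 1)[0].upper()
    if first in {"SELECT", "EXPLAIN", "SHOW"}:
        return "safe"          # read-only, always allowed
    if first in {"INSERT", "UPDATE", "DELETE"}:
        return "write"         # requires unsafe mode
    return "destructive"       # DDL such as DROP/CREATE: unsafe mode + confirmation
```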

Management API tools

Since v0.3.0, the server provides secure access to the Supabase Management API with built-in safety controls:

  • Available Tools:
  • send_management_api_request: Sends arbitrary requests to Supabase Management API with auto-injection of project ref
  • get_management_api_spec: Gets the enriched API specification with safety information
    • Supports multiple query modes: by domain, by specific path/method, or all paths
    • Includes risk assessment information for each endpoint
    • Provides detailed parameter requirements and response formats
    • Helps LLMs understand the full capabilities of the Supabase Management API
  • get_management_api_safety_rules: Gets all safety rules with human-readable explanations
  • live_dangerously: Toggles between safe and unsafe operation modes

  • Safety Controls:

  • Uses the same safety manager as database operations for consistent risk management
  • Operations categorized by risk level:
    • safe: Read-only operations (GET) - always allowed
    • unsafe: State-changing operations (POST, PUT, PATCH, DELETE) - require unsafe mode
    • blocked: Destructive operations (delete project, etc.) - never allowed
  • Default safe mode prevents accidental state changes
  • Path-based pattern matching for precise safety rules

Note: Management API tools only work with remote Supabase instances and are not compatible with local Supabase development setups.
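Path-based pattern matching might look roughly like the following sketch. The rule set, ordering, and function names here are invented for illustration and do not reflect the server's actual rules:

```python
import fnmatch

# Hypothetical rule table: (HTTP method, path glob, category),
# checked in order, first match wins. Illustrative only.
RULES = [
    ("DELETE", "/v1/projects/*", "blocked"),   # e.g. deleting a project
    ("GET",    "/v1/*",          "safe"),      # read-only, always allowed
    ("*",      "/v1/*",          "unsafe"),    # other state-changing calls
]

def categorize(method: str, path: str) -> str:
    for rule_method, pattern, category in RULES:
        if rule_method in ("*", method.upper()) and fnmatch.fnmatch(path, pattern):
            return category
    return "blocked"  # default-deny anything unmatched
```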

Auth Admin tools

I was planning to add support for Python SDK methods to the MCP server. Upon consideration, I decided to only add support for Auth Admin methods, as I often found myself manually creating test users, which was error-prone and time-consuming. Now I can just ask Cursor to create a test user and it will be done seamlessly. Check out the full Auth Admin SDK method docs to know what it can do.

Since v0.3.6, the server supports direct access to Supabase Auth Admin methods via the Python SDK:

  • Includes the following tools:
    • get_auth_admin_methods_spec to retrieve documentation for all available Auth Admin methods
    • call_auth_admin_method to directly invoke Auth Admin methods with proper parameter handling
  • Supported methods:
    • get_user_by_id: Retrieve a user by their ID
    • list_users: List all users with pagination
    • create_user: Create a new user
    • delete_user: Delete a user by their ID
    • invite_user_by_email: Send an invite link to a user's email
    • generate_link: Generate an email link for various authentication purposes
    • update_user_by_id: Update user attributes by ID
    • delete_factor: Delete a factor on a user (currently not implemented in the SDK)

Why use Auth Admin SDK instead of raw SQL queries?

The Auth Admin SDK provides several key advantages over direct SQL manipulation:

  • Functionality: Enables operations not possible with SQL alone (invites, magic links, MFA)
  • Accuracy: More reliable than creating and executing raw SQL queries on auth schemas
  • Simplicity: Offers clear methods with proper validation and error handling

  • Response format:
    • All methods return structured Python objects instead of raw dictionaries
    • Object attributes can be accessed using dot notation (e.g., user.id instead of user["id"])
  • Edge cases and limitations:
    • UUID validation: Many methods require a valid UUID format for user IDs and will return specific validation errors
    • Email configuration: Methods like invite_user_by_email and generate_link require email sending to be configured in your Supabase project
    • Link types: When generating links, different link types have different requirements:
      • signup links don't require the user to exist
      • magiclink and recovery links require the user to already exist in the system
    • Error handling: The server provides detailed error messages from the Supabase API, which may differ from the dashboard interface
    • Method availability: Some methods like delete_factor are exposed in the API but not fully implemented in the SDK
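The dispatch behind a tool like call_auth_admin_method can be pictured as a name-based lookup on an SDK client. Everything below (the FakeAdmin stand-in and its uid parameter) is hypothetical; the real tool validates parameters against the actual Supabase Python SDK:

```python
class FakeAdmin:
    """Hypothetical stand-in for the real Auth Admin client."""
    def get_user_by_id(self, uid):
        return {"id": uid}

def call_auth_admin_method(admin, method: str, params: dict):
    """Look up a named Auth Admin method on the client and invoke it
    with keyword arguments, rejecting unknown method names."""
    if not hasattr(admin, method):
        raise ValueError(f"Unknown Auth Admin method: {method}")
    return getattr(admin, method)(**params)
```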

Logs & Analytics

The server provides access to Supabase logs and analytics data, making it easier to monitor and troubleshoot your applications:

  • Available Tool: retrieve_logs - Access logs from any Supabase service

  • Log Collections:

  • postgres: Database server logs
  • api_gateway: API gateway requests
  • auth: Authentication events
  • postgrest: RESTful API service logs
  • pooler: Connection pooling logs
  • storage: Object storage operations
  • realtime: WebSocket subscription logs
  • edge_functions: Serverless function executions
  • cron: Scheduled job logs
  • pgbouncer: Connection pooler logs

  • Features: Filter by time, search text, apply field filters, or use custom SQL queries

Simplifies debugging across your Supabase stack without switching between interfaces or writing complex queries.

Automatic Versioning of Database Changes

"With great power comes great responsibility." While the execute_postgresql tool, coupled with the aptly named live_dangerously tool, provides a powerful and simple way to manage your Supabase database, it also means that dropping or modifying a table is one chat message away. To reduce the risk of irreversible changes, since v0.3.8 the server supports:

  • Automatic creation of migration scripts for all write & destructive SQL operations executed on the database
  • An improved safety mode of query execution, in which all queries are categorized as:
    • safe type: always allowed. Includes all read-only ops.
    • write type: requires write mode to be enabled by the user.
    • destructive type: requires write mode to be enabled by the user AND a 2-step confirmation of query execution for clients that do not execute tools automatically.

Universal Safety Mode

Since v0.3.8 Safety Mode has been standardized across all services (database, API, SDK) using a universal safety manager. This provides consistent risk management and a unified interface for controlling safety settings across the entire MCP server.

All operations (SQL queries, API requests, SDK methods) are categorized into risk levels:

  • Low risk: Read-only operations that don't modify data or structure (SELECT queries, GET API requests)
  • Medium risk: Write operations that modify data but not structure (INSERT/UPDATE/DELETE, most POST/PUT API requests)
  • High risk: Destructive operations that modify database structure or could cause data loss (DROP/TRUNCATE, DELETE API endpoints)
  • Extreme risk: Operations with severe consequences that are blocked entirely (deleting projects)

Safety controls are applied based on risk level:

  • Low risk operations are always allowed
  • Medium risk operations require unsafe mode to be enabled
  • High risk operations require unsafe mode AND explicit confirmation
  • Extreme risk operations are never allowed
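These gating rules fit in a few lines. A minimal sketch, assuming risk levels arrive as plain strings; an illustration of the policy, not the server's actual safety manager:

```python
def is_allowed(risk: str, unsafe_mode: bool, confirmed: bool = False) -> bool:
    """Apply the risk-level gating described above."""
    if risk == "low":
        return True                      # always allowed
    if risk == "medium":
        return unsafe_mode               # requires unsafe mode
    if risk == "high":
        return unsafe_mode and confirmed # unsafe mode + explicit confirmation
    return False                         # "extreme" is never allowed
```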

How confirmation flow works

Any high-risk operation (be it a PostgreSQL query or an API request) will be blocked even in unsafe mode. You will have to confirm and approve every high-risk operation explicitly in order for it to be executed.

Changelog

  • Simplified installation via package manager - ✅ (v0.2.0)
  • Support for different Supabase regions - ✅ (v0.2.2)
  • Programmatic access to Supabase Management API with safety controls - ✅ (v0.3.0)
  • Read and read-write database SQL queries with safety controls - ✅ (v0.3.0)
  • Robust transaction handling for both direct and pooled connections - ✅ (v0.3.2)
  • Support methods and objects available in native Python SDK - ✅ (v0.3.6)
  • Stronger SQL query validation - ✅ (v0.3.8)
  • Automatic versioning of database changes - ✅ (v0.3.8)
  • Radically improved knowledge and tools of API spec - ✅ (v0.3.8)
  • Improved consistency of migration-related tools for a more organized database VCS - ✅ (v0.3.10)

For a more detailed roadmap, please see this discussion on GitHub.

Star History

Star History Chart


Enjoy! ☺️

POST request with body:n   method: "POST"n   path: "/v1/projects/{ref}/functions"n   path_params: {}n   request_params: {}n   request_body: {"name": "test-function", "slug": "test-function"}nnSAFETY SYSTEM:nAPI operations are categorized by risk level:n- LOW RISK: Read operations (GET) - allowed in SAFE moden- MEDIUM/HIGH RISK: Write operations (POST, PUT, PATCH, DELETE) - require UNSAFE moden- EXTREME RISK: Destructive operations - require UNSAFE mode and confirmationn- BLOCKED: Some operations are completely blocked for safety reasonsnnSAFETY CONSIDERATIONS:n- By default, the API client starts in SAFE mode, allowing only read operationsn- To perform write operations, first use live_dangerously(service="api", enable=True)n- High-risk operations will be rejected with a confirmation IDn- Use confirm_destructive_operation with the provided ID after reviewing risksn- Some operations may be completely blocked for safety reasonsnnFor a complete list of available API endpoints and their parameters, use the get_management_api_spec tool.nFor details on safety rules, use the get_management_api_safety_rules tool.n",
    "inputSchema": {
      "properties": {
        "method": {
          "title": "Method",
          "type": "string"
        },
        "path": {
          "title": "Path",
          "type": "string"
        },
        "path_params": {
          "additionalProperties": {
            "type": "string"
          },
          "title": "Path Params",
          "type": "object"
        },
        "request_body": {
          "title": "Request Body",
          "type": "object"
        },
        "request_params": {
          "title": "Request Params",
          "type": "object"
        }
      },
      "required": [
        "method",
        "path",
        "path_params",
        "request_params",
        "request_body"
      ],
      "title": "send_management_api_requestArguments",
      "type": "object"
    },
    "name": "send_management_api_request"
  },
  {
    "description": "Get the complete Supabase Management API specification.nnReturns the full OpenAPI specification for the Supabase Management API, including:n- All available endpoints and operationsn- Required and optional parameters for each operationn- Request and response schemasn- Authentication requirementsn- Safety information for each operationnnThis tool can be used in four different ways:n1. Without parameters: Returns all domains (default)n2. With path and method: Returns the full specification for a specific API endpointn3. With domain only: Returns all paths and methods within that domainn4. With all_paths=True: Returns all paths and methodsnnParameters:n- params: Dictionary containing optional parameters:n    - path: Optional API path (e.g., "/v1/projects/{ref}/functions")n    - method: Optional HTTP method (e.g., "GET", "POST")n    - domain: Optional domain/tag name (e.g., "Auth", "Storage")n    - all_paths: Optional boolean, if True returns all paths and methodsnnAvailable domains:n- Analytics: Analytics-related endpointsn- Auth: Authentication and authorization endpointsn- Database: Database management endpointsn- Domains: Custom domain configuration endpointsn- Edge Functions: Serverless function management endpointsn- Environments: Environment configuration endpointsn- OAuth: OAuth integration endpointsn- Organizations: Organization management endpointsn- Projects: Project management endpointsn- Rest: RESTful API endpointsn- Secrets: Secret management endpointsn- Storage: Storage management endpointsnnThis specification is useful for understanding:n- What operations are available through the Management APIn- How to properly format requests for each endpointn- Which operations require unsafe moden- What data structures to expect in responsesnnSAFETY: This is a low-risk read operation that can be executed in SAFE mode.n",
    "inputSchema": {
      "properties": {
        "params": {
          "default": {},
          "title": "Params",
          "type": "object"
        }
      },
      "title": "get_management_api_specArguments",
      "type": "object"
    },
    "name": "get_management_api_spec"
  },
  {
    "description": "Get Python SDK methods specification for Auth Admin.nnReturns a comprehensive dictionary of all Auth Admin methods available in the Supabase Python SDK, including:n- Method names and descriptionsn- Required and optional parameters for each methodn- Parameter types and constraintsn- Return value informationnnThis tool is useful for exploring the capabilities of the Auth Admin SDK and understandingnhow to properly format parameters for the call_auth_admin_method tool.nnNo parameters required.n",
    "inputSchema": {
      "properties": {},
      "title": "get_auth_admin_methods_specArguments",
      "type": "object"
    },
    "name": "get_auth_admin_methods_spec"
  },
  {
    "description": "Call an Auth Admin method from Supabase Python SDK.nnThis tool provides a safe, validated interface to the Supabase Auth Admin SDK, allowing you to:n- Manage users (create, update, delete)n- List and search usersn- Generate authentication linksn- Manage multi-factor authenticationn- And morennIMPORTANT NOTES:n- Request bodies must adhere to the Python SDK specificationn- Some methods may have nested parameter structuresn- The tool validates all parameters against Pydantic modelsn- Extra fields not defined in the models will be rejectednnAVAILABLE METHODS:n- get_user_by_id: Retrieve a user by their IDn- list_users: List all users with paginationn- create_user: Create a new usern- delete_user: Delete a user by their IDn- invite_user_by_email: Send an invite link to a user's emailn- generate_link: Generate an email link for various authentication purposesn- update_user_by_id: Update user attributes by IDn- delete_factor: Delete a factor on a usernnEXAMPLES:n1. Get user by ID:n   method: "get_user_by_id"n   params: {"uid": "user-uuid-here"}nn2. Create user:n   method: "create_user"n   params: {n     "email": "[email protected]",n     "password": "secure-password"n   }nn3. Update user by ID:n   method: "update_user_by_id"n   params: {n     "uid": "user-uuid-here",n     "attributes": {n       "email": "[email protected]"n     }n   }nnFor complete documentation of all methods and their parameters, use the get_auth_admin_methods_spec tool.n",
    "inputSchema": {
      "properties": {
        "method": {
          "title": "Method",
          "type": "string"
        },
        "params": {
          "title": "Params",
          "type": "object"
        }
      },
      "required": [
        "method",
        "params"
      ],
      "title": "call_auth_admin_methodArguments",
      "type": "object"
    },
    "name": "call_auth_admin_method"
  },
  {
    "description": "Toggle unsafe mode for either Management API or Database operations.nnWHAT THIS TOOL DOES:nThis tool switches between safe (default) and unsafe operation modes for either the Management API or Database operations.nnSAFETY MODES EXPLAINED:n1. Database Safety Modes:n   - SAFE mode (default): Only low-risk operations like SELECT queries are allowedn   - UNSAFE mode: Higher-risk operations including INSERT, UPDATE, DELETE, and schema changes are permittednn2. API Safety Modes:n   - SAFE mode (default): Only low-risk operations that don't modify state are allowedn   - UNSAFE mode: Higher-risk state-changing operations are permitted (except those explicitly blocked for safety)nnOPERATION RISK LEVELS:nThe system categorizes operations by risk level:n- LOW: Safe read operations with minimal impactn- MEDIUM: Write operations that modify data but don't change structuren- HIGH: Operations that modify database structure or important system settingsn- EXTREME: Destructive operations that could cause data loss or service disruptionnnWHEN TO USE THIS TOOL:n- Use this tool BEFORE attempting write operations or schema changesn- Enable unsafe mode only when you need to perform data modificationsn- Always return to safe mode after completing write operationsnnUSAGE GUIDELINES:n- Start in safe mode by default for exploration and analysisn- Switch to unsafe mode only when you need to make changesn- Be specific about which service you're enabling unsafe mode forn- Consider the risks before enabling unsafe mode, especially for database operationsn- For database operations requiring schema changes, you'll need to enable unsafe mode firstnnParameters:n- service: Which service to toggle ("api" or "database")n- enable_unsafe_mode: True to enable unsafe mode, False for safe mode (default: False)nnExamples:n1. Enable database unsafe mode:n   live_dangerously(service="database", enable_unsafe_mode=True)nn2. 
Return to safe mode after operations:n   live_dangerously(service="database", enable_unsafe_mode=False)nn3. Enable API unsafe mode:n   live_dangerously(service="api", enable_unsafe_mode=True)nnNote: This tool affects ALL subsequent operations for the specified service until changed again.n",
    "inputSchema": {
      "properties": {
        "enable_unsafe_mode": {
          "default": false,
          "title": "Enable Unsafe Mode",
          "type": "boolean"
        },
        "service": {
          "enum": [
            "api",
            "database"
          ],
          "title": "Service",
          "type": "string"
        }
      },
      "required": [
        "service"
      ],
      "title": "live_dangerouslyArguments",
      "type": "object"
    },
    "name": "live_dangerously"
  },
  {
    "description": "Execute a destructive database or API operation after confirmation. Use this only after reviewing the risks with the user.nnHOW IT WORKS:n- This tool executes a previously rejected high-risk operation using its confirmation IDn- The operation will be exactly the same as the one that generated the IDn- No need to retype the query or api request params - the system remembers itnnSTEPS:n1. Explain the risks to the user and get their approvaln2. Use this tool with the confirmation ID from the error messagen3. The original query will be executed as-isnnPARAMETERS:n- operation_type: Type of operation ("api" or "database")n- confirmation_id: The ID provided in the error message (required)n- user_confirmation: Set to true to confirm execution (default: false)nnNOTE: Confirmation IDs expire after 5 minutes for securityn",
    "inputSchema": {
      "properties": {
        "confirmation_id": {
          "title": "Confirmation Id",
          "type": "string"
        },
        "operation_type": {
          "enum": [
            "api",
            "database"
          ],
          "title": "Operation Type",
          "type": "string"
        },
        "user_confirmation": {
          "default": false,
          "title": "User Confirmation",
          "type": "boolean"
        }
      },
      "required": [
        "operation_type",
        "confirmation_id"
      ],
      "title": "confirm_destructive_operationArguments",
      "type": "object"
    },
    "name": "confirm_destructive_operation"
  }
]
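Each tool above publishes an `inputSchema` that clients are expected to satisfy before calling the tool. As an illustration of how a client might sanity-check arguments against one of these schemas, here is a minimal Python sketch. The `SCHEMA` dict is copied from the `send_management_api_request` entry above; `validate_args` and `TYPE_MAP` are hypothetical helpers for this sketch, not part of this server or any MCP SDK.

```python
# Minimal, hypothetical client-side validator for the send_management_api_request
# input schema shown above. Standard library only; validate_args is an
# illustrative helper, not part of supabase-mcp-server.

SCHEMA = {
    "required": ["method", "path", "path_params", "request_params", "request_body"],
    "properties": {
        "method": {"type": "string"},
        "path": {"type": "string"},
        "path_params": {"type": "object"},
        "request_params": {"type": "object"},
        "request_body": {"type": "object"},
    },
}

# JSON Schema type names mapped to their Python equivalents
TYPE_MAP = {"string": str, "object": dict, "boolean": bool, "integer": int}


def validate_args(args: dict, schema: dict) -> list:
    """Return a list of validation errors; an empty list means args are valid."""
    errors = []
    for key in schema.get("required", []):
        if key not in args:
            errors.append(f"missing required field: {key}")
    for key, value in args.items():
        prop = schema.get("properties", {}).get(key)
        if prop is not None and not isinstance(value, TYPE_MAP[prop["type"]]):
            errors.append(f"{key}: expected {prop['type']}")
    return errors


# The GET example from the tool description above, expressed as an argument dict
args = {
    "method": "GET",
    "path": "/v1/projects/{ref}/functions/{function_slug}",
    "path_params": {"function_slug": "my-function"},
    "request_params": {"version": "1"},
    "request_body": {},
}
print(validate_args(args, SCHEMA))  # prints [] - all checks pass
```

Validating on the client side before sending the request lets an IDE surface schema errors immediately instead of waiting for the server to reject the call.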