PostgreSQL MCP Server
A Model Context Protocol server that enables powerful PostgreSQL database management capabilities including analysis, schema management, data migration, and monitoring through natural language interactions.
A Model Context Protocol (MCP) server that provides PostgreSQL database management capabilities. This server assists with analyzing existing PostgreSQL setups, providing implementation guidance, debugging database issues, managing schemas, migrating data, and monitoring database performance.
analyze_database
Analyzes PostgreSQL database configuration and performance metrics:
- Configuration analysis
- Performance metrics
- Security assessment
- Recommendations for optimization
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"analysisType": "performance" // Optional: "configuration" | "performance" | "security"
}
get_setup_instructions
Provides step-by-step PostgreSQL installation and configuration guidance:
- Platform-specific installation steps
- Configuration recommendations
- Security best practices
- Post-installation tasks
// Example usage
{
"platform": "linux", // Required: "linux" | "macos" | "windows"
"version": "15", // Optional: PostgreSQL version
"useCase": "production" // Optional: "development" | "production"
}
debug_database
Debugs common PostgreSQL issues:
- Connection problems
- Performance bottlenecks
- Lock conflicts
- Replication status
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"issue": "performance", // Required: "connection" | "performance" | "locks" | "replication"
"logLevel": "debug" // Optional: "info" | "debug" | "trace"
}
get_schema_info
Gets detailed schema information for a database or a specific table:
- List of tables in a database
- Column definitions
- Constraints (primary keys, foreign keys, etc.)
- Indexes
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users" // Optional: specific table to get info for
}
create_table
Creates a new table with specified columns:
- Define column names and types
- Set nullable constraints
- Set default values
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"columns": [
{ "name": "id", "type": "SERIAL", "nullable": false },
{ "name": "username", "type": "VARCHAR(100)", "nullable": false },
{ "name": "email", "type": "VARCHAR(255)", "nullable": false },
{ "name": "created_at", "type": "TIMESTAMP", "default": "NOW()" }
]
}
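For orientation, a request like the one above maps naturally onto a single CREATE TABLE statement. The interface and helper below are an illustrative sketch of that mapping, not the server's actual implementation:

// Illustrative sketch only: how a columns array like the one above could be
// rendered as DDL. ColumnDef and buildCreateTable are assumptions made for
// this example, not part of this server's code.
interface ColumnDef {
  name: string;
  type: string;
  nullable?: boolean;
  default?: string;
}

function buildCreateTable(tableName: string, columns: ColumnDef[]): string {
  const cols = columns.map((c) => {
    let def = `${c.name} ${c.type}`;
    if (c.nullable === false) def += " NOT NULL";
    if (c.default !== undefined) def += ` DEFAULT ${c.default}`;
    return def;
  });
  return `CREATE TABLE ${tableName} (\n  ${cols.join(",\n  ")}\n);`;
}

// For the example request this yields roughly:
// CREATE TABLE users (
//   id SERIAL NOT NULL,
//   username VARCHAR(100) NOT NULL,
//   email VARCHAR(255) NOT NULL,
//   created_at TIMESTAMP DEFAULT NOW()
// );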
alter_table
Modifies existing tables:
- Add new columns
- Modify column types or constraints
- Drop columns
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"operations": [
{ "type": "add", "columnName": "last_login", "dataType": "TIMESTAMP" },
{ "type": "alter", "columnName": "email", "nullable": false },
{ "type": "drop", "columnName": "temporary_field" }
]
}
export_table_data
Exports table data to JSON or CSV format:
- Filter data with a WHERE clause
- Limit the number of rows
- Choose the output format
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"outputPath": "./exports/users.json",
"where": "created_at > '2023-01-01'", // Optional
"limit": 1000, // Optional
"format": "json" // Optional: "json" | "csv"
}
import_table_data
Imports data from JSON or CSV files:
- Optionally truncate the table before import
- Support for different formats
- Custom CSV delimiters
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"inputPath": "./imports/users.json",
"truncateFirst": false, // Optional
"format": "json", // Optional: "json" | "csv"
"delimiter": "," // Optional: for CSV files
}
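The expected file layout is not spelled out above; a reasonable assumption for JSON input is an array of row objects keyed by column name. The snippet below prepares such a file and is purely illustrative:

// Hypothetical: write ./imports/users.json as an array of row objects keyed
// by column name. The exact format expected by import_table_data is an
// assumption here; verify against your own data before importing.
import { writeFileSync } from "node:fs";

const rows = [
  { username: "alice", email: "alice@example.com" },
  { username: "bob", email: "bob@example.com" },
];

writeFileSync("./imports/users.json", JSON.stringify(rows, null, 2));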
copy_between_databases
Copies data between two PostgreSQL databases:
- Filter data with a WHERE clause
- Optionally truncate the target table
// Example usage
{
"sourceConnectionString": "postgresql://user:password@localhost:5432/source_db",
"targetConnectionString": "postgresql://user:password@localhost:5432/target_db",
"tableName": "users",
"where": "active = true", // Optional
"truncateTarget": false // Optional
}
monitor_database
Provides real-time monitoring of a PostgreSQL database:
- Database metrics (connections, cache hit ratio, etc.)
- Table metrics (size, row counts, dead tuples)
- Active query information
- Lock information
- Replication status
- Configurable alerts
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"includeTables": true, // Optional
"includeQueries": true, // Optional
"includeLocks": true, // Optional
"includeReplication": false, // Optional
"alertThresholds": { // Optional
"connectionPercentage": 80,
"longRunningQuerySeconds": 30,
"cacheHitRatio": 0.95,
"deadTuplesPercentage": 10,
"vacuumAge": 7
}
}
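The threshold fields are not self-describing. One plausible reading, sketched below, is that alerts fire when usage-style metrics rise above their threshold and when the cache hit ratio falls below it; the metric names and comparison directions are assumptions, not the server's documented behavior:

// Illustrative only: one plausible interpretation of alertThresholds.
const thresholds = {
  connectionPercentage: 80,     // alert when connection usage exceeds 80%
  longRunningQuerySeconds: 30,  // alert when a query has run for more than 30 s
  cacheHitRatio: 0.95,          // alert when the cache hit ratio drops below 0.95
  deadTuplesPercentage: 10,     // alert when dead tuples exceed 10% of a table
  vacuumAge: 7,                 // alert when a table has gone 7+ days without vacuum
};

interface Metrics {
  connectionPercentage: number;
  longestQuerySeconds: number;
  cacheHitRatio: number;
  deadTuplesPercentage: number;
  daysSinceLastVacuum: number;
}

function evaluateAlerts(m: Metrics): string[] {
  const alerts: string[] = [];
  if (m.connectionPercentage > thresholds.connectionPercentage) alerts.push("high connection usage");
  if (m.longestQuerySeconds > thresholds.longRunningQuerySeconds) alerts.push("long-running query");
  if (m.cacheHitRatio < thresholds.cacheHitRatio) alerts.push("low cache hit ratio");
  if (m.deadTuplesPercentage > thresholds.deadTuplesPercentage) alerts.push("excessive dead tuples");
  if (m.daysSinceLastVacuum > thresholds.vacuumAge) alerts.push("vacuum overdue");
  return alerts;
}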
Install dependencies and build the server:

npm install
npm run build

Then add the server to your MCP settings configuration:
{
"mcpServers": {
"postgresql-mcp": {
"command": "node",
"args": ["/path/to/postgresql-mcp-server/build/index.js"],
"disabled": false,
"alwaysAllow": []
}
}
}
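In normal use, an MCP-aware client (such as an AI assistant) calls these tools for you. For a programmatic smoke test, something along the following lines should work with the official TypeScript MCP SDK; treat it as a sketch, since import paths and the callTool signature can vary between SDK versions:

// Sketch of a direct client connection using @modelcontextprotocol/sdk
// (an assumption about the SDK's current API; adjust to your SDK version).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio, mirroring the settings entry above.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/postgresql-mcp-server/build/index.js"],
});

const client = new Client({ name: "smoke-test", version: "0.1.0" });
await client.connect(transport);

// Ask the server for schema information about a single table.
const result = await client.callTool({
  name: "get_schema_info",
  arguments: {
    connectionString: "postgresql://user:password@localhost:5432/dbname",
    tableName: "users",
  },
});

console.log(JSON.stringify(result.content, null, 2));
await client.close();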
Development commands:

npm run dev - Start the development server with hot reload
npm run lint - Run ESLint
npm test - Run tests

Security features:
- Supports SSL/TLS connections (see the example below)
- Query safety
- Logs all operations
- Authentication
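For TLS, the connection strings accepted by the tools can carry libpq-style SSL parameters, assuming the server passes the string through unchanged to the underlying PostgreSQL driver (an assumption worth verifying for your setup):

// Hypothetical example: requesting an SSL/TLS connection via the
// connection string (assumes libpq-style sslmode parameters are honored).
const connectionString =
  "postgresql://user:password@db.example.com:5432/dbname?sslmode=require";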
The server implements comprehensive error handling:
- Connection failures
- Query timeouts
- Authentication errors
- Permission issues
- Resource constraints
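From a client's point of view, these failures typically surface as a tool result flagged as an error rather than a thrown exception. A minimal check, assuming the standard MCP tool-result shape with an isError flag, might look like this:

// Minimal sketch, assuming the standard MCP tool-result shape ({ content, isError? }).
// The client parameter is a connected Client like the one in the setup sketch above.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function runConnectionDebug(client: Client, connectionString: string) {
  const result = await client.callTool({
    name: "debug_database",
    arguments: { connectionString, issue: "connection" },
  });

  if (result.isError) {
    console.error("Tool reported an error:", result.content);
  } else {
    console.log("Diagnostics:", result.content);
  }
}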
This project is licensed under the AGPLv3 License - see LICENSE file for details.