
PostgreSQL MCP Server: Connect Your AI to Any Database (2026)

Set up the PostgreSQL MCP server in Cursor or Windsurf and let your AI query tables, write SQL, explain schemas, and generate migrations — without copy-pasting data. Full setup guide for 2026.

By Web MCP Guide · March 23, 2026 · 8 min read



One of the most powerful — and underused — MCP use cases is giving your AI direct read access to your database. Instead of copying table schemas into your chat window, then pasting query results back in, you configure an MCP server once and your AI can explore your schema, write SQL, explain relationships, and debug queries autonomously.

This guide covers setting up the PostgreSQL MCP server in Cursor or Windsurf, configuring permissions safely, and practical examples of what it unlocks.

---

What the PostgreSQL MCP Server Does

The @modelcontextprotocol/server-postgres package exposes several MCP tools:

  • query — execute read-only SQL queries against your database

  • list_tables — enumerate all tables in a schema

  • describe_table — get column names, types, constraints for a specific table

  • list_schemas — list available schemas in the database

When connected, your AI can independently explore your database to answer questions like:

  • "Which users signed up in the last 30 days and haven't completed onboarding?"

  • "Write a migration to add an archived_at column to the posts table"

  • "Why is this query slow? Show me the table definition"

  • "How many orders per day last week? Generate a chart-ready dataset"

---

    Prerequisites


  • Node.js 18+ installed (node --version to check)

  • A running PostgreSQL instance (local or remote)

  • Your database connection string (format: postgresql://user:password@host:5432/dbname)

  • Cursor 0.40+ or Windsurf 1.0+

    ---

    Step 1: Install the PostgreSQL MCP Server

    The official server from Anthropic's MCP team:

    npx -y @modelcontextprotocol/server-postgres --help

    This verifies it's accessible. No global install required — npx handles it on first run.

    For faster repeated starts, install globally:

    npm install -g @modelcontextprotocol/server-postgres

    ---

    Step 2: Add to Your MCP Config

    In Cursor

    Edit ~/.cursor/mcp.json (global) or .cursor/mcp.json (project-level):

    {
      "mcpServers": {
        "postgres": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-postgres",
            "postgresql://myuser:mypassword@localhost:5432/mydb"
          ]
        }
      }
    }

    Restart Cursor. Check Settings → MCP — you should see postgres listed with a green indicator.

    In Windsurf

    Open Settings → MCP Servers → Add Server and fill in:

  • Name: postgres

  • Command: npx

  • Args: -y @modelcontextprotocol/server-postgres postgresql://myuser:mypassword@localhost:5432/mydb

    Or edit ~/.codeium/windsurf/mcp_config.json directly:

    {
      "mcpServers": {
        "postgres": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-postgres",
            "postgresql://myuser:mypassword@localhost:5432/mydb"
          ]
        }
      }
    }

    ---

    Step 3: Use Environment Variables (Recommended)

    Never hardcode database credentials in your config files, especially if you commit .cursor/mcp.json to version control. Use environment variables instead:

    {
      "mcpServers": {
        "postgres": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-postgres"],
          "env": {
            "DATABASE_URL": "postgresql://myuser:mypassword@localhost:5432/mydb"
          }
        }
      }
    }

    The env block passes DATABASE_URL to the server's environment, which keeps the credential out of the command-line args (and out of process listings). Note that the value still lives in the config file itself, so anyone who can read the file can read the secret; to keep it out of the config entirely, use the wrapper below.

    For project-level secrets, keep a .env file at your project root and source it in the args (POSIX sh spells source as .):

    {
      "mcpServers": {
        "postgres-dev": {
          "command": "sh",
          "args": ["-c", ". ./.env && npx -y @modelcontextprotocol/server-postgres \"$DATABASE_URL\""]
        }
      }
    }

    ---

    Step 4: Create a Read-Only Database User (Strongly Recommended)

    Before connecting your AI to any production database, create a dedicated read-only user:

    -- Create read-only user
    CREATE USER mcp_reader WITH PASSWORD 'secure_random_password';

    -- Grant connection to database
    GRANT CONNECT ON DATABASE your_database TO mcp_reader;

    -- Grant usage on schema
    GRANT USAGE ON SCHEMA public TO mcp_reader;

    -- Grant SELECT on all existing tables
    GRANT SELECT ON ALL TABLES IN SCHEMA public TO mcp_reader;

    -- Grant SELECT on future tables
    ALTER DEFAULT PRIVILEGES IN SCHEMA public
    GRANT SELECT ON TABLES TO mcp_reader;

    Use mcp_reader's credentials in your MCP config. This ensures:

  • The AI can never modify your data via MCP

  • Accidental write queries fail gracefully

  • You can audit MCP activity separately in PostgreSQL logs
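
    You can verify the lockdown from a psql session as mcp_reader: reads should succeed and any write should fail. A quick check, where your_table is a placeholder for any table in your schema:

    -- Run as mcp_reader to confirm the grants behave as intended
    SELECT count(*) FROM your_table;   -- succeeds
    DELETE FROM your_table;            -- fails: permission denied for table your_table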

    ---

    Step 5: Multiple Database Configurations

    You can run multiple postgres MCP servers simultaneously — one for dev, one for staging, one for analytics:

    {
      "mcpServers": {
        "postgres-dev": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-postgres"],
          "env": {
            "DATABASE_URL": "postgresql://user:pass@localhost:5432/myapp_dev"
          }
        },
        "postgres-analytics": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-postgres"],
          "env": {
            "DATABASE_URL": "postgresql://readonly:pass@analytics-host:5432/warehouse"
          }
        }
      }
    }

    When prompting your AI, specify which connection to use: "Using postgres-analytics, find the top 10 products by revenue last month."

    ---

    Practical Examples

    Once configured, here are real prompts that work:

    Schema Exploration

    Describe all tables in this database. What are the main entities and how do they relate?

    The AI will call list_tables then describe_table on each one, building a mental model of your data architecture.
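
    If you want to spot-check what the AI sees, the same metadata is available by hand through information_schema; this is roughly the information list_tables and describe_table surface ('users' is a placeholder):

    SELECT table_name
    FROM information_schema.tables
    WHERE table_schema = 'public';

    SELECT column_name, data_type, is_nullable
    FROM information_schema.columns
    WHERE table_schema = 'public' AND table_name = 'users';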

    Writing Complex Queries

    I need a report: monthly active users (at least 1 login in the period) 
    for the last 12 months. Include month, count, and % change from prior month.

    The AI queries your schema, identifies the right tables and columns, writes the SQL, and executes it to verify the results.
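
    The finished query usually looks something like this sketch, which assumes a logins table with user_id and created_at columns:

    -- Monthly active users with month-over-month % change
    -- (logins(user_id, created_at) is an assumed schema)
    WITH monthly AS (
      SELECT date_trunc('month', created_at) AS month,
             count(DISTINCT user_id) AS mau
      FROM logins
      WHERE created_at >= date_trunc('month', now()) - interval '12 months'
      GROUP BY 1
    )
    SELECT month,
           mau,
           round(100.0 * (mau - lag(mau) OVER (ORDER BY month))
                 / NULLIF(lag(mau) OVER (ORDER BY month), 0), 1) AS pct_change
    FROM monthly
    ORDER BY month;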

    Migration Generation

    Look at the users table. I need to add soft deletes (deleted_at timestamp) 
    and an index on email + deleted_at. Write the Alembic migration.
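
    The SQL underneath that migration is short; the Alembic wrapper is omitted here, and the index name is an assumption:

    -- Soft-delete column plus the requested composite index
    ALTER TABLE users ADD COLUMN deleted_at timestamptz;
    CREATE INDEX idx_users_email_deleted_at ON users (email, deleted_at);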

    Debugging Performance

    This query is taking 8 seconds: [paste slow query]
    Look at the table definitions and tell me what indexes would help.

    The AI can examine the table structure and suggest targeted index additions.
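
    You can run the same diagnosis by hand with an execution plan; the join below is only a stand-in for your slow query:

    -- EXPLAIN ANALYZE executes the query once and reports actual timings
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT o.*
    FROM orders o
    JOIN users u ON u.id = o.user_id
    WHERE u.email = 'someone@example.com';   -- placeholder for the slow query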

    Data Validation

    Check the orders table for data quality issues: null values in required fields, 
    duplicate order IDs, and orders with total_amount < 0.
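
    These checks map to straightforward SQL. A sketch, assuming orders has customer_id, order_id, and total_amount columns:

    -- Nulls in a required field
    SELECT count(*) FROM orders WHERE customer_id IS NULL;

    -- Duplicate order IDs
    SELECT order_id, count(*)
    FROM orders
    GROUP BY order_id
    HAVING count(*) > 1;

    -- Negative totals
    SELECT count(*) FROM orders WHERE total_amount < 0;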

    ---

    Connecting to Remote Databases

    For databases hosted on AWS RDS, Supabase, PlanetScale (PostgreSQL-compatible), or similar:

    {
      "mcpServers": {
        "supabase": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-postgres"],
          "env": {
            "DATABASE_URL": "postgresql://postgres.xxxx:your_password@aws-0-us-east-1.pooler.supabase.com:5432/postgres"
          }
        }
      }
    }

    Supabase note: Use the Session Mode connection string (port 5432) rather than Transaction Mode (port 6543) when using MCP, as the server maintains a persistent connection.

    AWS RDS: Ensure your security group allows inbound connections from your local IP on port 5432.

    Neon: Use the non-pooling connection string for MCP to avoid prepared statement conflicts.

    ---

    SSL Connections

    For databases requiring SSL:

    {
      "mcpServers": {
        "postgres-ssl": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-postgres",
            "postgresql://user:pass@host:5432/db?sslmode=require"
          ]
        }
      }
    }

    Common SSL modes:

  • sslmode=require — require SSL but don't verify certificate

  • sslmode=verify-full — require SSL and verify the server certificate (most secure)

  • sslmode=disable — no SSL (local dev only)

    ---

    Troubleshooting Common Issues

    "Connection refused" error

    Check that PostgreSQL is running and accepting connections on the specified port:

    psql "postgresql://user:pass@localhost:5432/dbname" -c "SELECT 1"

    If this fails, the issue is the connection string, not MCP.

    "Permission denied for table"

    Your database user doesn't have SELECT permission on that table. Run:

    GRANT SELECT ON TABLE your_table TO your_mcp_user;

    Server not appearing in IDE after config change

    Cursor: Full restart (quit completely, reopen).
    Windsurf: Reload window (Cmd+Shift+P → Reload Window) or restart.

    Queries timing out

    Large table scans can hit MCP's default timeout. Add a statement timeout to your connection string:

    postgresql://user:pass@host:5432/db?options=-c%20statement_timeout%3D30000

    This sets a 30-second query timeout.

    SSL certificate errors

    Add ?sslmode=require to your connection string, or ?sslmode=no-verify for self-signed certificates in development.

    ---

    Alternative PostgreSQL MCP Servers

    The official @modelcontextprotocol/server-postgres is solid, but there are alternatives worth knowing:

    mcp-server-postgres (community) — adds write capabilities (INSERT/UPDATE/DELETE) behind a confirmation step. Use only in development.

    @supabase/mcp-server-supabase — Supabase's official MCP server. Connects via your Supabase project URL and service role key. Includes table management, Edge Function deployment, and project settings beyond raw SQL.

    @neondatabase/mcp-server-neon — Neon's MCP server with branching support. Create a database branch for every AI session so experiments don't affect production.

    ---

    Security Best Practices

    1. Always use a read-only user for MCP connections, even in development — habits matter.
    2. Never commit credentials to version control. Use environment variables.
    3. Restrict schema access — if the AI only needs the public schema, don't grant access to pg_catalog, information_schema, or internal schemas.
    4. Enable query logging in PostgreSQL to audit what the AI is querying: log_statement = 'all' in postgresql.conf.
    5. Set statement_timeout on the MCP user to prevent runaway queries (see the snippet after this list).
    6. Use connection pooling (PgBouncer) in front of production databases to limit concurrent MCP connections.
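
    Items 4 and 5 can be scoped to the MCP user alone rather than the whole server, which keeps logs quiet and limits the blast radius (mcp_reader is the user from Step 4):

    -- Log every statement issued by the MCP user (setting this per role requires superuser)
    ALTER ROLE mcp_reader SET log_statement = 'all';

    -- Cap query runtime for the MCP user; new sessions pick this up automatically
    ALTER ROLE mcp_reader SET statement_timeout = '30s';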

    ---

    What to Do Next

    Once your PostgreSQL MCP server is running, try these setups:

  • Add a GitHub MCP server alongside it — now your AI can correlate code changes with database schema changes

  • Add a Jira MCP server — ask "which user stories in this sprint touch the subscriptions table?"

  • Use the filesystem MCP server to give the AI access to your migration files alongside live schema data

    The combination of code context + live database access is where AI-assisted development starts to feel genuinely superhuman.

    ---

    Want to connect other data sources? See our guides for Supabase MCP setup, AWS MCP in Cursor, and BigQuery MCP setup.