BigQuery MCP Server Cursor IDE Setup 2026: Query Your Data Warehouse with AI
Connect Google BigQuery to Cursor IDE using MCP. Run SQL, explore datasets, and analyze billions of rows from Composer — full setup guide with IAM config.
BigQuery holds the kind of data that shapes engineering decisions — event logs, product analytics, financial records, and more. Connecting it to Cursor via MCP means you can ask natural language questions against your data warehouse and get SQL results without leaving your editor. This guide covers the full setup.
What This Integration Enables
With BigQuery MCP running in Cursor, you can:
- List datasets and describe table schemas in natural language.
- Run SQL queries and see results directly in Composer.
- Sample and validate data while debugging pipelines or migrations.
- Have Cursor write BigQuery SQL from plain-English business questions.
Prerequisites
- Cursor IDE installed.
- A Google Cloud project with BigQuery enabled.
- The gcloud CLI (for Application Default Credentials) or a service account key.
- Node.js with npx (or Python, if you prefer the pip-based server).
Step 1: Set Up Google Cloud Authentication
The BigQuery MCP server uses Application Default Credentials (ADC). The easiest path:
gcloud auth application-default login
This opens a browser and stores credentials at ~/.config/gcloud/application_default_credentials.json. The MCP server picks these up automatically.
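Before wiring up Cursor, it's worth sanity-checking that ADC and project access work from the terminal. This sketch assumes the gcloud SDK is installed (the bq CLI ships with it):

```shell
# Print an access token — an error here means ADC is not set up
gcloud auth application-default print-access-token > /dev/null && echo "ADC OK"

# Confirm which project gcloud defaults to
gcloud config get-value project

# List BigQuery datasets in that project
bq ls
```

If `bq ls` returns your datasets, the MCP server will be able to reach them with the same credentials.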
Alternatively, create a service account:
1. In Google Cloud Console, go to IAM & Admin → Service Accounts.
2. Click Create Service Account → name it "cursor-mcp".
3. Grant the role BigQuery Data Viewer for read-only access (plus BigQuery Job User so the account can run query jobs), or BigQuery User + BigQuery Data Editor for queries + writes.
4. Click Done, then open the account → Keys → Add Key → JSON.
5. Download the JSON key file and store it securely.
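If you prefer the terminal, the same service account can be created with the gcloud CLI. A sketch, assuming your project ID is `your-gcp-project-id` and read-only access:

```shell
# Create the service account
gcloud iam service-accounts create cursor-mcp --display-name="Cursor MCP"

# Grant read-only BigQuery access (swap in roles/bigquery.user and
# roles/bigquery.dataEditor if you need to run jobs and write data)
gcloud projects add-iam-policy-binding your-gcp-project-id \
  --member="serviceAccount:cursor-mcp@your-gcp-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"

# Generate and download a JSON key
gcloud iam service-accounts keys create ~/cursor-mcp-key.json \
  --iam-account="cursor-mcp@your-gcp-project-id.iam.gserviceaccount.com"
```

The key path (`~/cursor-mcp-key.json` here) is what goes into `GOOGLE_APPLICATION_CREDENTIALS` in Step 3.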
Step 2: Install the BigQuery MCP Server
The most widely used option is the community @dataengineeringwithalex/mcp-bigquery server:
npm install -g @dataengineeringwithalex/mcp-bigquery
Or run it with npx (no global install):
npx @dataengineeringwithalex/mcp-bigquery --help
Alternatively, there's a Python-based option:
pip install mcp-server-bigquery
Step 3: Configure Cursor
Edit ~/.cursor/mcp.json (macOS/Linux) or %USERPROFILE%\.cursor\mcp.json (Windows):
Using ADC (recommended):
{
  "mcpServers": {
    "bigquery": {
      "command": "npx",
      "args": ["-y", "@dataengineeringwithalex/mcp-bigquery"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "your-gcp-project-id"
      }
    }
  }
}
Using a service account key:
{
  "mcpServers": {
    "bigquery": {
      "command": "npx",
      "args": ["-y", "@dataengineeringwithalex/mcp-bigquery"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "your-gcp-project-id",
        "GOOGLE_APPLICATION_CREDENTIALS": "/absolute/path/to/service-account-key.json"
      }
    }
  }
}
Replace your-gcp-project-id with your actual GCP project ID (visible in the Cloud Console header).
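A malformed mcp.json silently prevents the server from loading, so it's worth validating the file after editing. Any JSON checker works; python3's built-in json.tool is used here for convenience:

```shell
# Exits non-zero and prints the offending position if the JSON is invalid
python3 -m json.tool ~/.cursor/mcp.json > /dev/null && echo "mcp.json is valid JSON"
```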
Step 4: Restart and Verify
Quit Cursor completely and reopen it. Go to Settings → Features → MCP and confirm the "bigquery" server appears with a green status.
In Composer, test with:
List the datasets available in my BigQuery project
If you get a list of datasets back, you're connected.
Real-World Use Cases
Exploratory Data Analysis
You're working on a feature that reads from a BigQuery table but you're not sure of the schema:
Describe the schema for the events.user_sessions table in BigQuery
Show me a sample of 10 rows from events.user_sessions
where created_at is in the last 7 days
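The same exploration can be cross-checked outside Cursor with the bq CLI (dataset and table names here match the example prompts):

```shell
# Dump the table schema as JSON
bq show --schema --format=prettyjson events.user_sessions

# Preview the first 10 rows without incurring query costs
# (bq head reads stored rows directly; it doesn't apply a WHERE filter)
bq head -n 10 events.user_sessions
```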
Debugging a Data Pipeline
Your dbt model is producing unexpected output. Ask Cursor:
Query BigQuery: SELECT COUNT(*) as total, status, COUNT(DISTINCT user_id)
FROM analytics.orders
WHERE created_at >= '2026-04-01'
GROUP BY status
ORDER BY total DESC
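If you want to double-check what the MCP server returns, the same query runs verbatim through the bq CLI:

```shell
bq query --use_legacy_sql=false '
SELECT COUNT(*) AS total, status, COUNT(DISTINCT user_id) AS unique_users
FROM analytics.orders
WHERE created_at >= "2026-04-01"
GROUP BY status
ORDER BY total DESC'
```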
Validating a Migration
Before deploying a schema change:
Check how many rows in bigquery table analytics.events
have a null value in the session_id column
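The null check above boils down to a single aggregate, which can also run from the terminal as a sanity pass before deploying:

```shell
# COUNTIF counts rows where the condition holds (BigQuery standard SQL)
bq query --use_legacy_sql=false '
SELECT COUNTIF(session_id IS NULL) AS null_sessions, COUNT(*) AS total_rows
FROM analytics.events'
```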
Writing Queries Based on Business Questions
Write a BigQuery SQL query that shows me daily active users
for the past 30 days from the analytics.events table,
using the user_id field and event_date timestamp
Cursor writes the SQL, then you can run it directly in the MCP interface.
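For a prompt like this, the generated SQL will look roughly like the following — a sketch, since the exact column types in analytics.events may differ:

```shell
bq query --use_legacy_sql=false '
SELECT DATE(event_date) AS day, COUNT(DISTINCT user_id) AS daily_active_users
FROM analytics.events
WHERE event_date >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY day
ORDER BY day'
```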
Troubleshooting
Authentication errors
- Run gcloud auth application-default print-access-token — if it errors, re-run gcloud auth application-default login.
- Check that GOOGLE_CLOUD_PROJECT matches exactly what's in your GCP console — it's case-sensitive.
"Access Denied" on specific tables
- Confirm your account (or service account) holds the required role — e.g., BigQuery Data Viewer — on the dataset or table in question.
Query timeout
- Add "BIGQUERY_TIMEOUT": "60" to your env block.
- Add a LIMIT clause to your prompt to reduce the data scanned.
Server starts but no tools appear
- Some versions read GCLOUD_PROJECT instead of GOOGLE_CLOUD_PROJECT. Try both.
- Run npx @dataengineeringwithalex/mcp-bigquery directly in a terminal and check for startup errors.
Results are truncated
- Add LIMIT 100 explicitly in your prompts for exploratory queries.
Cost Awareness
BigQuery charges per bytes scanned. When running queries through MCP:
- Use SELECT specific_columns instead of SELECT * to reduce bytes scanned.
- Run SELECT COUNT(*) before running wide queries.
A well-structured AI query should cost less than $0.01. Unfiltered table scans on terabyte datasets can be expensive.
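Before running a potentially expensive query, a dry run reports how many bytes it would scan without executing it or incurring charges:

```shell
# --dry_run validates the query and prints the bytes it would process
bq query --use_legacy_sql=false --dry_run '
SELECT user_id, event_date
FROM analytics.events
WHERE event_date >= "2026-04-01"'
```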
Using Multiple GCP Projects
If you work across multiple GCP projects, configure separate MCP servers:
{
  "mcpServers": {
    "bigquery-prod": {
      "command": "npx",
      "args": ["-y", "@dataengineeringwithalex/mcp-bigquery"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "my-prod-project"
      }
    },
    "bigquery-dev": {
      "command": "npx",
      "args": ["-y", "@dataengineeringwithalex/mcp-bigquery"],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "my-dev-project"
      }
    }
  }
}
Then reference them explicitly: "Query bigquery-dev for the orders table schema."
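The bq CLI accepts a --project_id flag, which is handy for confirming each configured project is reachable before testing in Composer:

```shell
# List datasets in each project the MCP servers point at
bq --project_id=my-prod-project ls
bq --project_id=my-dev-project ls
```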
Security
- Don't commit your mcp.json — add it to .gitignore, especially if it references a service account key path.
- Stick with read-only access (BigQuery Data Viewer) unless you need to write or create tables.
Combining with Other MCP Servers
BigQuery pairs naturally with other MCP servers in your stack. See the full MCP server directory to expand your AI-powered data workflow.