
Continue.dev MCP Server Setup Guide (2026): Add MCP Tools to the Open-Source AI Coding Assistant

Complete guide to connecting MCP servers in Continue.dev — the open-source GitHub Copilot alternative. Works in VS Code and JetBrains. Covers config format, best servers, and troubleshooting.

By WebMCPGuide Team · March 26, 2026 · 10 min read


Continue.dev is the most popular open-source AI coding assistant — used by developers who want full control over their AI stack without a GitHub Copilot or Cursor subscription. It runs in VS Code and all JetBrains IDEs, supports any LLM backend (Claude, GPT-4, Ollama, Mistral, local models), and as of v0.9, has full MCP server support.

If you're using Continue with Claude or a local model and you haven't wired up MCP servers yet, this guide covers everything: the config format, the best servers to add, how Continue uses MCP differently than Cursor or VS Code Copilot, and how to troubleshoot the common failure modes.

Why Continue + MCP Is a Particularly Strong Combination

Most developers who use Continue are already self-hosting or using alternative LLMs. MCP makes that stack significantly more powerful by giving your AI — whether it's Claude, Llama 3, or a local Mistral instance — the ability to actually do things: query databases, search code, read files, manage issues.

Without MCP, Continue is an excellent autocomplete and chat tool. With MCP, it becomes an agent that can take action in your codebase and connected services.

A few things that make Continue's MCP implementation notable:

  • Works with any LLM, including local Ollama models — not just cloud APIs

  • Supports both VS Code and JetBrains from the same config file

  • Configuration lives in ~/.continue/config.json — a single file you can version-control and share across machines

  • MCP servers are model-agnostic — the same server works regardless of which LLM you point Continue at

Prerequisites


  • Continue extension installed in VS Code or a JetBrains IDE

  • Continue v0.9.0 or later — MCP support was added in 0.9. Check your version in the extension panel.

  • Node.js 18+ for npm-based MCP servers

  • Python 3.10+ if you're running Python-based MCP servers

Update Continue via your IDE's extension marketplace if you're on an older version.

    Understanding Continue's Config Structure

    All Continue configuration lives in ~/.continue/config.json (or ~/.continue/config.yaml if you prefer YAML). This is the single source of truth for your LLM providers, models, context providers, and — as of 0.9 — MCP servers.

    Here's a minimal config.json structure showing where MCP fits:

{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20241022",
      "apiKey": "sk-ant-your-key"
    }
  ],
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname"],
      "env": {}
    }
  ],
  "contextProviders": [],
  "slashCommands": []
}

    The mcpServers array is where all your MCP configuration goes. Each entry needs at minimum a name, command, and args.
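Because every entry must carry at least those three keys, the structure is easy to sanity-check before handing the file to Continue. A minimal sketch in plain Python (independent of Continue itself), using the same example entry as above:

```python
import json

# Sketch: verify that each mcpServers entry carries the minimum
# required keys ("name", "command", "args"). The embedded snippet
# mirrors the example config above.
config = json.loads("""
{
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname"],
      "env": {}
    }
  ]
}
""")

required = {"name", "command", "args"}
for server in config["mcpServers"]:
    missing = required - server.keys()
    assert not missing, f"entry {server.get('name', '?')} is missing {missing}"
print("all mcpServers entries look valid")
```

The same loop works unchanged on a config with many servers, which makes it a handy pre-commit check if you version-control your config.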

    Step-by-Step: Adding Your First MCP Server

    Step 1: Open config.json

    In VS Code with Continue installed:

  • Click the Continue sidebar icon

  • Click the settings gear at the bottom of the Continue panel

  • Select "Open config.json"

Or open it directly:

  • Mac/Linux: ~/.continue/config.json

  • Windows: %USERPROFILE%\.continue\config.json
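Both locations resolve to the same place relative to the user's home directory, so a cross-platform script can find the config the same way on every OS. A small sketch:

```python
from pathlib import Path

# Continue's config lives under the user's home directory, so
# Path.home() resolves the correct base on Mac/Linux (~) and on
# Windows (%USERPROFILE%) alike.
config_path = Path.home() / ".continue" / "config.json"
print(config_path)
```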

Step 2: Add the mcpServers array

    If you don't already have an mcpServers key, add it alongside your models config:

{
  "models": [ ... ],
  "mcpServers": []
}

    Step 3: Add your first server

    Let's start with the GitHub MCP server — one of the most immediately useful for developers:

{
  "mcpServers": [
    {
      "name": "github",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    }
  ]
}

Generate a GitHub personal access token at github.com/settings/tokens. The repo scope covers reading and writing repositories, issues, and comments; add read:org if you also need organization-level data.

    Step 4: Reload Continue

    After saving config.json, reload the Continue extension:

  • Open the Command Palette (Ctrl/Cmd + Shift + P)

  • Run: Continue: Reload Window or just reload your IDE window

Step 5: Verify the connection

    In the Continue chat panel, ask:

    > "What tools do you have available from MCP?"

    Continue will list all connected MCP tools. If the server connected successfully, you'll see the GitHub tools listed (search_repositories, get_file_contents, create_issue, etc.).

    Complete Config Examples

    Filesystem + GitHub + Postgres

{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20241022",
      "apiKey": "sk-ant-your-key"
    }
  ],
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/yourname/projects",
        "/Users/yourname/documents"
      ],
      "env": {}
    },
    {
      "name": "github",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token"
      }
    },
    {
      "name": "postgres",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": {
        "POSTGRES_CONNECTION_STRING": "postgresql://localhost:5432/mydb"
      }
    }
  ]
}

    Local Ollama Model + MCP

    This is the combination that makes Continue truly unique compared to Cursor or VS Code Copilot. You can run a fully local AI stack — no cloud APIs — with real MCP tool access:

{
  "models": [
    {
      "title": "Llama 3.1 (Local)",
      "provider": "ollama",
      "model": "llama3.1:8b",
      "apiBase": "http://localhost:11434"
    }
  ],
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/yourname"],
      "env": {}
    },
    {
      "name": "sqlite",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sqlite", "--db-path", "/home/yourname/data.db"],
      "env": {}
    }
  ]
}

    Note on local models and tool use: Not all Ollama models support tool calling well. For reliable MCP tool use, use models specifically fine-tuned for function calling:

  • llama3.1:8b — Good tool use support

  • llama3.1:70b — Better, but requires significant hardware

  • mistral-nemo — Strong tool calling, lighter weight

  • qwen2.5-coder:7b — Excellent for code-focused tasks with tools

Avoid older models like llama2 or codellama for MCP: they predate modern tool-calling conventions and will produce inconsistent results.

    Python-Based MCP Server

    Some MCP servers are Python packages rather than Node modules. The config is similar — just change the command:

{
  "mcpServers": [
    {
      "name": "custom-python-server",
      "command": "python",
      "args": ["-m", "my_mcp_server"],
      "env": {
        "MY_API_KEY": "your-key"
      }
    }
  ]
}

    Or with uvx for Python package execution (similar to npx for Node):

{
  "mcpServers": [
    {
      "name": "mcp-server-fetch",
      "command": "uvx",
      "args": ["mcp-server-fetch"],
      "env": {}
    }
  ]
}

    Best MCP Servers for Continue.dev Workflows

    These are the servers with the highest practical impact for developers using Continue:

    1. GitHub MCP Server


    Package: @modelcontextprotocol/server-github
    Key tools: Search repos, read files, list issues/PRs, create issues, add comments
    Best prompt: "Find all open issues in this repo labeled 'bug' and summarize the most critical ones by impact"

    2. Filesystem Server


    Package: @modelcontextprotocol/server-filesystem
    Key tools: Read/write files and directories outside your current workspace
    Best prompt: "Read all .env.example files in ~/projects and create a consolidated list of all environment variables needed across my projects"

    3. Postgres / SQLite


    Package: @modelcontextprotocol/server-postgres or @modelcontextprotocol/server-sqlite
    Key tools: Execute queries, inspect schema, analyze data
    Best prompt: "Look at my users table schema and write a migration to add soft-delete support with deleted_at timestamp"

    4. Memory Server


    Package: @modelcontextprotocol/server-memory
    Key tools: Persistent storage that survives across Continue sessions
    Best prompt: "Remember that this project uses camelCase for all API responses and snake_case for the database. Apply this consistently going forward."
    Why it matters for Continue users: Unlike cloud-based tools with persistent threads, Continue sessions are ephemeral. The memory server gives you cross-session persistence.

    5. Brave Search


    Package: @modelcontextprotocol/server-brave-search
    Key tools: Real-time web search
    Best prompt: "Search for the current recommended way to handle streaming responses in Express.js and update this handler to follow that pattern"

    6. Custom Internal Servers


    This is where Continue really shines for teams. Because Continue is open-source and self-hosted, you can add MCP servers for internal tools — your company's internal APIs, proprietary databases, deployment systems — without routing data through a third-party cloud service.

    See the guide on how to build your first MCP server for a walkthrough of building a custom server.

    How Continue Uses MCP vs. Cursor and VS Code Copilot

    | Feature | Continue.dev | Cursor | VS Code Copilot |
    |---|---|---|---|
    | Config file | ~/.continue/config.json | ~/.cursor/mcp.json | settings.json (Copilot key) |
    | IDE support | VS Code + all JetBrains | VS Code only | VS Code only |
    | LLM flexibility | Any (local, cloud, custom) | Cursor models only | GitHub Copilot only |
    | Local model MCP | ✅ Full support (Ollama etc.) | ❌ Cloud only | ❌ Cloud only |
    | Open source | ✅ Fully open-source | ❌ Proprietary | ❌ Proprietary |
    | MCP transport | stdio, HTTP | stdio, HTTP | stdio |
    | Cost | Free (pay for LLM API only) | $20/mo | $10–$19/mo |

    The biggest differentiator: Continue is the only major AI coding assistant that supports MCP with local LLMs. If privacy, cost, or offline capability matters to you, this is a significant advantage.

    Troubleshooting Continue MCP Issues

    "MCP server failed to start"

    1. Test the command manually in your terminal:

    npx -y @modelcontextprotocol/server-github

It should start and wait for input; stop it with Ctrl+C.

    2. Check Node.js version: node --version (need 18+)
    3. Check the Continue logs: Help → Toggle Developer Tools → Console tab in VS Code, or the Continue log file at ~/.continue/logs/core.log

    Tools don't appear after reload


  • Ensure mcpServers is at the top level of config.json, not nested inside models

  • Validate your JSON is syntactically correct (a missing comma breaks the whole file — use a JSON validator)

  • Try: Continue: Clear Cache from the command palette, then reload
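Because one missing comma invalidates the entire file, it's worth diagnosing the JSON programmatically rather than by eye. Python's built-in json module reports the exact line and column of the first parse error, as this sketch with a deliberately broken config shows:

```python
import json

# A deliberately broken config: the comma between the two top-level
# keys is missing, which is the most common way to break config.json.
broken = '{"models": [] "mcpServers": []}'

try:
    json.loads(broken)
    print("valid JSON")
except json.JSONDecodeError as e:
    # e.lineno and e.colno point at the first character the parser rejected
    print(f"line {e.lineno}, column {e.colno}: {e.msg}")
```

To check your real file, swap the inline string for the contents of ~/.continue/config.json (or run `python -m json.tool` on it from a terminal).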

Environment variables not being passed

If your server needs env vars that aren't being picked up:

{
  "name": "github",
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-github"],
  "env": {
    "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_hardcoded_for_debugging"
  }
}

    Hardcode temporarily to confirm the server works, then move back to environment variable references once you've confirmed it's an env issue.
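Before blaming Continue, confirm the variable is actually set in the environment the IDE inherited. A hedged sketch of that check (the "$VAR" expansion shown is standard shell-style substitution, which may differ in detail from Continue's own reference syntax; the token value is a placeholder set inline only so the demo is self-contained):

```python
import os

# Set the variable here only for the demo; in practice it comes from
# your shell profile or the IDE's launch environment.
os.environ["GITHUB_PERSONAL_ACCESS_TOKEN"] = "ghp_example_value"

# Check the variable is visible to this process, then show how a
# "$VAR" reference expands against the process environment.
token = os.environ.get("GITHUB_PERSONAL_ACCESS_TOKEN")
print("set" if token else "missing")
print(os.path.expandvars("$GITHUB_PERSONAL_ACCESS_TOKEN"))
```

If the variable is "missing" when run from a GUI-launched IDE but "set" in your terminal, the IDE was started without your shell profile, which is the usual cause of this failure.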

    MCP works in VS Code but not JetBrains

    Continue uses the same ~/.continue/config.json for both IDEs. If it works in one but not the other:

  • Confirm both have the same Continue version installed

  • Restart the JetBrains IDE fully (not just reload)

  • Check if Node.js is in the PATH that JetBrains uses — JetBrains sometimes uses a different PATH than your terminal
---

    FAQ

    Does Continue.dev MCP support work with all LLM providers?
Mostly. MCP tool support in Continue depends on whether the underlying model supports function/tool calling. Claude models have excellent MCP support. GPT-4o works well. For local Ollama models, use Llama 3.1, Mistral Nemo, or Qwen 2.5 Coder for reliable tool use.

    Is there a GUI for managing MCP servers in Continue?
    Not yet as of early 2026. Configuration is JSON-only. The Continue team has indicated a GUI config editor is on the roadmap.

    Can I share my Continue MCP config with my team?
    Yes — commit ~/.continue/config.json to a team dotfiles repo or internal wiki, but strip out secrets first. Use environment variable references ($MY_API_KEY) rather than hardcoded keys so the file is safe to share.
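The "strip out secrets first" step can be automated. A sketch, assuming the standard config layout shown earlier in this guide (the placeholder variable names, such as $ANTHROPIC_API_KEY, are illustrative, not a Continue requirement):

```python
import json

# Replace secret values with $VAR-style references so the config is
# safe to commit. The input mirrors this guide's example layout.
config = {
    "models": [
        {"title": "Claude 3.5 Sonnet", "provider": "anthropic",
         "model": "claude-3-5-sonnet-20241022", "apiKey": "sk-ant-secret"}
    ],
    "mcpServers": [
        {"name": "github", "command": "npx",
         "args": ["-y", "@modelcontextprotocol/server-github"],
         "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_secret"}}
    ],
}

for model in config.get("models", []):
    if "apiKey" in model:
        model["apiKey"] = "$ANTHROPIC_API_KEY"  # illustrative placeholder name
for server in config.get("mcpServers", []):
    # Reference each env var by its own name, e.g. "$GITHUB_PERSONAL_ACCESS_TOKEN"
    server["env"] = {key: f"${key}" for key in server.get("env", {})}

print(json.dumps(config, indent=2))
```

Teammates then export the real values in their own environment, and the shared file never contains a working credential.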

    Does MCP slow down Continue's response time?
    Only when tools are actively being called. Idle MCP servers add negligible overhead. Tool calls themselves add latency proportional to the operation (a database query might add 200–500ms; a web search 1–2 seconds).

    Can I use MCP servers from the Claude Desktop registry in Continue?
    If they're npm packages, yes — the same package works in any MCP-compatible client. HTTP-based servers work too if Continue supports HTTP transport (check your version's release notes).

    What's the difference between MCP context providers and Continue's native context providers?
    Continue has its own context provider system (the @codebase, @docs, @file syntax). MCP servers add tool use — the ability to take actions and retrieve real-time data. They're complementary: context providers feed information into prompts; MCP tools let the AI act on your behalf during a chat.