The Cerebras Code MCP Server accelerates code generation in your existing IDEs and CLI tools (such as Cursor and Claude Code) by up to 20x compared to GPUs. It leverages Cerebras' fast inference through the Model Context Protocol (MCP) and offers an optional graceful fallback via OpenRouter. MCP is an open standard that lets AI models securely interact with tools, data, and editors. Rather than limiting models to plain-text chat, MCP grants them structured access to external systems such as your IDE, allowing them to read, write, and modify code under consistent rules. This makes model-driven coding safer, more reliable, and more predictable.
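As a rough illustration of what "structured access" means at the wire level (this is generic MCP, not specific to the Cerebras server; the tool name and arguments below are hypothetical), an MCP client asks a server to perform an action with a JSON-RPC 2.0 request like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "write_file",
    "arguments": {
      "path": "src/app.ts",
      "content": "console.log('hello');"
    }
  }
}
```

The server validates the request against its declared tool schema and returns a structured result, which is what gives the editor consistent, enforceable rules around model-driven edits.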
The Cerebras Code MCP server is currently in research preview and is open source here. We welcome contributions!
1. Set up your API key

You need a valid Cerebras API key. Please visit this link and sign up, then click on API Keys in the left navigation.
Optionally, create an OpenRouter key here to use as fallback if you hit Cerebras rate limits.
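If you prefer to provide the keys via environment variables rather than typing them into the setup wizard, a minimal sketch looks like this. The variable names are assumptions, not confirmed by the server's documentation; replace the placeholder values with your real keys:

```shell
# Assumed variable names -- adjust if the setup wizard prompts for different ones.
export CEREBRAS_API_KEY="your-cerebras-key"
export OPENROUTER_API_KEY="your-openrouter-key"   # optional fallback key
```

Adding these lines to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`) keeps them available across terminal sessions.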
2. Install the NPM Package

Open your preferred IDE and run the following command in your terminal:
npm install -g cerebras-code-mcp
3. Run the Setup Wizard

In your terminal, run:
cerebras-mcp --config
This launches the setup wizard, where you can configure supported editors (like Claude Code) and set your API keys for Cerebras and OpenRouter.
Run the following to verify the setup:
claude mcp list
The output should look like this:
Checking MCP server health...

cerebras-code: cerebras-mcp  - Connected  

Available Models

The Cerebras Code MCP Server supports all Cerebras models:
| Model | Parameters | Best For |
| --- | --- | --- |
| llama-3.3-70b | 70B | Complex reasoning, long-form content, and tasks requiring deep understanding |
| qwen-3-32b | 32B | Balanced performance for general-purpose applications |
| llama3.1-8b | 8B | Fastest option for simple tasks and high-throughput scenarios |
| gpt-oss-120b | 120B | Large open-weight model for demanding tasks |
| zai-glm-4.7 | 357B | Advanced model with strong reasoning capabilities |
Configure your preferred model during the setup wizard or in your MCP configuration file.
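For editors that read a JSON-based MCP configuration file, the entry the wizard writes typically looks something like the sketch below. The exact keys, file location, and environment variable names vary by editor and may differ from what the server actually uses, so treat this as an illustration rather than a reference:

```json
{
  "mcpServers": {
    "cerebras-code": {
      "command": "cerebras-mcp",
      "env": {
        "CEREBRAS_API_KEY": "your-cerebras-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```

A model preference, if your configuration supports one, would be set alongside these fields or through the setup wizard.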