Configuring OpenAI to Use Cerebras API
To start using Cerebras with OpenAI's client libraries, simply pass your Cerebras API key to the `apiKey` parameter and change the `baseURL` to `https://api.cerebras.ai/v1`:
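A minimal sketch in TypeScript, assuming the official `openai` npm package and a `CEREBRAS_API_KEY` environment variable; the model name shown here is illustrative:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the Cerebras endpoint.
// Assumes CEREBRAS_API_KEY is set in the environment.
const client = new OpenAI({
  apiKey: process.env.CEREBRAS_API_KEY,
  baseURL: "https://api.cerebras.ai/v1",
});

const completion = await client.chat.completions.create({
  model: "gpt-oss-120b", // illustrative model name
  messages: [{ role: "user", content: "Why is fast inference important?" }],
});

console.log(completion.choices[0].message.content);
```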
Developer-Level Instructions via System Role
This info is only applicable to the `gpt-oss-120b` model. For `gpt-oss-120b`, our API maps the `system` role to a developer-level instruction layer in the prompt hierarchy. When you send messages with `role: "system"`, these are elevated above normal user instructions and injected into the model's internal system prompt. This gives you significant control over the assistant's tone, style, and behavior while preserving the model's built-in safety guardrails.
Key Differences from OpenAI
OpenAI's API distinguishes between `system` and `developer` roles. Our implementation does not expose `developer` directly. Instead, your `system` messages act at the developer level, meaning they have stronger influence than in OpenAI's API.
As a result, the same prompt may yield different behavior here compared to OpenAI. This is expected.
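For example, here is a sketch of a request where a `system` message constrains the assistant's style. It assumes the `client` from the setup example above, and the instruction text is only illustrative:

```typescript
// The system message is treated as a developer-level instruction for
// gpt-oss-120b: it outranks user instructions but not the model's
// built-in safety guardrails. Assumes `client` from the setup above.
const response = await client.chat.completions.create({
  model: "gpt-oss-120b",
  messages: [
    {
      role: "system",
      content: "Respond only in formal English and keep answers under 100 words.",
    },
    { role: "user", content: "Explain what a context window is." },
  ],
});

console.log(response.choices[0].message.content);
```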
Currently Unsupported OpenAI Features
Note that although the Cerebras API is mostly OpenAI compatible, there are a few features we don't support just yet:

Text Completions

The following fields are currently not supported and will result in a 400 error if they are supplied:
- `frequency_penalty`
- `logit_bias`
- `presence_penalty`
- `parallel_tool_calls`
- `service_tier`
While Cerebras supports a `stream` parameter, note that JSON mode is not compatible with streaming.
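As an illustration, a sketch of a JSON-mode request with streaming left off; it assumes the `client` from the setup example above and the OpenAI client's `response_format` shape:

```typescript
// JSON mode works on non-streaming requests; combining it with
// `stream: true` is not supported. Assumes `client` from the setup above.
const jsonResponse = await client.chat.completions.create({
  model: "gpt-oss-120b",
  stream: false, // keep streaming off when requesting JSON mode
  response_format: { type: "json_object" },
  messages: [
    { role: "system", content: "Reply with a JSON object." },
    { role: "user", content: "List three planets with their diameters in km." },
  ],
});

console.log(jsonResponse.choices[0].message.content);
```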