Configuring OpenAI to Use Cerebras API
To start using Cerebras with OpenAI's client libraries, pass your Cerebras API key to the apiKey parameter and change the baseURL to https://api.cerebras.ai/v1:
Developer-Level Instructions via System and Developer Roles
This info is only applicable to the gpt-oss-120b model.
For gpt-oss-120b, the API supports both the system and developer message roles. Both are mapped to a developer-level instruction layer in the prompt hierarchy, elevated above normal user instructions and injected into the model's internal system prompt. This gives you significant control over the assistant's tone, style, and behavior while preserving the model's built-in safety guardrails.
The developer role is functionally equivalent to system – the system role remains supported for backwards compatibility.
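As a sketch, a request using the developer role differs from an ordinary chat request only in the messages list; the create call shown in the comment is the standard chat completions call with an assumed model id:

```python
# On Cerebras, "developer" and "system" are interchangeable: both land in
# the developer-level instruction layer, above normal user instructions.
messages = [
    {"role": "developer", "content": "You are a terse assistant. Answer in one sentence."},
    {"role": "user", "content": "Why is the sky blue?"},
]

# The list is then passed unchanged to the chat completions endpoint, e.g.:
# client.chat.completions.create(model="gpt-oss-120b", messages=messages)
```

Swapping "developer" for "system" in the first message yields the same behavior.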
Key Differences from OpenAI
OpenAI's API distinguishes between the system and developer roles and treats them differently. On Cerebras, both roles act at the developer level, meaning they may have stronger influence than system messages do in OpenAI's API.
As a result, the same prompt may yield different behavior here compared to OpenAI. This is expected.
Passing Non-Standard Parameters
- OpenAI: Non-standard parameters (e.g., clear_thinking for Z.ai GLM) need to be passed through extra_body. Standard OpenAI parameters like reasoning_effort work directly.
- Cerebras SDK: Non-standard parameters can be passed either through extra_body or as regular parameters, just like model.
Example: Using the OpenAI Client
When using the OpenAI client with the Cerebras API, non-standard parameters must be passed through extra_body:
Example: Using the Cerebras SDK Client
When using the Cerebras SDK client, non-standard parameters can be passed as regular parameters:

