Initial Setup
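The setup code itself isn't included in this excerpt. As a rough sketch, assuming an OpenAI-compatible Python client (the section later calls `client.chat.completions.create()`), initialization might look like the following; the base URL and API-key environment variable are placeholders, not the provider's real values.

```python
import os

from openai import OpenAI

# Assumed setup: any OpenAI-compatible client works with the calls shown below.
# Both the base_url and the environment variable name are placeholders.
client = OpenAI(
    base_url="https://example.com/v1",
    api_key=os.environ["PROVIDER_API_KEY"],
)
```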
Setting Up the Tool
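The tool used in the original walkthrough isn't shown in this excerpt. As a stand-in for the sketches that follow, here is a hypothetical calculator tool; any plain Python function works, as long as its parameters can be described with a JSON schema in the next step.

```python
def calculate(expression: str) -> str:
    """Evaluate a simple arithmetic expression and return the result as a string.

    Hypothetical example tool, used by the sketches below.
    """
    # eval() keeps the demo short; don't use it on untrusted input in real code.
    return str(eval(expression))
```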
Defining the Tool Schema
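A sketch of the schema for the hypothetical `calculate` tool above, in the OpenAI-style `tools` format this section describes; note where `"strict": True` sits.

```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "calculate",
            "description": "Evaluate a simple arithmetic expression.",
            # "strict": True lives inside the "function" object, as noted below.
            "strict": True,
            "parameters": {
                "type": "object",
                "properties": {
                    "expression": {
                        "type": "string",
                        "description": "The arithmetic expression to evaluate.",
                    }
                },
                "required": ["expression"],
            },
        },
    }
]
```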
"strict": True
is set inside the function
object in the tool schema.Making the API Call
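A sketch of the request, continuing from the setup above; the prompt is illustrative, and the model name is taken from the note that follows.

```python
messages = [
    {"role": "user", "content": "What is 17 * 24?"},
]

response = client.chat.completions.create(
    model="llama-4-scout-17b-16e-instruct",
    messages=messages,
    tools=tools,
    parallel_tool_calls=False,  # see the note below
)
```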
Set `parallel_tool_calls=False` when using tool calling with `llama-4-scout-17b-16e-instruct`. The model doesn't currently support parallel tool calling, but a future release will.

Handling Tool Calls
Append the tool results to `messages`, then ask the model to continue. Repeat the call to `client.chat.completions.create()` until you get a message without `tool_calls`.
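A sketch of that loop, continuing from the request above and dispatching to the hypothetical `calculate` tool; the exact message bookkeeping may differ in the original walkthrough.

```python
import json

while True:
    message = response.choices[0].message

    # Done once the model answers without requesting any tools.
    if not message.tool_calls:
        print(message.content)
        break

    # Record the assistant turn that requested the tool calls.
    messages.append(message)

    # Run each requested tool and append its result to the conversation.
    for tool_call in message.tool_calls:
        args = json.loads(tool_call.function.arguments)
        result = calculate(**args)
        messages.append(
            {
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": result,
            }
        )

    # Ask the model to continue with the tool results in context.
    response = client.chat.completions.create(
        model="llama-4-scout-17b-16e-instruct",
        messages=messages,
        tools=tools,
        parallel_tool_calls=False,
    )
```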
A caveat applies if you use the `llama-3.3-70b` model. This model will error if you include a non-empty `tool_calls` array on an assistant turn. For `llama-3.3-70b`, make sure your assistant response explicitly clears its `tool_calls`, like this:
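The original snippet isn't included in this excerpt. One way to do it, as a sketch building on the loop above, is to rebuild the assistant turn as a plain dict with an empty `tool_calls` list instead of appending the model's message object directly.

```python
# For llama-3.3-70b: replay the assistant turn with tool_calls explicitly cleared.
messages.append(
    {
        "role": "assistant",
        "content": message.content or "",
        "tool_calls": [],
    }
)
```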