Setting temperature value in Batch API

Hi everyone,
I am preparing my .jsonl file for my Batch API requests.
In doing so, I was wondering if there is any way I can set the temperature value in each of my requests.
Here is an example from the Batch API documentation:
```json
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo-0125", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello world!"}], "max_tokens": 1000}}
```

Is there a way to add information in the .jsonl file as to which temperature value should be set for the processing of my request?
Thank you in advance for your help!

Welcome to the Community!

Sure, you can specify the temperature just as you would in a regular API call: add a "temperature" field to the request body, e.g. before or after the "max_tokens" parameter in your case.
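
For example, extending the line from the documentation above (the value 0.2 is purely illustrative):

```json
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo-0125", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello world!"}], "max_tokens": 1000, "temperature": 0.2}}
```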

The temperature value itself depends entirely on your needs.
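
If you are generating the .jsonl programmatically, here is a minimal Python sketch that writes one request per line, each with its own temperature. The file name, prompts, and temperature values are just placeholders:

```python
import json

# Illustrative prompts, each paired with the temperature you want for that request.
requests = [
    {"prompt": "Hello world!", "temperature": 0.2},
    {"prompt": "Write a short poem.", "temperature": 0.9},
]

with open("batch_input.jsonl", "w", encoding="utf-8") as f:
    for i, req in enumerate(requests, start=1):
        line = {
            "custom_id": f"request-{i}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": "gpt-3.5-turbo-0125",
                "messages": [
                    {"role": "system", "content": "You are a helpful assistant."},
                    {"role": "user", "content": req["prompt"]},
                ],
                "max_tokens": 1000,
                "temperature": req["temperature"],  # per-request temperature
            },
        }
        # Each request becomes one JSON object per line in the .jsonl file.
        f.write(json.dumps(line) + "\n")
```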
