BatchAPI Error: "Invalid type for 'temperature': expected a decimal, but got a string instead."

Since yesterday evening, I’ve been encountering the error “Invalid type for ‘temperature’: expected a decimal, but got a string instead.” when trying to run a batch job on the API.

My jsonl file contains items of the following format (with placeholders for id and messages):
{"custom_id": "custom_id", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo", "messages": [list_of_messages], "temperature": 0.8, "max_tokens": 512, "top_p": 1, "frequency_penalty": 0, "presence_penalty": 0}}
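
For reference, a minimal sketch of how such a line can be generated in Python so that temperature is serialized as a JSON number (the file name, custom_id, and message content are placeholders):

```python
import json

# Build one batch request; json.dumps writes 0.8 as the JSON number 0.8,
# not the string "0.8". custom_id and messages are placeholders.
request = {
    "custom_id": "request-1",
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
        "temperature": 0.8,  # plain float, not a string
        "max_tokens": 512,
        "top_p": 1,
        "frequency_penalty": 0,
        "presence_penalty": 0,
    },
}

with open("batch_input.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(request) + "\n")
```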

I tested the following:

  1. Validating the structure of my file with JSONLint, which raised no errors and did not flag an invalid JSON structure (see the local check sketched below)
  2. Converting the temperature explicitly to a float and queueing a job containing a single request
  3. Re-queueing a batch job with a jsonl file that had completed successfully yesterday; it now raises the same error too

None of the attempted fixes changed the outcome of the batch job. The error started appearing yesterday evening (CEST) without any change in my source code or the way I submit batches.
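
As a local check complementing JSONLint, this minimal sketch parses every line and confirms that temperature is a JSON number rather than a string (the file name is a placeholder):

```python
import json

# Parse every line of the batch input and confirm that temperature is a
# JSON number, not a string. "batch_input.jsonl" is a placeholder name.
with open("batch_input.jsonl", encoding="utf-8") as f:
    for line_no, line in enumerate(f, start=1):
        if not line.strip():
            continue  # skip blank lines
        item = json.loads(line)  # raises json.JSONDecodeError on invalid JSON
        temperature = item["body"].get("temperature")
        if temperature is not None and not isinstance(temperature, (int, float)):
            print(f"line {line_no}: temperature is a {type(temperature).__name__}, not a number")
```

Every line passes this check on my side, so the value really is a decimal in the file I upload.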

4 Likes

Got the same error. Does anyone know what the cause is?

2 Likes

I have the exact same problem. Yesterday, I was sending batch jobs without any issues. Suddenly, in the evening, I started to get that error for all my requests. Nothing in my code changed; the temperature in my request input objects is set to a decimal.

3 Likes

Same here… using the Batch API. About 36 hours ago it worked just fine.

3 Likes

Same here… getting the same error.

1 Like

I had to take temperature out for it to work.
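
In case it helps, a rough sketch of that workaround (file names are placeholders): rewrite the input file without the temperature field, so each request falls back to the API default.

```python
import json

# Workaround sketch: copy the batch input, dropping temperature from each body.
# Both file names are placeholders.
with open("batch_input.jsonl", encoding="utf-8") as src, \
        open("batch_input_no_temp.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        if not line.strip():
            continue
        item = json.loads(line)
        item["body"].pop("temperature", None)  # remove the field if present
        dst.write(json.dumps(item) + "\n")
```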

Same here. Looks like a bug.

Also, I want more TPD (tokens per day).

1 Like

:expressionless: It will then use the default value of 1.0, I guess?

Yeah, I would assume so. That's quite a constraint for the moment, so the only feasible option is to fall back on normal API calls (which take longer and are more expensive) until the issue is fixed.
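
For anyone doing the same in the meantime, a minimal fallback sketch assuming the openai Python SDK (v1+); the model and messages are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Send the same request synchronously instead of through the Batch API.
# The synchronous endpoint currently accepts temperature as a float just fine.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.8,
    max_tokens=512,
)
print(response.choices[0].message.content)
```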

1 Like

Something I'd rather not do; I need to send around 15 million tokens. Hope they fix this quickly.

2 Likes

Thanks for reporting, we’re looking into this now.

6 Likes

Fix incoming, status page is up: OpenAI Status - Some Batch API Calls Are Failing

2 Likes

Fix is deployed, this should be resolved for requests that haven’t been processed yet. If requests were already processed and failed, you’ll see them as 400s in the error file once your batch completes. You won’t be billed for failed requests.

Apologies for the disruption, and thanks again for flagging!
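
If you want to inspect those 400s programmatically once the batch completes, a minimal sketch assuming the openai Python SDK (v1+); the batch id is a placeholder:

```python
from openai import OpenAI

client = OpenAI()

# Fetch the batch and, if an error file was produced, print its contents.
# Each line of the error file is a JSON object describing one failed request.
batch = client.batches.retrieve("batch_abc123")  # placeholder id
if batch.error_file_id:
    errors = client.files.content(batch.error_file_id).text
    print(errors)
```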

7 Likes