Error message with longer inputs

When I call the API with a shorter prompt, I don’t get the following error, but I do get it with a longer prompt. I’ve increased max_tokens so the figure is far above my input length. Does anyone know the solution? Here’s the error:

The service Name-of-my-API-call just returned an error (HTTP 400). Please consult their documentation to ensure your call is setup properly. Raw error:

{
"error": {
"message": "We could not parse the JSON body of your request. (HINT: This likely means you aren't using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please send an email to support@openai.com and include any relevant code you'd like help with.)",
"type": "invalid_request_error",
"param": null,
"code": null
}
}
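An HTTP 400 with this message usually means the request body itself wasn’t valid JSON. One common cause — a guess, since we can’t see the calling code — is building the body by string concatenation, so that newlines or double quotes inside a longer prompt break the payload (which would explain why short prompts work). A minimal sketch in Python:

```python
import json

# A prompt containing characters that are legal in plain text but must
# be escaped inside JSON: a newline and double quotes. Longer prompts
# are more likely to contain these.
prompt = 'Summarize this:\nHe said "hello" and left.'

# Hand-building the body with string formatting leaves the newline and
# quotes unescaped, producing invalid JSON.
broken_body = '{"model": "gpt-3.5-turbo", "prompt": "%s"}' % prompt

# Serializing with json.dumps escapes everything correctly.
safe_body = json.dumps({"model": "gpt-3.5-turbo", "prompt": prompt})

def is_valid_json(s):
    try:
        json.loads(s)
        return True
    except ValueError:
        return False

print(is_valid_json(broken_body))  # False
print(is_valid_json(safe_body))    # True
```

If your HTTP library or no-code tool interpolates the prompt into a JSON template as a raw string, the same thing can happen behind the scenes.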

Is it over 2,000 tokens in total?

Weird it would show a JSON error message for the larger one, though…

Good luck.

Hi there, running into the same problem. Have you found a solution?

Many thanks!


I also have the same problem. It works completely fine at first, but the error commonly appears after one or two generations (that aren’t even that long).

Did anyone resolve this issue? I’m getting it occasionally with both GPT-3.5 Turbo and Davinci. Both work regularly, but sometimes the exact same prompt will return the error mentioned above. Any tips appreciated!

This still appears to be an issue, and it’s becoming pretty frustrating, as my entire web app depends on this!

For context, I have similar prompts being sent to the API: one using GPT-3.5, and GPT-4o for paid users. Both requests will intermittently produce the error shown at the top of this thread, yet sometimes they work just fine, even with the exact same prompt.

There’s clearly no permanent issue with the payload being valid JSON - otherwise it wouldn’t work at all.

I’ve tried setting max_tokens on the requests to something like 1,000 tokens to make sure we’re getting nowhere close to the 4,096 limit (apologies if that’s not how this works, I’m learning all this as I go!).
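For what it’s worth, max_tokens only caps the *completion*; the prompt’s own length still counts against the model’s context window, so lowering max_tokens doesn’t shrink the input side. A rough budget-check sketch — the 4-characters-per-token figure is only a crude estimate for English text (the real tokenizer is tiktoken), and the numbers below are assumptions, not your actual settings:

```python
# Sketch: the context window must hold the prompt's tokens PLUS the
# max_tokens reserved for the completion.
CONTEXT_WINDOW = 4096   # e.g. gpt-3.5-turbo's original context size
MAX_TOKENS = 1000       # completion budget set on the request

def estimate_tokens(text):
    """Very rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_context(prompt, max_tokens=MAX_TOKENS, window=CONTEXT_WINDOW):
    # True if the prompt plus the reserved completion budget fits.
    return estimate_tokens(prompt) + max_tokens <= window

print(fits_in_context("A short prompt."))   # True
print(fits_in_context("word " * 5000))      # False: ~6,250 tokens alone
```

Note this has nothing to do with the JSON-parse error message itself; exceeding the context window normally produces a different, clearly worded 400 error about maximum context length.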

Has anybody found a solution to this issue?

Thank you :pray: