I am now getting the following error from the gpt-3.5-turbo API. I cannot find any documentation about this specific error online, and it is not very descriptive. Note that I have not changed my code; it just started returning this about two hours ago, so it must be an issue with the OpenAI endpoint.
error_code=None error_message="Detected an error in the prompt. Please try again with a different prompt." error_param=prompt error_type=invalid_request_error message="OpenAI API error received" stream_error=False
Welcome to the forum.
What is the prompt? Can we reproduce the bug? Maybe it’s the content filter tripping? Have you checked the prompt with the moderation endpoint?
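If you haven't already, a quick pre-check against the moderation endpoint can rule out the content filter. Here is a minimal stdlib-only sketch; the endpoint URL and response shape follow the public OpenAI API reference at the time of writing, so adjust if the API has changed:

```python
# Hedged sketch: pre-check a prompt against the /v1/moderations REST
# endpoint using only the Python standard library (no SDK required).
import json
import urllib.request

MODERATION_URL = "https://api.openai.com/v1/moderations"

def moderate(prompt: str, api_key: str) -> dict:
    """POST the prompt to the moderation endpoint; return the parsed JSON."""
    req = urllib.request.Request(
        MODERATION_URL,
        data=json.dumps({"input": prompt}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def is_flagged(moderation_response: dict) -> bool:
    """True if any result in the moderation response was flagged."""
    return any(r.get("flagged", False)
               for r in moderation_response.get("results", []))
```

If `is_flagged` returns False for the exact string you send to chat completions, the content filter is unlikely to be the culprit.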
If the issue persists, contact our support team via chat and provide them with the following information:
- The model you were using
- The error message and code you received
- The request data and headers you sent
- The timestamp and timezone of your request
- Any other relevant details that may help us diagnose the issue
Our support team will investigate the issue and get back to you as soon as possible. Note that our support queue times may be long due to high demand. You can also post in our Community Forum but be sure to omit any sensitive information.
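To make gathering those details painless, you could log them in one structured record per failed request. A small sketch; the field names are illustrative, not an official schema:

```python
# Capture the details the support checklist above asks for (model, error
# message, request payload, UTC timestamp) as one JSON record.
import json
from datetime import datetime, timezone

def support_record(model: str, error_message: str,
                   request_payload: dict) -> str:
    """Serialize the details worth attaching to a support ticket."""
    record = {
        "model": model,
        "error_message": error_message,
        "request_payload": request_payload,
        # Timezone-aware UTC timestamp, per the checklist above.
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record, indent=2)
```

Remember to redact any sensitive data from the payload before posting it publicly.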
It is a basic summarization prompt that also passes in the context to summarize. And yes, I do pass the full prompt to the moderation endpoint, and it passes there. I have been using the same pipeline flow for months, so no issues there. The error is only returned sporadically.
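Since the failure is sporadic rather than consistent, a bounded retry with exponential backoff is a reasonable client-side workaround while the endpoint misbehaves. A sketch, where the API-calling function is a stand-in for your own pipeline call:

```python
# Retry a callable a bounded number of times with exponential backoff.
# `call` stands in for whatever function makes the actual API request.
import time

def call_with_retries(call, max_attempts: int = 3, base_delay: float = 1.0):
    """Invoke call(), retrying on exception with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the original error
            # 1s, 2s, 4s, ... between attempts (with base_delay=1.0)
            time.sleep(base_delay * (2 ** attempt))
```

This won't fix a server-side problem, but it keeps a transient error from failing the whole pipeline run.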