Is OpenAI down at the API layer?

Hey, is anyone else's service getting this error?

OpenAI error: Request failed with status code 429 The server is currently overloaded with other requests. Sorry about that! You can retry your request, or contact support@openai.com if the error persists

It started about 6 hours ago.
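
Since the 429 message itself suggests retrying, one client-side workaround is to retry with exponential backoff. A minimal sketch using the Python openai client (the engine, prompt, and delay values are placeholders, and the exception type assumes the legacy 0.x client mapping of 429 responses):

import time
import openai

def complete_with_retry(prompt, retries=5, base_delay=2.0):
    # Retry on 429 "overloaded" responses with exponential backoff.
    for attempt in range(retries):
        try:
            return openai.Completion.create(engine="davinci", prompt=prompt, max_tokens=15)
        except openai.error.RateLimitError:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))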


Not only that, I am also getting errors with file upload and use.

Could you send us a bit more detail? Which model were you calling, and did the error persist?

I’d like to make sure that this is a momentary blip rather than a sign of a larger problem.

I think I did everything correctly, but the file does not get recognized even though it is there!

My error is below:

File Upload
(screenshot of the file upload call)
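
The screenshot itself isn't reproduced here, but with the legacy Python client the upload step presumably looked roughly like this (the filename is a placeholder; "answers" was the purpose expected by the legacy Answers endpoint):

import openai

# Upload a JSONL file of documents for the (legacy) Answers endpoint.
# "documents.jsonl" is a hypothetical filename.
uploaded = openai.File.create(
    file=open("documents.jsonl"),
    purpose="answers",
)
print(uploaded["id"])  # e.g. file-UDmoVdYhnBTISxlm7Z30q3cT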

Error

openai.Answer.create(
    search_model="ada",
    model="curie",
    question="What is the phone number?",
    file="file-UDmoVdYhnBTISxlm7Z30q3cT",
    examples_context="In 2017, U.S. life expectancy was 78.6 years.",
    examples=[["What is human life expectancy in the United States?", "78 years."]],
    max_tokens=15,
    stop=["\n", "<|endoftext|>"],
)
---------------------------------------------------------------------------
InvalidRequestError                       Traceback (most recent call last)
<ipython-input-8-9c9716857a91> in <module>
      7               examples=[["What is human life expectancy in the United States?","78 years."]],
      8               max_tokens=15,
----> 9               stop=["\n", "<|endoftext|>"],
     10             )

D:\Anaconda\envs\Tensorflow2\lib\site-packages\openai\api_resources\answer.py in create(cls, **params)
     12     def create(cls, **params):
     13         instance = cls()
---> 14         return instance.request("post", cls.get_url("answers"), params)

D:\Anaconda\envs\Tensorflow2\lib\site-packages\openai\openai_object.py in request(self, method, url, params, headers, stream, plain_old_data)
    242         )
    243         response, stream, api_key = requestor.request(
--> 244             method, url, params, headers, stream=stream
    245         )
    246 

D:\Anaconda\envs\Tensorflow2\lib\site-packages\openai\api_requestor.py in request(self, method, url, params, headers, stream)
    130             method.lower(), url, params, headers, stream=stream
    131         )
--> 132         resp = self.interpret_response(rbody, rcode, rheaders, stream=stream)
    133         return resp, stream, my_api_key
    134 

D:\Anaconda\envs\Tensorflow2\lib\site-packages\openai\api_requestor.py in interpret_response(self, rbody, rcode, rheaders, stream)
    356             )
    357         else:
--> 358             return self.interpret_response_line(rbody, rcode, rheaders, stream)
    359 
    360     def interpret_response_line(self, rbody, rcode, rheaders, stream=False):

D:\Anaconda\envs\Tensorflow2\lib\site-packages\openai\api_requestor.py in interpret_response_line(self, rbody, rcode, rheaders, stream)
    376         if stream_error or not 200 <= rcode < 300:
    377             raise self.handle_error_response(
--> 378                 rbody, rcode, resp.data, rheaders, stream_error=stream_error
    379             )
    380 

InvalidRequestError: No similar documents were found in file with ID 'file-UDmoVdYhnBTISxlm7Z30q3cT'. Please upload more documents or adjust your query.
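
For context, the legacy Answers endpoint searches the uploaded file for documents similar to the question, and this error tends to appear when nothing in the file matches. The file is expected to be JSONL with a "text" field on each line (optionally "metadata"). A minimal sketch of how such a file might be produced (the documents themselves are hypothetical):

import json

# Each line of the answers file is a JSON object with a "text" field
# and an optional "metadata" field.
documents = [
    {"text": "Our support phone number is 555-0100.", "metadata": "contact-info"},
    {"text": "In 2017, U.S. life expectancy was 78.6 years.", "metadata": "background"},
]
with open("documents.jsonl", "w") as f:
    for doc in documents:
        f.write(json.dumps(doc) + "\n")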

Hi, just an update:

I spoke to the team directly on support chat. There was a latency issue on the OpenAI side with the content-verification request, which was causing the issues.

They resolved it in 2 hours!

I just tried the example I gave before, but it still does not seem to work.
If I made any mistakes on my part, please let me know. Thanks!
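
One thing that may be worth checking is whether the uploaded file is actually visible to the API before querying it. A quick sketch with the legacy Python client (the file ID is the one from the error above; the exact fields on the returned object, such as "status", are assumptions based on the legacy file API):

import openai

# Confirm the uploaded file is visible to the API and inspect it.
info = openai.File.retrieve("file-UDmoVdYhnBTISxlm7Z30q3cT")
print(info)

# Alternatively, list all uploaded files and look for the ID.
for f in openai.File.list()["data"]:
    print(f["id"], f.get("filename"))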