OpenAI request timeout

Hey!

I am currently running some tests against the classification endpoint using the OpenAI Python library. Every few requests, one starts taking a very long time: seconds, if not minutes. I would like to time out such requests after a few seconds. I’ve found a similar thread here.
The solution proposed there no longer works, since the timeout parameter has been removed.
Does anyone know a simple solution to this problem, short of using the REST API directly?
Also, is there anything on OpenAI’s side that might cause a request to take this long?
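Since the library no longer exposes a timeout parameter, one workaround (a stdlib-only sketch, not an official OpenAI API feature) is to run the call in a worker thread and stop waiting for the result after a deadline:

```python
import concurrent.futures

def call_with_timeout(fn, timeout_s, *args, **kwargs):
    """Run fn(*args, **kwargs) in a worker thread; stop waiting after timeout_s seconds.

    Caveat: the underlying HTTP request is not cancelled, only abandoned,
    so the worker thread may linger until the request finishes on its own.
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    try:
        # result() raises concurrent.futures.TimeoutError if fn is still running.
        return pool.submit(fn, *args, **kwargs).result(timeout=timeout_s)
    finally:
        pool.shutdown(wait=False)
```

You would then wrap your API call, e.g. `call_with_timeout(openai.Completion.create, 5.0, engine=..., prompt=...)`, and catch `concurrent.futures.TimeoutError` to retry or skip.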

Thanks for your help!


I have the same error; it has happened twice today, and I still get billed for the usage even though no values are actually returned. Is there a recommended way to stairstep or chunk the requests?


I’m getting the same issue after extracting a couple hundred sentence embeddings.

Hi @venia

I built some backoff functionality into my API call; see below. This code implements retries, groups your dataframe rows into chunks, and then feeds the chunks into the API. I had initially tried with 36K rows in my dataframe and it timed out about halfway through. This approach has been working for me.

import openai
from retry import retry

@retry(Exception, tries=5, delay=1)
def classify_sentiment(text):  
    model_engine = "text-davinci-003"
    prompt = f"classify the sentiment of this text as Positive, Negative, or Neutral: {text}\nResult:"
    
    completions = openai.Completion.create(
        engine=model_engine,
        prompt=prompt,
        max_tokens=64,
        n=1,
        stop=None,
        temperature=0.1,
    )

    message = completions.choices[0].text
    sentiment = message.strip().split(" ")[-1]
    return sentiment

def classify_sentiment_chunk(text_chunk):
    results = []
    for text in text_chunk:
        sentiment = classify_sentiment(text)
        results.append(sentiment)
    return results

def classify_sentiment_dataframe(df, chunk_size=1000):
    results = []
    for i in range(0, len(df), chunk_size):
        text_chunk = df['light_clean'][i:i+chunk_size]
        results.extend(classify_sentiment_chunk(text_chunk))
    return results
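The chunking loop in `classify_sentiment_dataframe` steps through the rows in fixed-size slices. As a standalone sketch (the helper name `chunks` is mine, not from the code above), the pattern looks like this:

```python
def chunks(seq, size):
    """Yield consecutive slices of seq, each at most `size` items long."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# The last chunk is simply shorter when len(seq) is not a multiple of size.
list(chunks([10, 20, 30, 40, 50], 2))  # -> [[10, 20], [30, 40], [50]]
```

Each chunk then becomes one batch of sequential API calls, so a failure mid-run loses at most one chunk's worth of work rather than the whole dataframe.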



I’m using an exponential retry strategy (with tenacity) with a stop condition after 27 seconds, and I still get timeouts or openai.error.APIConnectionError.
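For reference, that strategy can be sketched without the tenacity dependency. This is a stdlib-only approximation of exponential backoff with jitter and an overall deadline (the function names and default values here are my own, not from tenacity):

```python
import random
import time

def call_with_backoff(fn, max_seconds=27.0, base_delay=1.0, max_delay=8.0):
    """Retry fn() with exponential backoff plus jitter.

    Gives up and re-raises the last error once max_seconds have elapsed.
    fn is assumed to raise an exception on failure.
    """
    deadline = time.monotonic() + max_seconds
    delay = base_delay
    while True:
        try:
            return fn()
        except Exception:
            if time.monotonic() >= deadline:
                raise  # out of time: surface the last error
            # Never sleep past the deadline, and cap the delay at max_delay.
            sleep_for = min(delay, max_delay, deadline - time.monotonic())
            time.sleep(sleep_for * (0.5 + random.random() / 2))  # jitter
            delay *= 2
```

Note that even with a scheme like this, if the server itself hangs on a single request, the retry logic never fires until that request returns or errors, which is why combining backoff with a per-request timeout tends to work better.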