The Inconvenient Truth about Error 524 and PHP API Connections

I recently encountered the “Error 524: a timeout occurred” on a server.

I learned that this error means that Cloudflare was able to connect to the origin web server, but the server failed to send an HTTP response before the connection timed out after 100 seconds.

Unfortunately, there seems to be no way to fix this other than upgrading to Cloudflare Enterprise. Does that mean it is simply not possible to establish a reliable PHP API connection on a Cloudflare-proxied server?

EDIT:
I should have specified the problem here: this only matters when receiving very long responses from the API that take a long time to complete, such as 2000+ words in a single request.

Hey, what do you mean by a “PHP API” connection? The OpenAI API communicates over HTTP, which isn’t specific to any language and is supported everywhere. Chances are it’s just a configuration issue on your end. Would you be able to share some code?

Do you encounter this error when the request involves more than 1000 tokens?

If Cloudflare does not receive a response from the origin within 100 seconds, it times out with a 524 error. This timeout is not the same as the classic 502 error from your Apache server. You could grey-cloud a subdomain in the Cloudflare dashboard (DNS only, no proxy), but that would reveal your server’s IP address.

It only depends on how long it takes the API to produce the results. You can change settings in php.ini and your Apache server to avoid timeouts on your own site, but that won’t stop a 524 error on Cloudflare’s side.
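For reference, the PHP-side knobs look something like this (the values are only illustrative); they stop PHP and cURL from giving up early, but Cloudflare’s 100-second proxy limit is enforced at their edge and is not affected by any of it:

<?php
// Illustrative PHP/cURL timeout settings only; none of these change
// Cloudflare's 100-second limit, which is enforced on Cloudflare's edge.
ini_set('max_execution_time', '300'); // let the script run up to 5 minutes
set_time_limit(300);                  // same limit, applied at runtime

$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // seconds allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 300);       // seconds allowed for the whole transfer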

Any API integration that requires more than 100 seconds to resolve is either a poorly constructed integration or it needs a different approach, such as background worker jobs that can be polled for completion progress. I don’t think the OpenAI API is designed to accommodate long-running processes, but it is designed to stream back results when they are sizeable.

Using the streaming approach, you need to keep the request open while watching for the [DONE] message.

I believe that setting stream to true will cause Cloudflare to see ongoing activity and not time out, even if the process takes more than 100 seconds.
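A minimal sketch of that idea (the model name, prompt, and the echo/flush relay are placeholder assumptions about your setup; the API key is read from an OPENAI_API_KEY environment variable):

<?php
// Sketch: stream the completion and forward every chunk as it arrives, so the
// connection is never silent long enough for a proxy to time it out.
$payload = json_encode([
    'model'    => 'gpt-3.5-turbo', // placeholder model for the example
    'stream'   => true,
    'messages' => [['role' => 'user', 'content' => 'Write a long article about ...']],
]);

$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_POST          => true,
    CURLOPT_POSTFIELDS    => $payload,
    CURLOPT_HTTPHEADER    => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
    ],
    CURLOPT_WRITEFUNCTION => function ($ch, $chunk) {
        echo $chunk; // relay the chunk downstream immediately
        flush();     // keep bytes flowing; you may also need ob_flush() or output_buffering=Off
        return strlen($chunk); // tell cURL the chunk was consumed
    },
]);
curl_exec($ch);
curl_close($ch);
// The stream ends with a "data: [DONE]" line, which is the signal that the full answer has arrived.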

Won’t work reliably… I tried it. I went back to not streaming, changed my server and PHP settings, and stopped using Cloudflare. Now it works perfectly. But maybe you can find a solution. In my opinion, you can’t manipulate the data in CURLOPT_WRITEFUNCTION directly to get the part after “data:”. What the stream returns is not plain JSON.

My code for streaming curl - works btw:

function get_chat_completion($api_key, $model, $message, $temperature, $max_tokens, $top_p, $frequency_penalty, $presence_penalty, $stop) {
    $url = 'https://api.openai.com/v1/chat/completions';
    $system = "You are a helpful assistant. Do not remind me what I asked you for. Do not apologize. Do not self-reference.";
    $data = array(
        'model' => $model,
        'temperature' => $temperature,
        'max_tokens' => $max_tokens,
        'top_p' => $top_p,
        'frequency_penalty' => $frequency_penalty,
        'presence_penalty' => $presence_penalty,
        'stop' => $stop,
        'stream' => true,
        'messages' => array(
            array(
                'role' => 'system',
                'content' => $system
            ),
            array(
                'role' => 'user',
                'content' => $message
            )
        )
    );
    $headers = array(
        'Content-Type: application/json',
        'Authorization: Bearer ' . $api_key
    );

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // ignored once a WRITEFUNCTION is set; curl_exec() then returns true/false
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($data));
    curl_setopt($ch, CURLOPT_TIMEOUT, 300);        // whole-transfer timeout in seconds
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 300); // connection timeout in seconds
    curl_setopt($ch, CURLOPT_FORBID_REUSE, true);  // boolean options, not seconds
    curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);

    // Use php://memory to store the streamed response
    $response_buffer = fopen('php://memory', 'w+');
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function($ch, $data) use ($response_buffer) {
        // write each streamed chunk to the buffer as it arrives
        fwrite($response_buffer, $data);
        return strlen($data);
    });

    $result = curl_exec($ch);
    if ($result === false) {
        throw new Exception(curl_error($ch), curl_errno($ch));
    }
    curl_close($ch);

    // Get the full response back out of the buffer
    rewind($response_buffer);
    $response = stream_get_contents($response_buffer);
    fclose($response_buffer);
    return $response;
}
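For what it’s worth, a call might look like this (every argument value below is just a placeholder for illustration):

// Example call with placeholder values; the raw SSE stream comes back as one string.
$raw_stream = get_chat_completion(
    getenv('OPENAI_API_KEY'),                         // $api_key
    'gpt-3.5-turbo',                                  // $model
    'Write a 2000-word article about PHP streaming.', // $message
    0.7,                                              // $temperature
    2048,                                             // $max_tokens
    1.0,                                              // $top_p
    0.0,                                              // $frequency_penalty
    0.0,                                              // $presence_penalty
    null                                              // $stop
);
// $raw_stream still contains the raw "data: {...}" lines; see the parsing
// discussion further down for turning it into plain text.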

Hey, I am using curl and CURLOPT_WRITEFUNCTION and getting this result:
data: {"id":"chatcmpl-6we3uupSaAJg7P7RBbw47YvSKIQCG","object":"chat.completion.chunk","created":1679435042,"model":"gpt-4-0314","choices":[{"delta":{"content":" J"},"index":0,"finish_reason":null}]}

Any idea how to put it into an array or valid JSON and get the content?
Thanks a lot, I’ve spent hours on it :frowning:

It looks like you are using streaming

You will get lots of messages in this format

The text you need is in choices[0].delta.content (or similar - depending on your language)

Each one is a single token

You need to join them all together to get the final answer


I am getting the correct result, but the problem is that it’s coming back as a string and I have no idea how to parse it and get the value out of {"content":" J"}. Thanks a lot for the advice.


OK, you need to remove the “data:” prefix from the string

Then you can convert it to a JSON object. The data: prefix is what stops that conversion from working; it is there because you are streaming (it is the server-sent-events framing)

Once you have a JSON object, you can get to the choices array and pick out the delta.content value
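A minimal sketch of those three steps, assuming the whole stream has been buffered into a single string (I call it $raw_stream here; adapt it to however you collect the chunks):

// Turn the buffered SSE stream into the final answer text.
// $raw_stream holds lines like: data: {"choices":[{"delta":{"content":" J"},...}]}
$answer = '';
foreach (explode("\n", $raw_stream) as $line) {
    $line = trim($line);
    if ($line === '' || strpos($line, 'data: ') !== 0) {
        continue; // skip blank lines and anything that is not a data line
    }
    $payload = substr($line, strlen('data: ')); // 1. remove the "data: " prefix
    if ($payload === '[DONE]') {
        break; // end-of-stream sentinel, not JSON
    }
    $chunk = json_decode($payload, true); // 2. decode the JSON chunk
    if (isset($chunk['choices'][0]['delta']['content'])) {
        $answer .= $chunk['choices'][0]['delta']['content']; // 3. append the token
    }
}
echo $answer;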


yeah thanks a lot, I figured it out and it works like a charm!


Thanks.
I have used your solution, but unfortunately I still get the 524 error (cf_gateway_timeout) on every long request after 10 minutes (I changed the timeout settings from 300 to 1200 seconds). Any suggestions?

To be clear, my app doesn’t use Cloudflare; the error is generated by OpenAI’s Cloudflare. Do you have any suggestions?

It’s very disappointing that GPT-4 “supports” 8K-token requests yet fails on almost every request over 3K tokens :confused:

Any news regarding this?

Getting this error whenever I send more than 2000 tokens to the completions API (GPT-4)!


Create a message, send it to RabbitMQ, and run it as a background task on another server that has no relation to Cloudflare at all?
And then use a WebSocket to send the answer back?
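Roughly like this, sketched with php-amqplib (the queue name, payload shape, and credentials are all invented for the example; the consumer and the WebSocket push are not shown):

<?php
// Producer sketch: hand the long-running request to a queue so a worker that is
// not behind Cloudflare can call the API and deliver the answer later.
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('openai_jobs', false, true, false, false); // durable queue

$job = json_encode([
    'job_id' => uniqid('job_', true),
    'prompt' => 'Write a 2000-word article about ...',
]);
$channel->basic_publish(
    new AMQPMessage($job, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]),
    '',            // default exchange
    'openai_jobs'  // routing key = queue name
);

$channel->close();
$connection->close();
// A separate worker consumes 'openai_jobs', calls the OpenAI API with no proxy
// in the way, and pushes the result back over a WebSocket (or stores it for polling).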