When I go over 2500 output tokens with GPT-4 I get a 'Remote end closed connection without response' error

Hi, when calling the GPT-4 API, my requests succeed as long as the output is at or under 2500 tokens; anything longer fails with this error:

```
File "C:\Python311\Lib\site-packages\requests\adapters.py", line 489, in send
  resp = conn.urlopen(
         ^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\urllib3\connectionpool.py", line 787, in urlopen
  retries = retries.increment(
            ^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\urllib3\util\retry.py", line 550, in increment
  raise six.reraise(type(error), error, _stacktrace)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\urllib3\packages\six.py", line 769, in reraise
  raise value.with_traceback(tb)
File "C:\Python311\Lib\site-packages\urllib3\connectionpool.py", line 703, in urlopen
  httplib_response = self._make_request(
                     ^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\urllib3\connectionpool.py", line 449, in _make_request
  six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "C:\Python311\Lib\site-packages\urllib3\connectionpool.py", line 444, in _make_request
  httplib_response = conn.getresponse()
                     ^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\http\client.py", line 1374, in getresponse
  response.begin()
File "C:\Python311\Lib\http\client.py", line 318, in begin
  version, status, reason = self._read_status()
                            ^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\http\client.py", line 287, in _read_status
  raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
```

I’m at ~5,000 tokens between the prompt and completion, so well under GPT-4’s 8k context limit. Is there a known reason for this?
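For reference, a simplified sketch of the kind of call that hits this, using the pre-1.0 `openai` Python package (which is what the requests-based traceback suggests); the prompt and `max_tokens` value here are placeholders:

```python
import openai

openai.api_key = "sk-..."  # placeholder

# Placeholder prompt; the real one is roughly 2.5k tokens.
messages = [{"role": "user", "content": "...long prompt..."}]

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=messages,
    max_tokens=3000,  # anything that yields a completion over ~2500 tokens fails
)
print(response["choices"][0]["message"]["content"])
```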

Thanks!


Bumping this, as I’d love to hear whether anyone else has had this issue and found a solution. The additional tokens don’t count for much if we can’t use them!


I have the same problem. As the number of total tokens (input + output) increases, the following error seems to happen more and more frequently:

Error communicating with OpenAI: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

I’ve had this problem for at least two weeks. It happens especially when my input is long (~2k tokens) and the output is expected to be ~2k tokens as well. With gpt-3.5-turbo the same calls work without errors.
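One workaround worth trying (not a confirmed fix, just a common mitigation when long-running requests get dropped) is to stream the completion so the connection isn’t held open and idle while the full response is generated, and to retry when the connection error occurs. A rough sketch, assuming the pre-1.0 `openai` package that the traceback above points to; the retry settings and the 600-second timeout are just illustrative values:

```python
import openai
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

openai.api_key = "sk-..."  # placeholder

@retry(
    # "Error communicating with OpenAI" is raised as APIConnectionError,
    # so retry only on that.
    retry=retry_if_exception_type(openai.error.APIConnectionError),
    stop=stop_after_attempt(3),
    wait=wait_exponential(min=1, max=20),
)
def long_completion(messages, max_tokens=2000):
    """Stream a long GPT-4 completion and reassemble it client-side."""
    # stream=True delivers the completion in small chunks, so the HTTP
    # connection is never left idle for minutes waiting on the full response.
    stream = openai.ChatCompletion.create(
        model="gpt-4",
        messages=messages,
        max_tokens=max_tokens,
        stream=True,
        request_timeout=600,  # generous per-attempt timeout, in seconds
    )
    parts = []
    for chunk in stream:
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # first/last chunks may carry no content
    return "".join(parts)

# Example usage with a long prompt:
# text = long_completion([{"role": "user", "content": "...~2k-token prompt..."}])
```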

Hi, I think I finally resolved this. Can you share the code snippet you’re using to save your JSON file?