GPT-4 - Incomplete and Partial Responses

GPT-4 is producing short and incomplete responses. This has been consistent and ongoing for over a week now. This is a paid service and should provide much more value than 20 requests that each output no more than 3 to 4 paragraphs. After running some tests, only about 200 tokens are being produced per reply.


Another example:

Replies are never more than 200 tokens…

Hi @alexbelotsky, welcome to the forum!

To increase the length of the response, you can try adding something like "reply with X paragraphs of text", "reply with X words", or "give me a detailed response". The model tries to answer requests as concisely as possible because that is what most people want. So if you tell the model you want a longer answer, you'll receive one in most cases.

Secondly, GPT-4 is not really what you buy when you buy ChatGPT Plus; GPT-4 is more like an add-on. In my experience, GPT-3 performs sufficiently for most use cases :slight_smile:

Thank you for the reply. The issue is that the output is being cut off mid-word, which was not the case a week ago. See the examples below: not only is the length short compared to a week ago, but the responses are consistently incomplete.



Compare that to GPT-4 responses from a week ago. Current replies are much shorter and almost always incomplete.



Ah, I have an idea why this might be the case :slight_smile:

Are you using the same chat all the time? I ask because, at least for GPT-3, the whole content of the conversation is re-submitted whenever you post a new message. So the deterioration in performance might be related to an increase in the computing power needed: the more content that is submitted, the harder it becomes to answer the messages.

So you could try starting a new conversation and see whether the issue persists.


Here is an example of the issue persisting with a brand-new prompt. Note the cut-off at the very end. GPT-3 does not seem to exhibit the same issue. I have tried multiple browsers and multiple machines over the course of the last 4 or 5 days, all with the same partial results.

Some more information from the same session: roughly 200 tokens seems to be the limit.
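For anyone who wants to reproduce this kind of measurement without extra tooling, here is a minimal sketch using the common rule of thumb that English text averages about 4 characters per token. This is only an approximation; for exact counts you would need a real tokenizer such as OpenAI's tiktoken library. The function name and the sample text are my own, not from the screenshots above.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the common ~4-characters-per-token
    heuristic for English text. Only an approximation; a real tokenizer
    (e.g. tiktoken) is needed for exact counts."""
    return max(1, round(len(text) / 4))

# A pasted reply of roughly 800 characters estimates to ~200 tokens,
# which would match the cutoff being reported in this thread.
sample_reply = "word " * 160  # 800 characters of placeholder text
print(estimate_tokens(sample_reply))  # → 200
```

Pasting a truncated ChatGPT reply into `sample_reply` and comparing the estimate across several cut-off responses is enough to see whether they all cluster around the same limit.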


Thank you for sharing and testing @alexbelotsky, quite puzzling - if I find an answer somewhere I'll let you know!

If this were with the API we could have tried some further steps, but the GUI is quite limited in what you can do to identify such issues…
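As a sketch of what such an API-side check could look like: the Chat Completions API reports a `finish_reason` for each choice, `"length"` when the reply hit the token limit and `"stop"` when the model finished naturally. The helper below inspects that field on a response payload; the simulated dictionaries stand in for real API responses, so no API key or network call is involved.

```python
def is_truncated(response: dict) -> bool:
    """Return True if the first choice was cut off by the token limit.
    Chat Completions responses set finish_reason == "length" when the
    reply ran into max_tokens, and "stop" on a natural ending."""
    return response["choices"][0]["finish_reason"] == "length"

# Simulated payloads in the shape the API returns:
cut_off = {"choices": [{"finish_reason": "length"}]}
complete = {"choices": [{"finish_reason": "stop"}]}
print(is_truncated(cut_off), is_truncated(complete))  # → True False
```

In the ChatGPT GUI this field is hidden, which is exactly why a web user cannot tell whether the cutoff comes from a token limit or from something else.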

I have been facing this issue for the past 2 days. I've tried new chats, but the problem persists. Sometimes I will come back 15 minutes later and the answer will be finished; other times it's perma-frozen. Now I just reply with "continue" and hope it doesn't take 2 of my 25 allotted questions :smiling_face_with_tear:

I have figured out a way to get complete responses if other methods fail. Type this after you get an incomplete response:

What comes after “copy the last line of output and put it in quotes here”? Your response was incomplete.

It's that simple; that's all you need to do. You may get cut off again if it's a very long response, but keep repeating the same process and you will eventually get everything you need.