GPT-4 - Incomplete and Partial Responses

GPT-4 is producing short and incomplete responses. This has been consistent and ongoing for over a week now. This is a paid service and should provide much more value than 20 requests that each output no more than 3 to 4 paragraphs. After running some tests, only about 200 tokens are being produced per reply.

(screenshot)

Another example:

Replies are never more than 200 tokens…

Hi @alexbelotsky, welcome to the forum!

To increase the length of a response, you can try adding an instruction like "reply with X paragraphs of text", "reply with X words", or "give me a detailed response". The model tries to answer requests as concisely as possible because that is what most people want, so if you tell the model you want a longer answer, you'll get one in most cases.

Secondly, GPT-4 is not the core of what you buy with ChatGPT Plus; it is more like an add-on. In my experience, GPT-3 performs sufficiently for most use cases :slight_smile:

Thank you for the reply. The issue is that the output is being cut off mid-word. This was not the case a week ago. See the examples below: not only are responses much shorter than a week ago, they are almost always incomplete.



Compare that to GPT-4's responses from a week ago. Current replies are much shorter and almost always incomplete.
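For what it's worth, a reply that stops mid-word is the typical symptom of a hard token cap rather than the model choosing to be brief (the API's `max_tokens` parameter behaves this way, ending the completion with `finish_reason: "length"`; whether the ChatGPT web UI applies a similar cap is an assumption here). A minimal sketch of the effect, using a rough 4-characters-per-token approximation instead of the real tokenizer:

```python
def truncate_to_tokens(text: str, max_tokens: int, chars_per_token: int = 4) -> str:
    """Crudely cap a reply at max_tokens, assuming ~4 characters per token
    (OpenAI's rule of thumb; the real tokenizer differs)."""
    return text[: max_tokens * chars_per_token]

reply = "Here is a detailed explanation of the topic you asked about today"
# A cap of 5 tokens cuts the string after ~20 characters, mid-word,
# which matches the truncated replies shown in the screenshots.
print(truncate_to_tokens(reply, max_tokens=5))
```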



Ah, I have an idea why this might be the case :slight_smile:

Are you using the same chat the whole time? I ask because, for GPT-3, the whole content of the conversation was re-submitted every time you posted a new message. So the deterioration in performance might be related to an increase in required computing power: the more content is submitted, the harder it becomes to answer the messages.
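To illustrate the point above: if the entire history really is re-sent with every new message, the total number of tokens submitted grows roughly quadratically with the length of the chat. A small sketch (the per-message token count is a made-up figure for illustration):

```python
def total_tokens_submitted(per_message_tokens: int, num_messages: int) -> int:
    """Total tokens sent to the model over a chat where each new message
    re-submits the full conversation history."""
    total = 0
    history = 0
    for _ in range(num_messages):
        history += per_message_tokens  # the new message joins the history
        total += history               # the entire history is re-submitted
    return total

# 10 messages of ~100 tokens each: 100 + 200 + ... + 1000 tokens submitted
print(total_tokens_submitted(100, 10))
```

Starting a fresh conversation resets the history to zero, which is why it's worth trying.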

So you could try starting a new conversation and see whether the issue persists.