After the latest update, ChatGPT has been severely dumbed down! It’s almost useless. Output has dropped to fewer than 1,000 tokens, and I cannot get it to produce longer content. If they don’t fix this ASAP, they will lose 90% of their subscribers. That’s insane.
What are you talking about? It’s doing fine. That sounds like a prompting issue, because ChatGPT on my end has seen absolutely no performance degradation whatsoever, and I’m using it ~8 hrs a day.
I don’t believe you. It’s probably that you use it for simple tasks and don’t need large token outputs. I have been using it since day one and it’s totally unacceptable. The context size has been greatly reduced; I cannot get it to return even a 1k-token output. It replies super short and misses at least half the details. Try giving it a big document and asking for a summary. I tested Google’s chat-bison model on a document and it returned 1,800 tokens of summary. Same document and prompt with ChatGPT, and I got 600 tokens. Complete joke.
It’s probably that you use it for simple tasks and you don’t need large token outputs.
LOL
I do not use ChatGPT for “simple tasks”. In fact, the complexity with which I use it (which, btw, I have also been using since day 1) has allowed me to become an active and helpful contributor on this forum to teach others how to write effective prompts. I am not here to argue, I am here to help. I am not trying to shame you, I am simply trying to tell you that your issue is likely due to the way in which you’re prompting the model.
Now, are you going to be argumentative, or do you actually want help? I think these screenshots should be demonstration enough.
I agree that you are an active contributor, but you are definitely not helpful.
You just provided screenshots that proved me right. Your output is 700 tokens. If you were really proficient, you would have checked that.
The average length of outputs was shortened a while ago. Now it will go up to around 1,024 tokens, and a button appears if you want the text to continue.
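For anyone arguing about output lengths: you can sanity-check a reply’s token count yourself instead of eyeballing it. Exact counts require the model’s tokenizer (e.g. OpenAI’s tiktoken library), but a minimal stdlib-only sketch using the commonly cited rule of thumb of roughly 4 characters per token for English text looks like this (`estimate_tokens` is a hypothetical helper name, and the 4-chars-per-token ratio is an approximation, not the real tokenizer):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token
    rule of thumb for English text. For exact counts, run the
    text through the model's actual tokenizer (e.g. tiktoken)."""
    return max(1, round(len(text) / 4))

# Example: a placeholder "summary" of 3,000 characters
summary = "word " * 600
print(estimate_tokens(summary))  # ~750 tokens
```

Pasting the model’s actual reply into something like this gives you a ballpark figure to compare against the ~1,024-token cutoff mentioned above.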