Custom GPTs are now completely unusable

I’ve been developing Custom GPTs for several months, and my experience has gone from impressive and exciting to disheartening and frustrating. This is really just a rant because I don’t expect anything to get better. It seems like OpenAI is consistently reducing the compute and capabilities given to Custom GPTs, resulting in an intolerable level of inconsistency and poor performance. I have several GPTs that I’ve been on the edge of making public, only to backtrack after performing final testing and refinement. These GPTs used to be roughly 90% reliable and performant but are now at 30–50%, even after simplifying rather than enhancing their system instructions. I will probably switch my efforts to using the API more heavily, but the loss of time and money is depressing. I got locked into a year-long subscription for a ChatGPT Team account that seems like a total waste at this point.


I reported a similar issue and I was told that they are aware and they are working towards a solution.


I support your rant and share in your frustration. In the early months our GPTs were able to perform ANYTHING we asked. Since the release it has become watered down, and the trending ones are no-effort ideas that barely work as well as those with 100–1000 uses. @SamAltman, can you bring back the way it was around release? Nah? All good…


I am sure they will fix the problem. I have trust in the team at OpenAI. Let’s be patient.


Check out my profile’s stickied topic. After some troubleshooting I was able to fix it. Hope it helps!

Thanks for your response. Your troubleshooting guide looks like a good resource to use after periodic system issues. However, the issue I am experiencing is a persistent and universal (across all of my Custom GPTs) reduction in reliability and performance. It is very clear to me that ChatGPT, and more specifically Custom GPTs, have been receiving less compute since mid-to-late January. OpenAI’s focus is not on its currently released products or customers. They have limited compute available and are trying to juggle it while keeping customers minimally appeased. I’m not wasting time being gaslit any longer. The lack of visibility into the products and services we are paying for is not going to improve.


The change is caused by:

  1. Policy: after the launch of GPTs, OpenAI encountered pressure from many parts of society, such as a lawsuit from NY, which forced it to change the AI’s behavior.
  2. Real-time learning: GPT is now widely available and uses RLHF to quickly learn various behaviors. But learning this way has drawbacks, such as not following specific usage instructions.

I mailed these issues to OpenAI and they are currently working on a solution.

I don’t know how OpenAI is going to solve this problem because they don’t like to say anything publicly about anything… but definitely we are in a similar situation as you are.

This is true not only for GPTs but also for ChatGPT, and to a smaller extent the API, including GitHub Copilot… It is frustrating for me because I don’t have any way to switch to another platform for now… I wish that one day the competitors will be equivalent or better, so that we can just move to another platform and make OpenAI wake up and make the improvements necessary to get everyone happy, satisfied, and impressed… Well, clickbait people are still impressed with this, but from what I understand it’s just smoke and mirrors…