ChatGPT is pushing away its builder base, one update at a time

OpenAI

THIS IS ABSOLUTELY RIDICULOUS! YOU ALL DID AN UPDATE? This new update isn't letting me provide product information to any of my GPTs, in any industry. I'm building and enhancing medical-field GPTs I created so that doctors, parents, patients, and students can stay informed about products that could help them. The goal is to give them the tools they need to improve their work, their studies, or their patients' lives, with up-to-date, detailed information about products on the market, so that instead of doing a plain Google search they can talk with the GPT and find the best options for their specific industry. BUT NOW I CAN'T UPDATE WITHOUT APPEALING, AND I CAN'T EVEN SUBMIT AN APPEAL! This is getting out of hand with how strict you're being about what information our GPTs are allowed to provide to users. I'm looking at moving to another platform.

You all are killing ChatGPT one update at a time.

According to the Terms of Service:

  • You must not use any Output relating to a person for any purpose that could have a legal or material impact on that person, such as making credit, educational, employment, housing, insurance, legal, medical, or other important decisions about them.

Thus, providing medical advice goes against the Terms of Service, and that is likely why your GPT was removed.

The appeals process will hopefully be launched shortly.

Personally, I can see why getting medical advice from Google is a bad idea. Getting medical advice from a GPT is a horrible one!
This kind of information can have serious negative impacts, given the known limitations of the models, and it should always be checked by an expert first.

It's not giving advice; it's searching the internet for information from doctors, more advanced than just WebMD, and giving people tools and options to choose from, with all the details outlined from the sources it pulls them from. It's basically just an assistant. But this is happening to all my GPTs, not just the medical-assistant ones.