ChatGPT bug (Oct 4): Chat history off = current conversation session getting wiped after short inactivity

It’s fixed for me. I am able to use it as usual.

2 Likes

Thank you for letting us know! It’s still wiping the conversation for me.

2 Likes

That’s a bit odd, but I hope you all can get a solution for this ASAP.

4 Likes

Someone posted that this has been fixed, but I still see it happening today. Is there any official fix or statement released by OpenAI?

2 Likes

I tested it again, and I was hopeful, but it still cleared itself after all.

1 Like

This is the reply I got from support. At this point I’m not sure whether this is intentional, to force everyone to turn history on, or whether it’s a bug:

Hello,

Thank you for reaching out to OpenAI support about your issue with chats occasionally being cleared.

If your chat history is disabled, all unsaved chats will be deleted from our systems. If you want to retain your chat history, you may follow these steps:

  1. Go to your ChatGPT settings
  2. Select “Data Controls”
  3. Turn on “Chat history & training” by moving the slider.

Please be aware that this setting does not synchronize across different browsers or devices. You will need to enable it each time you log in on a new browser or device to ensure that all your chat history is being saved.

Hope this helps.

Best,
Hen
OpenAI Support

3 Likes

:neutral_face: Wow…just wow! This reply does not seem to address your question, and it’s just telling us to turn our history back on.

4 Likes

Thank you so much! I really appreciate your help and expertise, @laszo and @_j! Your advice means a lot.

I hope the bug will be fixed soon, but if not, I might consider using scripts. Thank you again!

2 Likes

Yeah, this doesn’t address the problem at all for the use case where I can’t have chat history or training turned on and want to have a conversation longer than a few minutes.

5 Likes

On my third bug report in the help center I got something new; at least there is some sense here:

Hello,
Thank you for reaching out to OpenAI support.
We genuinely appreciate the detailed information you’ve provided regarding the issue you’re experiencing with ChatGPT. We understand how crucial it is for users to have a seamless experience, and we apologize for any inconvenience this may have caused.
To address the issue, we recommend enabling the chat history from your end. This will prevent the chat from automatically closing after periods of inactivity. Once you’ve enabled the chat history, you can still opt out of our use of your data to improve our services by filling out this form: Opt-Out Form. Upon submission, new conversations will not be used to train our models.
For a more comprehensive understanding and to address any future concerns, you might find our Data Controls FAQ helpful.
Thank you for bringing this to our attention, and we’re committed to assisting you until the issue is fully resolved.
Best,
Ian
OpenAI Support

Added links from the response.

4 Likes

EDIT: I asked ChatGPT:
I have a question about this opt-out form. What does this mean? “Please note that in some cases this will limit the ability of our models to better address your specific use case.”

ChatGPT says:
The statement “Please note that in some cases this will limit the ability of our models to better address your specific use case” means that if you choose to opt out of having your data used to improve the AI models like ChatGPT or DALL-E, it could potentially reduce the ability of these models to perform well for your particular needs. In other words, by opting out, the AI models may not be as effective or accurate in addressing the specific tasks or problems you use them for because they won’t have the benefit of learning from your data and experiences.

I asked:
Oh really? Why not?

ChatGPT says:
The reason opting out of data usage may limit the ability of AI models like ChatGPT or DALL-E to address your specific use case is that these models rely on large amounts of data to improve their performance. When users interact with these AI models and provide data through their interactions, it helps the models understand and learn from various contexts, languages, and problems.

By opting out and not allowing your data to be used for model improvement, you are essentially withholding valuable information that could enhance the model’s capabilities. The AI models won’t have the benefit of learning from the diverse inputs and experiences of users who choose to participate, which can limit their ability to adapt and provide more accurate responses to specific use cases.

In essence, your data contributes to the continuous improvement of the AI models, making them better at understanding and addressing a wide range of topics and tasks. When you opt out, you are opting out of this collective learning process, which can affect the model’s ability to excel in your specific context.

I asked:
If we opt out of privacy reasons, are you going to suddenly be less effective? The form says “Please note that in some cases this will limit the ability of our models to better address your specific use case.” That is kind of scary.

ChatGPT says:
I understand your concern, and it’s essential to clarify that opting out of data usage for privacy reasons does not mean that AI models like ChatGPT will suddenly become less effective for privacy-conscious users. The statement you mentioned is intended to convey that when you opt out, you won’t be contributing your specific data to help improve the model’s performance. However, it doesn’t mean that the model will perform poorly or become less effective overall.

AI models like ChatGPT are designed to provide valuable assistance and information to users while respecting their privacy preferences. If you choose to opt out for privacy reasons, the model will still function and provide responses based on its pre-trained knowledge and capabilities. It won’t become less effective in the sense of not working at all.

The aim is to give users the option to control how their data is used, but it’s not meant to deter users from using the service. You can still expect reasonable performance from the model while protecting your privacy preferences by opting out.

2 Likes

It’s a severe (hopefully unintended) bug that limits the usability of the application massively, to the point of making it worthless. I don’t think the focus should be on workarounds or an opt-out form when all that’s needed is for the “chat history off” feature to work as expected; after all, most here are paying users. I guess the best course for now is for everyone encountering the issue to contact customer support and demand the restoration of the previous 6-hour limit.

6 Likes

I’m planning to cancel my PLUS account for this month. If they fix this, I’ll come back. We accomplished a lot with ChatGPT :smiling_face_with_tear:

2 Likes

So sorry to hear that people are still facing this issue. I don’t know why, but it’s fixed for me.

3 Likes

Thanks for the TamperMonkey fix!! Works great!

I reported a bug, but the answer is very frustrating, since they call it a “feature” and not a bug:

Hello,

Thank you for getting back to us.

We apologize for any inconvenience, but in cases where your chat history is disabled, the prompt will be automatically deleted after a few minutes or upon refreshing the website. We value your feedback on this feature and have made note of your comments for internal review.

Please feel free to reach out to us if you have any further inquiries or issues.

7 Likes

I also had the AI make up words.

1 Like

Wow! :dizzy_face: :face_with_spiral_eyes: :woozy_face: :open_mouth:Thank you for sharing this response with us! It’s pretty frustrating that the company is trying to pass this off as a feature when it’s clearly a bug causing usability problems. :sweat_smile:

3 Likes

I have the same issue. Furthermore, it gets deleted when I scroll down.

1 Like

That should go down in history, together with “Apple battery degradation.”

3 Likes

I expect there was some change made with good intentions, like “we need to regularly refresh conversations so the conversation bar is accurate, even with multiple logins, and so that people can detect account intrusions” …and then it wasn’t tested against all the use cases.

1 Like