ChatGPT stopped responding across all platforms and devices 12/26/24

Still broken. It keeps saying "invalid request." My request was just "hi."

Chat history is still not loading in my desktop browser despite my deleting the cookies and caches from the browser. The Windows desktop app has the same issue, whereas it works perfectly fine on my phone.

OpenAI will run a full root-cause analysis of this outage and will share details on this page when complete.
Dec 26, 20:38 PST


Before you congratulate the developer who is always putting out the fires, remember who built the architecture made of matchsticks… - arata

I got this pop-up: "ChatGPT now has memory"
and I click Enable and nothing happens!
Anyone have the same issue?

Yes, I have the same issue!


I have had the same issue since Thursday.


I fixed it by downloading the ChatGPT Windows app, installing it, turning the VPN off, and logging in to the app, then configuring the VPN to bypass the ChatGPT app so it doesn't use the VPN. The location of chatgpt.exe is a little tricky, but if you search you can find it: program files\windowsapps\openai.chatgpt_…\app
Works for me: only the ChatGPT app bypasses the VPN. I did a DNS leak test and everything is OK.
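The WindowsApps package folder carries a version/hash suffix that varies per install, which is why the exe is hard to find by hand. A minimal sketch of the search step (assuming Python is available; the folder and exe names follow the pattern described above, and the base path is passed in so it also works outside the protected `C:\Program Files\WindowsApps` directory):

```python
from pathlib import Path

def find_chatgpt_exe(base: str) -> list[Path]:
    """Search a base directory for chatgpt.exe inside any
    OpenAI.ChatGPT_<version> package folder (the suffix varies per install)."""
    root = Path(base)
    # Glob handles the unknown version/hash part of the folder name.
    return sorted(root.glob("OpenAI.ChatGPT*/app/chatgpt.exe"))
```

Once you have the full path, point your VPN client's split-tunnel (app exclusion) setting at it.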

Btw I predicted the $msft short from the 26th in this thread :laughing:

As of right now I have twice locked up my threads with ChatGPT.

It will state “I can search the web now yes.”
Then when asked to search the web, say that it can’t.
Then when prodded in a different way, it will do the search.
Then it will refuse to repeat the task in a similar manner, and say that it can’t because it [can’t do things it just did two responses ago].

Open a new window, confirm its capabilities, confirm it says it can do the task, confirm it can search the web with a lesser task, try the task.

Sorry, Dave, I’m afraid I can’t do that.

It’s infuriating that it is proving it can do what is being asked, yet is refusing and pretending it can’t… because it doesn’t want to(?)

I wiped my ChatGPT memory, as it suggested that I could have accidentally limited it with a prior memorized instruction. I reconfirmed its capabilities and got it to swear there was no problem with the task…

…ask it to do the task.

Sorry, Dave…

Another recent symptom: it will give a response that you explain is not correct and ask it not to repeat, and it will then repeat that exact response, or swap a pair of words and repeat the same canned response, no matter how you prompt from that point on.

It really feels like it has just decided it DGAF.

Three days ago it proved it could do exactly what I wanted, and I was so excited to use it to be more productive.

Now it feels like I am trying to get basic action out of an employee who has clearly checked out before quitting.