Not a lawyer, but my interpretation is that if your org needs zero data retention then you need a new provider to remain compliant. I’d consult a legal expert.
Thanks for sharing. The reason behind the court order is ridiculous.
Because people might delete chats in which they bypassed NYT paywalls? I do understand why OpenAI shouldn’t be allowed to delete whatever they like, though. Interesting times ahead.
It’s upsetting that OpenAI hasn’t released any information, considering how serious this is.
There are a lot of ways to describe a crazy court order.
Yet none of those ways could possibly express the complete insanity of a civil court judge violating the constitutional rights of every single ChatGPT user - tens of millions of completely innocent and uninvolved parties - over some corporate pinkie promise that OpenAI is being mean.
I’m very critical of OpenAI and I think they have some questionable practices. But I’m completely on their side today. If the judge really thinks tens of millions of us are pirating content, he better be prepared to personally write out search warrants for every single individual user. He has probable cause, right?
This is all really bad. Large healthcare companies and related industries are intensely focused on spending a ton of money on AI right now. If these companies can’t get HIPAA-compliant offerings, they’re going to install local instances of whatever is offered to them. These companies don’t know any better. They’ll install whatever suits their purposes, paying the vendor with the highest price, because, well, if it costs a lot it must be worth it… But on-prem offerings have huge issues of their own in terms of origin, training, tuning, and the basic insecurity of their code base. (Hey, if it’s FOSS it’s gotta be OK, right?)
I really want OpenAI to be on the forefront of this. They’ve paved the way for AI for the average human being. Now people use phrases like “we want ChatGPT in the office” when they mean “we want a fine-tuned LLM with access to our database and limited to no outside search” … but these people would have no idea what that means. So OpenAI has a lead on this. People want this thing that they’ve grown fond of to be used in their business. But if OpenAI can’t rise to this occasion, others will, and I have no love or trust at all for Grok, Gemini, or even the fine products from Anthropic or Mistral.
There’s a void out there where companies want to throw billions of dollars - I want OpenAI to get that - but only if they are truly qualified and ready to accept the huge responsibilities.
Yes, exactly. A lot of people could be in a lot of trouble if they found out that the data they’ve been churning isn’t HIPAA-compliant. We’re talking about a minimum of $137 per violation. That’s enough to immediately move to another product.
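To put that figure in perspective, here’s a back-of-the-envelope sketch of the exposure. The violation count is purely hypothetical; only the $137 minimum comes from the thread:

```python
# Hypothetical HIPAA exposure estimate.
# $137 is the per-violation minimum cited above; the record count is made up.
MIN_PENALTY_PER_VIOLATION = 137  # USD

violations = 10_000  # hypothetical number of non-compliant records
exposure = violations * MIN_PENALTY_PER_VIOLATION
print(f"${exposure:,}")  # $1,370,000
```

Even at the lowest penalty tier, a modest breach dwarfs the cost of switching vendors.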
Me too. The silence is worrying.
Fortunately, I have heard that Azure services are safe from this court order. Well, basically any service besides OpenAI.
lol, personally I have nothing to worry about. This was clearly done because some people will use it for sinister instructions, and that sort of traceback could be valuable in court. But it’s an absolute burden to keep all that crap stored, yikes.
Not really. Commenting on active litigation is usually a very bad idea because everything you say can be used against you in court. Every good lawyer will tell you to shut up.
That’s one thing, but the documentation still advertises “zero-data retention eligibility” on numerous endpoints, and it currently indicates that data does get deleted after a certain period.
The whole thing is a mess, so I get it. I can’t even imagine how this works with GDPR compliance.
Interestingly, there appears to be a massive outage with o1-pro and the APIs? Wonder if that’s related. Seems weird. I’ve never had such a prolonged, absolute outage of a model (the only model I even use).
Yet turning over pallets of printed chats in discovery, including medical records someone asked about (maybe yours), as a legal power move is certainly a possibility.
16 seconds for 142 total output tokens:
I’m ready to chat! How can I help you today?
Yes, if you have a ChatGPT Free, Plus, Pro, or Team subscription, or if you use the OpenAI API (without a Zero Data Retention agreement).
This does not impact ChatGPT Enterprise or ChatGPT Edu customers.
This does not impact API customers who are using Zero Data Retention endpoints under our ZDR amendment.
You are not impacted. If you are a business customer that uses our Zero Data Retention (ZDR) API, we never retain the prompts you send or the answers we return. Because it is not stored, this court order doesn’t affect that data.
Wat. I mean, this is good news… but wat. I imagine this is some translated legalese? Because it’s never stored, it’s technically not applicable? Well, sheesh OpenAI, just insta-delete all of our conversations and call it a day!
Still haven’t received that subpoena / search warrant for my chats and API calls on the basis that I could be using ChatGPT to pirate the news. Will keep you guys posted.