Is this deliberately ambiguous?

This does not state which law you’re complying with. Are you complying with the GDPR, or with the court’s legal mandate? It also does not answer the question itself, which is a simple yes/no and would have settled the matter.

Does this court order violate GDPR or my rights under European or other privacy laws?

We are taking steps to comply at this time because we must follow the law, but The New York Times’ demand does not align with our privacy standards. That is why we’re challenging it.

They are complying with the US court order and sacrificing GDPR compliance.

It seems quite vague. I’m not sure I could go so far as to claim it was deliberately ambiguous, but I feel like they could’ve delved deeper into it.

I’m surprised they said anything about it at all. It’s a lose-lose situation for them because they have to break one law to comply with the other.

If that’s true, it tells you everything you need to know about their commitment to protecting privacy.

OpenAI could easily strip out the data of customers in the EU/UK to whom the GDPR applies, and realistically, if that data is held in an EU data centre, the court has no jurisdiction over it.

No mention of whether they would do that.

I can see why they complied with the court: to avoid contempt, to avoid prejudicing their case, and to show they act in good faith.

That doesn’t stop it leaving a bad taste in my mouth that they’re selling out rather than mounting a proper defence, particularly with vetted PR statements that hide facts and stay vague and evasive.

In fairness, they’re based in the US. The US can raid their corporate office; the EU can’t. So naturally a US court order trumps overseas regulation. Literally. 🙂

If this didn’t apply to Europe then OpenAI would have gladly said so. So I think it’s fair for us to deduce that they’re keeping European data too.

They can keep European data in a European datacenter to help with overseas compliance, but domestic compliance will always take priority.

If they didn’t want to answer the question, I don’t understand why they didn’t simply omit it. We know that “all” ChatGPT users and API calls literally means “all”, with only the given exceptions. We also know that this violates the GDPR, which doesn’t recognize foreign court orders. So it really wasn’t necessary for OpenAI to write such an ambiguous answer.

If they really believed the court order was illegal, they wouldn’t have complied with it. Their statements seem to be just for show, and appealing is a delaying tactic they would have pursued anyway.

In any case, I think the court needs to decide who it’s ordering here. If the order is for OpenAI’s own data, then its customers are completely uninvolved and would need to be subpoenaed separately. If it’s an order against every individual user, then it can’t reach European users, over whom the court has no jurisdiction, so it shouldn’t impact European data.

US courts trying to play god and “have it both ways” is nothing new. But an AI company standing for privacy certainly would be.

Does this apply based on EU citizenship, or based on services offered, originating, or purchased within the GDPR zone? If the latter, then it’s enforceable everywhere, as far as I know.