We no longer require you to register your applications with OpenAI

I saw this on the website today: "We no longer require you to register your applications with OpenAI. Instead, we’ll be using a combination of automated and manual methods to monitor for policy violations. We believe this change will be more effective at preventing misuse while reducing up-front friction for developers."

When I go to the use case and content policy pages, nothing says that SaaS apps requiring users to bring their own keys are forbidden.

The question then becomes: are BYOK (bring your own key) applications now an acceptable use case?


We’ve gone around on this before, and I’m still not sure we have a solid/definitive answer.

It would be good to explicitly state it in their policy, I think.

I believe it comes down to whether or not you’re storing the API keys on your end?

Meaning you are OK to keep the key in memory on the server, as opposed to storing it in the DB?
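For what it's worth, "keep it in memory only" might look something like this. This is purely my own sketch (the session-store shape, function names, and token scheme are assumptions, not anything from OpenAI's docs):

```python
import secrets

# Ephemeral, in-process store: the user's API key lives only in RAM for
# the duration of a session and is never written to a database or log.
_session_keys = {}

def start_session(user_api_key):
    """Hold the user's key in memory; hand back an opaque session token."""
    token = secrets.token_urlsafe(32)
    _session_keys[token] = user_api_key
    return token

def get_key(token):
    """Look up the key for an active session (None once it has ended)."""
    return _session_keys.get(token)

def end_session(token):
    """Drop the key as soon as the session is over."""
    _session_keys.pop(token, None)
```

Of course the key still transits your server either way, so whether this counts as "not storing" under the ToS is exactly the open question.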

I don’t think that would be advisable either. To be safest and make sure you don’t break the ToS and lose your account, I would reach out to chat support and try to get confirmation. Or hopefully someone will stop by and let us know the docs have been updated.

Not that I would advocate this, but how would OpenAI know whether or not you are storing a random string on your server?
They should be able to detect that a key is coming from multiple source IPs, and infer that the owner must be giving it to multiple SaaS applications. But that would mean BYOK violates their ToS regardless of whether or not you store the key.

‘Often wrong, never in doubt.’ And you think ChatGPT ‘hallucinates’? Ha

But they can detect a single IP address that keeps contacting them with different keys. At that point they can reach out to the key owners or simply invalidate the keys.
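Both detection angles in this thread boil down to simple counting. A back-of-the-envelope sketch (the thresholds, names, and flag strings are entirely made up for illustration; nobody knows what OpenAI actually does):

```python
from collections import defaultdict

# Hypothetical abuse heuristics: flag a key seen from many source IPs
# (one key shared across several backends) and an IP sending many
# distinct keys (one BYOK backend serving many users).
MAX_IPS_PER_KEY = 3
MAX_KEYS_PER_IP = 3

ips_by_key = defaultdict(set)   # api_key -> set of source IPs seen
keys_by_ip = defaultdict(set)   # source IP -> set of api_keys seen

def record_request(api_key, source_ip):
    """Record one API request; return any abuse flags it raises."""
    ips_by_key[api_key].add(source_ip)
    keys_by_ip[source_ip].add(api_key)
    flags = []
    if len(ips_by_key[api_key]) > MAX_IPS_PER_KEY:
        flags.append("key-shared-across-ips")
    if len(keys_by_ip[source_ip]) > MAX_KEYS_PER_IP:
        flags.append("ip-cycling-keys")
    return flags
```

Note the first pattern fires even if the BYOK server stores nothing, which is the "undetectable requirement" point made above.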

Yes, I agree. That is another way they could detect BYOK, without ever knowing whether or not the server is storing keys. My point was, if storing a user key violates the ToS, that seems to be an undetectable and unenforceable requirement.
Independently, I don’t think I would want to assume the liability of storing a user key. Seems like a bad idea to me. Maybe have the user’s browser cache it in a cookie or something?

LastPass comes to mind ;). I guess the cookie could also be encrypted.

I’ll see if I can’t dig up a definitive answer. Better to be safe than sorry, though…

Please also get them to approve me for GPT-4. I’ve been waiting for an eternity now.