Best practices for user auth safety with custom GPTs?

Hey hey again everyone!

So, it seems people are posting GPT builder-related questions all over the forums here (might be time to give them their own section soon). I figured I’d ask this question in this section because it’s broader in scope, but it’s nonetheless important.

My question is simple: is it safe and/or advisable to let a custom GPT prompt the user for some kind of auth credential so it can be included in an API call on their behalf?

The other thing I don’t fully understand is whether “Authentication” in the Actions builder is meant for us, the developers, or for the end user, and whether it’s what actually sets up a working authentication method for them.

I’m really excited about the tool I’m building with GPT builder, and I think others will be too. However, I’m nervous about how to approach any kind of user auth when it’s needed for certain actions I’m building. I don’t want to prompt the user for any of that information if A. that’s unsafe and/or not recommended, and B. people won’t trust it even if it is safe.

What are y’all doing to handle this? Is anyone else working on something with this kind of issue?

Tbh, that may well end up being my solution, but I’d like some peace of mind or acknowledgement that others are solving it the same way. Any help here is greatly appreciated! Thanks everyone!


I’m in the same boat and trying to solve the same problem.


So, while we wait for more docs to come out about custom GPTs specifically, I did discover that a lot of the plugin docs translate pretty well to this kind of work in GPT builder.
I looked at this doc in particular: https://platform.openai.com/docs/plugins/authentication

Is this a perfect guide? No. But it’s the best we have to work with atm.

Hopefully this helps you and other people who see this post!