How Do I Prevent End Users from Accessing a GPT's Prompts/Instructions, Knowledge Base, and Tools?

Thanks, interesting find, and appreciate the testing. So this remains an open door. I have not instructed the GPT to ‘only provide interpreted information’ from the document. BTW, love the timestamp from before the GPT-4 release :wink:

Thanks, that would be it. Appreciate the testing and results. If this isn’t locked down properly, the future of GPT commercialization remains uncertain. If everyone can simply copy-paste the instructions and knowledge documents, why keep using someone else’s GPT? Hmm… this screen door must be replaced with something more solid before the submarine submerges :wink:

For anyone developing with the API, an interesting additional layer of security is to pass the entire completion into a second chat, with a prompt asking the model to rate the probability that the discussion is staying on track or that the user is attacking the system. I’ve tried a couple of variations of this and it works pretty well. Essentially you’re writing an observer that checks on the chat. If the observer notices a problem, your app can end the chat rather than return the potentially unwanted content to the user.

This is different from using the Moderation API, but it is a similar concept.

In the GPT email newsletter (1 December 2023), it is more or less confirmed that files are public and that this is the currently intended functionality:

> Uploaded files are downloadable when using Code Interpreter so we’ve made this feature default off and added messaging to better explain this.