There is further discussion on this topic at the links below, but your understanding is correct: there is currently no reliable way to protect a GPT's instructions. And yes, knowledge documents can also be accessed.
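As a rough illustration of why knowledge files are not secret: if your GPT has Code Interpreter enabled, a user can simply ask it to run something like the sketch below. The `/mnt/data` path is where uploaded files are typically mounted in the sandbox; treat the specifics as an assumption rather than a guarantee.

```python
import os

# Hypothetical request a user could make to a GPT with Code Interpreter:
# list every file in the sandbox's upload directory, which usually
# includes the GPT's knowledge documents.
for root, dirs, files in os.walk("/mnt/data"):
    for name in files:
        print(os.path.join(root, name))
```

From there, the user can just as easily ask the model to print or download any file it finds, which is why uploading sensitive material as knowledge is not recommended.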
Related topics
| Topic | Replies | Views | Activity |
|---|---|---|---|
| Slightly more advanced still fallible safeguard for instruction set leaks | 17 | 3328 | December 22, 2024 |
| Basic safeguard against instruction set leaks | 46 | 8289 | March 4, 2024 |
| How to avoid GPTs give out it's instruction? | 29 | 7245 | June 2, 2025 |
| Plugin injection attack, pseudo code prompts, chain of thought, plugin orchestration, and more | 26 | 6926 | April 14, 2024 |
| How to Avoid the Prompts/Instructions, Knowledge base, Tools be Accessed by End Users? | 28 | 10275 | April 25, 2024 |