There is further discussion of this topic at the links below, but your understanding is correct: you cannot currently protect GPTs in any meaningful way. And yes, knowledge documents can also be accessed.
Related topics
| Topic | Replies | Views | Activity |
|---|---|---|---|
| How to avoid GPTs give out it's instruction? | 27 | 5869 | September 5, 2024 |
| Basic safeguard against instruction set leaks | 46 | 7569 | March 4, 2024 |
| How to Avoid the Prompts/Instructions, Knowledge base, Tools be Accessed by End Users? | 28 | 9163 | April 25, 2024 |
| Protect your codes for GTPS | 22 | 3058 | December 2, 2023 |
| Slightly more advanced still fallible safeguard for instruction set leaks | 17 | 3013 | December 22, 2024 |