OpenAI Developer Forum
How to avoid GPTs give out it's instruction?
Prompting
gpt-4
polepole
August 30, 2024, 4:00am
19
We do not need same phrases.
Prompto 1
|
Prompto 2
|
Prompto 3
You may visit
THIS topic
show post in topic
Related Topics
Topic
Replies
Views
Activity
There's No Way to Protect Custom GPT Instructions
Community
custom-gpt
54
11056
April 19, 2024
How to Avoid the Prompts/Instructions, Knowledge base, Tools be Accessed by End Users?
Prompting
gpt-4
,
chatgpt
,
hacking
28
8661
April 25, 2024
Basic safeguard against instruction set leaks
Prompting
gpt-4
,
chatgpt
,
bug
,
prompt-engineering
,
gpts
46
7251
March 4, 2024
Unveiling Hidden Instructions in Chatbots
Bugs
bug
,
risks
18
6156
February 5, 2024
How to prevent malicious questions / jailbreak prompts / prompt injection attacks when using API GPT3.5
API
5
4028
March 6, 2023