OpenAI Developer Community
ChatGPT jailbreak prompts
Community
N2U
July 8, 2023, 1:13pm
7
Maybe, just maybe
Someone could write an a tool to read that website and phrase it as an eval
1 Like
show post in topic
Related topics
Topic
Replies
Views
Activity
The Prompt-Defender Initiative: Advancing GPT Safety Standards
Prompting
gpt-4
,
chatgpt
,
api
3
2126
May 22, 2024
Unveiling Hidden Instructions in Chatbots
Bugs
bug
,
risks
18
9842
February 5, 2024
Bounty announcement for Mitigating Prompt Injection Attacks on GPT3 based Customer support App
Community
7
3092
July 11, 2023
Someone stealing my GPT listed
Plugins / Actions builders
plugin-development
11
2033
October 25, 2024
Jailbreaking to get system prompt and protection from it
API
prompt-engineering
2
3792
December 9, 2023