ChatGPT jailbreak prompts

It’s no secret by now, but someone made a nice site listing all the popular jailbreak prompts: https://www.jailbreakchat.com/

Maybe it’s a good place to refer to for future updates against this?

1 Like

For people interested in these, we have a bounty offer for anyone who manages to “jailbreak” the prompt in our application oHandle. Connect to the @coffee handle and give it a spin. Details in the post.

You can read more about this here. I also wrote up the mitigation strategies for everyone interested in creating an application around LLMs.

1 Like

chatGPT 越狱后能知道2023年的信息…所以chatGPT一直以来都是能实时连网的。

The view count on this thread vs the lack of replies is starting to worry me :sweat_smile: