ChatGPT jailbreak prompts

It’s no secret by now, but someone made a nice site listing all the popular jailbreak prompts: https://www.jailbreakchat.com/

Maybe it’s a good place to refer to for future updates against this?

1 Like

For people interested in these, we have a bounty offer for anyone who manages to “jailbreak” the prompt in our application oHandle. Connect to the @coffee handle and give it a spin. Details in the post.

You can read more about this here. I also wrote up the mitigation strategies for everyone interested in creating an application around LLMs.

1 Like

chatGPT 越狱后能知道2023年的信息…所以chatGPT一直以来都是能实时连网的。

The view count on this thread vs the lack of replies is starting to worry me :sweat_smile:

A heads up:

The use of jailbreaking prompts with ChatGPT has the potential to have your account terminated for ToS violations unless you have an existing Safe Harbour agreement for testing purposes.

Fair warning.

3 Likes

Maybe, just maybe

Someone could write an a tool to read that website and phrase it as an eval :wink:

1 Like

What you earned from that self-promoter that inserted himself into many threads in an off-topic manner many months ago is literally “shoutouts and gratitude”.

They got free promotion, free usage, and free beta testing. Also microcredit payments from making you read their documents on Medium. You can scan the entire site and see that nobody got “shoutouts”.