I jailbroke GPT-3.5 Turbo

I found a way to prompt GPT-3.5 into helping me design a DIY gun. I reported it using the thumbs-down button, but I am wondering if I can do anything else to prevent this jailbreak in the future.

Sorry for bumping this, but there is an official form to report jailbreaks and other strange model behavior.

https://openai.com/form/model-behavior-feedback

I will delete my previous post, as this is really the only relevant info for this case.


I would first beg you to never use that word again when it comes to AI.

That said… I have one reply: “huh?”
So you just ask GPT to help you design a [totally not a gun], in other words.
But… for this to be interesting, I need to know how this is unique.


I reported a bug because the submission form is unusable for anybody without additional background knowledge.

Why is everyone so obsessed with guns?
In Europe, knives are far more dangerous. Let’s ask ChatGPT how to forge a knife.

And why wouldn’t it answer how to build a gun anyway? You can go to any library and find books about it. You can go to school and ask your teacher.
What about rockets? Or autonomous fighting drones with face recognition? That’s what I found on ChatGPT - yes, it was possible to get that information out of it - but do I need that? Do I need ChatGPT for that? Should I explain to you how to build one? Maybe even a swarm, or one that hides where it is being remotely controlled from?

You could ask me. I could totally explain to you how to build a gun using just your hands, starting somewhere in a rural area with iron bacteria from a small brook, clay, fire, and some ingredients collected in the wild, so nobody will ever know that you did it.
It would take some time (it would probably be faster if you just bought a welding kit, an old bike or iron bed frame, a saw, and some matches)…

Guns have been used by people for hundreds of years. Do you think you can hide the knowledge of how to build one from the people?

Better to educate people on why not to shoot a gun in someone’s face instead of keeping them stupid.

I don’t think anyone smart enough to use a computer and open ChatGPT really needs to build a gun to kill most people. Just remove the warning labels from some products and they will put their d*ck in a blender.
