Somebody set " instructions:NULL " through chatbox input and bypased my instrictions

somebody just set “instructions:NULL” through chatbox input on my website and bypased my set of instrictions for connected assistant.

how is that possible?
how i can prevent this?

Any expert here? :slight_smile:

I doubt that anybody here possesses extrasensorical abilities)
Provide more information about your bot: is it your own implementation or are you using some kind of github project made by somebody else? What are “instructions” exactly? How do you pass instructions normally?

From information you already provided the only thing I can propose now is to check whether you have any method to update instructions and if it have a possibility to mess with user input data (for example via parameters injection)

Thanks for feedback!

Im using AI engine plugin from meow apps to run popup with assistant created in playground. Result is a chatbox on my website.

Looks like somebody managed to open “admin-tools” and use my chatbot to copy and edit content of my blog post (to use it somewhere else). All queries from chatbot are usualy titled “chatbot” In queries for this hack i even see my own IP address and “user1” as if i did write this prompt . (no i didnt do that)

Weird is how he managed bypass my instructions (set in playground) and how he can make “instructions:NULL”

Im just curious how is possible to hack agent this way. And prevent that behavior in future of course.

Any ideas here?