Do users' custom instructions work with GPTs?

Does anyone know if custom instructions work with GPTs?

This support article mentions plugins but not GPTs.

I did some testing and it seems as though custom instructions don’t influence custom GPTs, but I’d be interested to know whether or not that’s intentional.

It appears that, no, Custom Instructions are not forwarded to GPTs.

You can test this yourself by putting a simple piece of information in your Custom Instructions, for instance your name, then asking the GPT what your name is. It will not have that information.

I am guessing this is intentional for:

  1. Token management
  2. Ensuring consistent, predictable, and safe behavior of the GPT

I do think it’s unfortunate, because there are fantastic use cases for combining the two to customize a GPT for individual users.

For instance, having one IT support GPT whose instructions ask the user to put something like OS=Debian 11 in their Custom Instructions, so the user doesn’t need to specify their OS with each request and doesn’t get instructions for Windows and macOS in every response.
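To make that concrete, if Custom Instructions did carry over, the user’s side of that setup might look something like this (an illustrative snippet only; just the OS=Debian 11 line comes from the example above, the rest is assumed):

```
About my setup (use this when answering support questions):
OS=Debian 11
Shell=bash
Package manager=apt
```

The GPT’s own instructions would then only need to say that it should trust these values rather than asking for them or guessing.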

But I do think it’s probably for the best at this point. It frees up those tokens for the GPT to use, and it reduces the risk of information leaks and of the user tricking the model into overriding actions, making a request to the API that the builder didn’t intend and isn’t equipped to handle. (Imagine an API endpoint to, say, Stable Diffusion with a dimensions parameter; overriding it with too high a value could potentially bork the system.)
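To illustrate that last point, here is a minimal sketch of the kind of server-side guard a builder would want on such an endpoint, so that an overridden parameter can never reach the image backend unchecked. Everything in it (FastAPI, the /generate route, the prompt/width/height fields, and the limits) is an assumption made for illustration, not a real GPT action:

```python
# Hypothetical backend for a GPT action that wraps an image generator.
# All names and limits here are illustrative assumptions, not a real API.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel, Field

app = FastAPI()

MAX_SIDE = 2048           # assumed per-side limit the backend can handle
MAX_PIXELS = 1024 * 1024  # assumed total-pixel budget per request


class GenerateRequest(BaseModel):
    prompt: str
    # Bounds are enforced here, server-side, regardless of what the model
    # was talked into sending on the user's behalf.
    width: int = Field(512, ge=64, le=MAX_SIDE)
    height: int = Field(512, ge=64, le=MAX_SIDE)


@app.post("/generate")
def generate(req: GenerateRequest):
    # Even values within per-side bounds can combine into an oversized job.
    if req.width * req.height > MAX_PIXELS:
        raise HTTPException(status_code=422, detail="Requested image is too large")
    # ...hand the validated request off to the actual image backend here...
    return {"status": "queued", "width": req.width, "height": req.height}
```

The point is simply that the builder’s API, not the model, has to be the thing that enforces limits.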

They do not. Custom GPTs have their own set of custom instructions. In fact, there’s no difference between a custom GPT and your normal bot with the same custom instructions, except that this saves copy-pasting different custom instructions when you wish to change what your bot does. And the instructions limit is increased from 1500 characters to 8000 characters.

Why would it be of any value? Custom GPTs use their own custom instructions, which you get to write exactly the same way as the ordinary custom instructions, so you can already customise it exactly how you want.

Also, custom GPTs can now be retrained if the user supplies the training data in the correct format.

Perhaps I wasn’t as clear as I could have been.

The utility would come from the ability of a user to easily give persistent information to a GPT not created by them.

In the example I gave, if you created a public GPT for computer support, it would be useful for a user to be able to include details of their system in the Custom Instructions so that information would be permanently in context.

They don’t need to modify the behaviour of the GPT, just augment it with personalized information. While there are other ways to make this information accessible to the GPT, none of them keep it in context.

I’m sure there are countless other use cases where it would be beneficial for a custom GPT to have persistent, personalized information pertaining to the individual using it.

Good point. I never thought of other people using my custom GPTs. They are all set to private, and when I want them to have different instructions I simply edit them, so that use case never occurred to me.

Yeah, it’s a really great way for users to instruct the model if English isn’t their first language.

One of my main questions is whether or not this is intentional on OAI’s part. Since it doesn’t seem to be mentioned in the support docs, it could be something that was overlooked, or it could be intentional so as not to mess with the setup of GPTs.

Has anyone come across specific wording from OAI that Custom Instructions will not apply to GPTs?