Ability to specify the first part of the assistant's message in chat models

Chat models accept a messages parameter instead of the prompt parameter. Currently, if you want the assistant’s reply to start with a specific sequence, you have to say, in the previous message, something like: “Your response should start with {thing:" and should contain only JSON5. It should not contain any text other than the JSON5, which doesn’t quote property names.” This works most of the time, but it is far from ideal or robust in many situations.

(Note: The above is just an example - please don’t focus on this specific use case, it’s just a simple example to illustrate the point.)

Feature Request
Add the ability to also specify a prompt parameter that is used to start off the assistant’s message. For example, the user message would say “Generate some JSON5 data with …”, and then you’d set the prompt parameter to {thing:" to start the assistant off on the right track.

So, currently: Chat models accept messages instead of prompt.
Whereas, desired: Chat models accept both messages and prompt.
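To make the proposal concrete, here is a minimal sketch of the two request payloads as plain dictionaries. The "prompt" key in the second payload is the hypothetical addition this request asks for; it is not an existing API parameter:

```python
# Sketch of current vs. proposed request shapes. The "prompt" key in
# `desired_payload` is hypothetical; it does not exist in the API today.

current_payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "Generate some JSON5 data with a 'thing' field."},
    ],
}

desired_payload = {
    **current_payload,
    # Hypothetical: text the assistant's reply would be forced to begin with.
    "prompt": '{thing:"',
}
```

The assistant would then continue generating from `{thing:"` instead of starting its reply from scratch.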


Or, even better, full templating akin to Jsonformer (except it should allow other template formats, i.e. not just JSON).
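A toy illustration of the templating idea, with a hypothetical `generate_value` callback standing in for the model: only the holes in a fixed template are filled by generation, so the surrounding structure is guaranteed no matter what the model returns.

```python
import re

def fill_template(template, generate_value):
    """Fill each {{name}} hole in `template` via `generate_value(name)`.

    `generate_value` is a hypothetical stand-in for a constrained model
    call; everything outside the holes is emitted verbatim, so the output
    structure is fixed by the template, not by the model.
    """
    return re.sub(r"\{\{(\w+)\}\}", lambda m: generate_value(m.group(1)), template)

# Example: the template fixes the JSON5 skeleton; only the values are generated.
template = '{thing: "{{thing}}", count: {{count}}}'
stub_model = {"thing": "widget", "count": "3"}.__getitem__  # stands in for the LLM
print(fill_template(template, stub_model))  # {thing: "widget", count: 3}
```

Jsonformer itself does this for JSON schemas against local Hugging Face models; the sketch above only shows the general template-with-holes shape for arbitrary formats.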


I’m thinking / hoping that things like this will be coming in the future with ChatML…


You can work around this in a simple way within the message prompt itself.

Make sure you have the following in the prompt to improve the response:

  • In the instructions, specify that you need a JSON object as the response
  • Include an example response in JSON format
  • End your prompt with “JSON OBJECT:”, so the AI knows the following response should be a JSON object

You can try this in the playground so you can see it working.
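As a sketch, here are those three tips assembled into a single prompt string. The builder function and the example instruction/schema are just for illustration, not part of any API:

```python
def build_json_prompt(instruction, example_json):
    """Assemble a prompt following the tips above: state that a JSON
    object is required, show an example response, and end with the
    "JSON OBJECT:" cue so the model continues with JSON."""
    return (
        f"{instruction}\n"
        "Respond with a JSON object only, no other text.\n"
        f"Example response:\n{example_json}\n"
        "JSON OBJECT:"
    )

prompt = build_json_prompt(
    "Describe one product in our catalog.",
    '{"name": "widget", "price": 9.99}',
)
print(prompt)
```

You would send this as the content of a user message (or paste it into the playground); the trailing “JSON OBJECT:” nudges the model to begin its reply with the object itself.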