Adapting a prompt from ChatGPT to text-davinci-003

I understand they use different models, and I quite like the responses I’m getting on ChatGPT.

Is there anything I should keep in mind when writing for text-davinci-003 vs ChatGPT?

If the above is your complete prompt, try adding “BEGIN:” after the last line of instruction.


Why add “BEGIN:”? What does it do for the text-davinci-003 model? How will the responses differ if you don’t add “BEGIN:”?

Here are a couple of thoughts on this:

Without “BEGIN:”, the text-davinci-003 model may attempt to continue writing your instructions instead of following them.

Also, by the nature of ChatGPT’s conversational interface, the “BEGIN:” part is implicit once you submit your message. To replicate this even more accurately, you could add something like “Begin your response:” to the Davinci prompt.
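To illustrate, here is a minimal sketch of turning a chat-style instruction into a completion-style prompt with a “BEGIN:” terminator. The instruction and article text are placeholder assumptions, not from the thread:

```python
# Sketch: adapting a chat-style instruction for the completions-style
# text-davinci-003 model. Instruction and article are placeholders.
instruction = "Summarize the following article in three bullet points."
article = "Some article text."

# Without a terminator, the model may keep extending the instructions.
# Appending "BEGIN:" signals that the instructions are over and the
# model's response should start here.
prompt = f"{instruction}\n\n{article}\n\nBEGIN:"

print(prompt)
```

The resulting string would then be sent as the `prompt` parameter of a completions request.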


Hi @wfhbrian, could we actually implement this for a QnA prompt? Where should the “BEGIN:” go, specifically? To be clearer: in the use case where we include context in our prompt, is this doable?

In GPT, providing a sample will greatly boost performance, though make sure the sample isn’t too similar to the question you actually want answered, as that can lead to overfitting. Also, building on @wfhbrian’s example, you can use “Question:” and “Answer:” markers to tell GPT where the answer should start.
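Putting both tips together, a context QnA prompt might be assembled like this. The context, sample Q&A pair, and question below are all placeholder assumptions; the trailing “Answer:” plays the same role as “BEGIN:”, marking where the model’s output should begin:

```python
# Sketch of a QnA prompt with context for text-davinci-003.
# Context, sample Q&A, and question are placeholder assumptions.
context = "The Eiffel Tower was completed in 1889 and is 330 m tall."

# One sample Q&A pair (a few-shot example). Keep it distinct from the
# real question so the model doesn't simply echo the sample answer.
sample_q = "When was the Eiffel Tower completed?"
sample_a = "It was completed in 1889."

question = "How tall is the Eiffel Tower?"

# The final "Answer:" tells the model where its response should start,
# just like "BEGIN:" in the earlier example.
prompt = (
    "Answer the question using only the context below.\n\n"
    f"Context: {context}\n\n"
    f"Question: {sample_q}\nAnswer: {sample_a}\n\n"
    f"Question: {question}\nAnswer:"
)

print(prompt)
```

The context block goes first, the sample Q&A in the middle, and the real question last, ending on the bare “Answer:” marker.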


@krisbian That tip from @udm17 is a good one.

Are you able to share more details about your prompt?
