Best Approach for Providing Examples: Conversational Format vs. Directly in the Prompt

Which approach is more effective when designing prompts: providing examples in a conversational format or including them directly in the prompt?


Depends on what you want to do:

Providing examples in a conversational context can be useful - it's called multi-shot (or few-shot) prompting. There can be issues with this (such as tainting the context with examples the model then imitates too literally), but it's a common approach.
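Here's a minimal sketch of the two approaches, assuming the OpenAI Python SDK (v1+); the sentiment-classification task, model name, and example reviews are just placeholders:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Approach 1: examples as prior conversation turns (multi-shot)
conversational = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever model you're on
    messages=[
        {"role": "system", "content": "Classify each review as positive or negative."},
        {"role": "user", "content": "Review: The battery lasts all day."},
        {"role": "assistant", "content": "positive"},
        {"role": "user", "content": "Review: It broke after a week."},
        {"role": "assistant", "content": "negative"},
        {"role": "user", "content": "Review: Setup took five minutes and it just works."},
    ],
)

# Approach 2: the same examples inlined directly in one prompt
inline_prompt = """Classify each review as positive or negative.

Review: The battery lasts all day.
Sentiment: positive

Review: It broke after a week.
Sentiment: negative

Review: Setup took five minutes and it just works.
Sentiment:"""

inline = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": inline_prompt}],
)

print(conversational.choices[0].message.content)
print(inline.choices[0].message.content)
```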

It really depends on your use case, and which model you’re using.

One thing to remember is that there is no real difference between “The Conversation” and “The Context” - it’s the same thing.

If you want to understand this better, I suggest you play with the completion models a bit. The chat models aren't any different - the only distinguishing factor is that the API is hard-coded to insert special tokens into the prompt to make it look like a conversation.
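To make that concrete, here's a rough sketch of how a message list gets flattened into a single prompt string. This mimics a ChatML-style template; the exact special tokens and layout vary by model, so treat it as an illustration of the idea rather than the literal template any given API uses:

```python
def render_chatml(messages: list[dict]) -> str:
    """Flatten chat messages into one prompt string (ChatML-style illustration)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave an open assistant turn so the model completes it
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


print(render_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Review: The battery lasts all day."},
    {"role": "assistant", "content": "positive"},
    {"role": "user", "content": "Review: It broke after a week."},
]))
```

Once you see the output, it's clear the "conversation" is just one long string of context - your multi-shot examples and the user's latest message all end up in the same prompt.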


I’ll experiment with these approaches to see what works best. Thanks for the insights!
