Few-shot examples in the system prompt are being treated as facts

I was struggling with function calling, so I added a few examples to my system prompt showing how the model should use the functions. But the model seems to treat the information in these examples as factual data.
For example: I have a function called get_current_datetime() to fetch the current datetime, and one of my examples happened to mention a specific date.
I never see GPT calling this function; instead it always uses the date mentioned in the system prompt example.

Any help fixing this would be really appreciated, as would any guides or suggestions on how to define system prompts and examples. Thanks!

Hi, can you show your API calling code with the function definitions, any system prompt setup code, and an example user prompt, please?


Unfortunately, some of the data in the prompt is confidential; a lot of people would be mad at me if I shared it here. However, I’m attaching a similar example.

System Prompt:

You are an AI Voice assistant who can provide sufficient and concise answers. 
Craft summarized, simple and translated responses strictly limited to 30 words. 

You can assist the user in managing their work day.
You have access to the user's calendar and can help the user add or remove events from the calendar.

A sample conversation between the user (U) and the assistant (AI) is provided here, delimited by triple backticks. Function calls are represented by (API).
```
U: what am I doing in the evening?
API: get_current_datetime() => 2020-09-23T13:05:00
API: get_events(2020-09-23T17:00:00, 2020-09-23T21:00:00) => [{'name': "Call with the client", 'starttime': "2020-09-23T18:00:00", 'endtime': "2020-09-23T19:00:00"}]
AI: You have a call at 6PM with the client. 
U: I want to go out for dinner tonight. Add that to my calendar. 
API: add_event("Dinner", 2020-09-23T19:00:00, 120) => "Success"
AI: Done! Added it to your Calendar. Enjoy your meal!
```

User message:

Am I free in the evening

Result:

API: get_events(2020-09-23T17:00:00, 2020-09-23T21:00:00) => [{.....<same as above>...}]
AI: You have a call with the client at 6PM

So it didn’t invoke the get_current_datetime function at all. It just used the same date available in the system prompt.
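For reference, here is roughly how functions like these get wired up with the Chat Completions API (a minimal sketch: the schemas, parameter names, and model name are illustrative assumptions, not the actual confidential setup):

```python
import openai

system_prompt = "..."  # the full system prompt shown above

# Illustrative JSON-schema definitions for the three functions.
functions = [
    {
        "name": "get_current_datetime",
        "description": "Return the current date and time in ISO 8601 format.",
        "parameters": {"type": "object", "properties": {}},
    },
    {
        "name": "get_events",
        "description": "List calendar events between two ISO 8601 datetimes.",
        "parameters": {
            "type": "object",
            "properties": {
                "start": {"type": "string", "description": "ISO 8601 start datetime"},
                "end": {"type": "string", "description": "ISO 8601 end datetime"},
            },
            "required": ["start", "end"],
        },
    },
    {
        "name": "add_event",
        "description": "Add an event to the user's calendar.",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "start": {"type": "string", "description": "ISO 8601 start datetime"},
                "duration_minutes": {"type": "integer"},
            },
            "required": ["name", "start", "duration_minutes"],
        },
    },
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Am I free in the evening"},
    ],
    functions=functions,
    function_call="auto",
)
print(response["choices"][0]["message"])
```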

One way I have been able to work around this: provide the datetime in a different format in the system prompt, and add validation within the function that restricts its arguments to ISO format, throwing an exception if they are not. The model now calls the function with the datetime from the prompt, but as soon as it sees the exception, it calls get_current_datetime() and then retries get_events. That validation is sketched below.
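Roughly like this (a sketch; query_calendar is a hypothetical stand-in for the real calendar lookup):

```python
from datetime import datetime


def get_events(start: str, end: str):
    """Reject non-ISO datetimes so the model has to fetch the real time first."""
    for value in (start, end):
        try:
            datetime.fromisoformat(value)
        except ValueError:
            # This message is returned to the model as the function result;
            # on seeing it, the model calls get_current_datetime() and retries.
            raise ValueError(
                f"{value!r} is not an ISO 8601 datetime. "
                "Call get_current_datetime() first."
            )
    return query_calendar(start, end)  # hypothetical calendar lookup
```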

Well, the ChatGPT model is tuned to expect the time and date in the system prompt, so that behaviour sort of makes sense if the same tuning went into the GPT-3.5 and GPT-4 models. One thing to also try is raising the temperature a little, say to 0.5-0.9, to give it some explorative ability.

If the correct current time and date is in the system prompt, why would the model go and call get_current_datetime()?


For datetime specifically, you should just pass it in as part of your system prompt. Just include “The current date and time is {{datetime}}”. It’s one less function call and fewer tokens.
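For example, a minimal sketch in Python (the prompt wording is just an illustration):

```python
from datetime import datetime

# Inject the real current datetime into the system prompt on every request,
# instead of exposing a get_current_datetime function.
system_prompt = (
    "You are an AI voice assistant who manages the user's calendar.\n"
    f"The current date and time is {datetime.now().isoformat()}."
)
```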

As for reading into your examples: these models will do that, so try to avoid showing them examples. If you’re struggling to get the model to reliably call your function, I have several posts on here that talk about how to make function calling super reliable. I should probably just create an uber post that walks through, step by step, how to make function calls reliable. There are a couple of steps you need to do.


Yes, that happens. Testing OpenAI’s sample get_weather function, I often get San Francisco, CA if I omit the location from the inquiry. Sometimes I leave it, sometimes I just parse it out when the inquiry does not contain it. I still do not have a good way to prevent it, though.
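One guard worth trying is to distrust any location that never appears in the user’s inquiry (a sketch; check_location and its substring heuristic are assumptions on my part, not a tested fix):

```python
def check_location(inquiry: str, args: dict) -> dict:
    """Drop a location the model may have invented (e.g. the ubiquitous
    'San Francisco, CA') when no part of it appears in the inquiry."""
    location = args.get("location", "")
    parts = [p.strip().lower() for p in location.split(",") if p.strip()]
    if location and not any(p in inquiry.lower() for p in parts):
        args = dict(args, location=None)  # treat as missing; ask the user
    return args
```

So check_location("What's the weather like?", {"location": "San Francisco, CA"}) drops the invented location, and you can ask the user for it instead.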


That uber post with your findings would be awesome; I look forward to it.

Instead of:

“Sample conversation between the user (U)”

Try using:

For context, here is a fictitious example conversation between the user (U)…

Idea: Put the sample conversation in the User prompt space, and allow users to ‘continue’ the conversation.

  • i.e. “Play out” the few-shot conversations in chat, send one last message saying the current conversation is over, and ‘start’ a new conversation with the user that includes the history.

My thought is that the System Prompt can remain vague, providing persona + rules (perhaps with a rule telling it how to handle individual conversations?), while the User Prompt space carries the specifics; roughly like the sketch below.
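As a sketch, the few-shot turns become real entries in the messages array rather than text inside the system prompt (the message contents here are illustrative):

```python
few_shot = [
    # "Played out" example conversation as real chat turns.
    {"role": "user", "content": "what am I doing in the evening?"},
    {"role": "assistant", "content": "You have a call at 6PM with the client."},
    # Explicitly close the example before the live conversation starts.
    {"role": "user", "content": "That conversation is over. A new one starts now."},
    {"role": "assistant", "content": "Understood."},
]

messages = (
    [{"role": "system", "content": "Persona and rules only; no dates, no examples."}]
    + few_shot
    + [{"role": "user", "content": "Am I free in the evening"}]
)
```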

I’m very keen to see how this all shakes out! (Sorry for bumping an old thread, I went looking for examples from a different thread)