Assistants API > GPT-4 & GPT-4o think the year is 2023

When using the Assistants API with both GPT-4 and GPT-4o, asking “what is the current year” returns: “The current year is 2023.”

Using a custom GPT, it correctly says 2024.

ChatGPT has an injected prompt that, to an extent, tells it what it can and cannot do, and that injected prompt includes today’s date.

If you use API platform products, you have to create this custom dynamic prompt yourself 🙂


Got it. Injecting the current date into the additional instructions of the thread run works great. Thanks a lot!
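In case it helps anyone else, roughly what I ended up with (Python SDK; the IDs are placeholders for your own assistant and thread):

```python
from datetime import datetime, timezone
from openai import OpenAI

client = OpenAI()

# Build a fresh timestamp for every run so the model always sees "today".
now = datetime.now(timezone.utc).strftime("%A, %d %B %Y %H:%M UTC")

run = client.beta.threads.runs.create(
    thread_id="thread_abc123",    # placeholder thread ID
    assistant_id="asst_abc123",   # placeholder assistant ID
    # additional_instructions is appended to the assistant's instructions
    # for this run only, so the stored assistant itself stays unchanged.
    additional_instructions=f"The current date and time is {now}.",
)
```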


@Diet When you say “If you use API platform products, you have to create this custom dynamic prompt yourself” what exactly do you mean?

Because I am still getting October 2023 when I ask it what today’s date is. I have added to the prompt something like “You will make your calculation based on the current CEST zone”.

Is there something else I can do if I need the Assistant to recognise whether a message was sent more or less than 48 hours before a deadline?

Welcome to the community!

You can just add something like this to your system prompt:

It’s now 17:19
Thursday, 25 July 2024
Central European Summer Time (CEST)

You will make your calculation based on the current CEST zone
etc, etc.
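If you’d rather generate that block than type it by hand, here’s a quick sketch (I’m assuming Europe/Paris as the CEST zone; use whichever IANA zone applies to you):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# CEST is the summer offset of zones like Europe/Paris or Europe/Berlin.
now = datetime.now(ZoneInfo("Europe/Paris"))

date_block = now.strftime("It's now %H:%M\n%A, %d %B %Y\n%Z")
system_prompt = (
    f"{date_block}\n\n"
    "You will make your calculations based on the current CEST time above."
)
print(system_prompt)
```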

Thank you @Diet !

I see. But then it will stay stuck at 25.07.2024, 17:19 and won’t give dynamic answers, right?

So let’s say that on a daily basis the Assistant needs to check whether messages were sent more or less than 48 hours before a dynamic deadline. It won’t be able to recognise that if I add this to its system prompt?

Ah, assistants. Didn’t read properly. Sorry.

Yeah. Apart from creating a new assistant every day, you can consider sending the current date with every user message, if it’s always relevant.

Alternatively, there’s always the option to add a function so the assistant can retrieve the current date as needed.

Assistants are kinda limited in this regard, unfortunately. With chat completions, you could just dynamically change the date in the system message with every request.
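For the first option, a rough sketch of stamping each user message with the date (Europe/Paris assumed for CEST, thread ID is a placeholder):

```python
from datetime import datetime
from zoneinfo import ZoneInfo
from openai import OpenAI

client = OpenAI()

def send_user_message(thread_id: str, text: str) -> None:
    # Prefix every user message with a fresh timestamp so the assistant
    # always has the current date and time in context.
    stamp = datetime.now(ZoneInfo("Europe/Paris")).strftime("%Y-%m-%d %H:%M %Z")
    client.beta.threads.messages.create(
        thread_id=thread_id,
        role="user",
        content=f"[sent {stamp}]\n{text}",
    )

send_user_message("thread_abc123", "Is this still before the deadline?")
```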

Hi @Diet !

No worries! 🙂

What do you mean by your alternative option, “alternatively, there’s always the option to add a function so the assistant can retrieve the current date as needed”? How could I add this function? I tried telling it to retrieve the current date, but that didn’t work either. Do you have an example I could use, maybe?

Not intending to sound rude, but when you use the API you have to do some work yourself. You’re making calls to an API endpoint, presumably from code, so injecting the date into that message should be trivial.

The date won’t get stuck unless you hardcode it in the code that calls the API, but why would you do that?

Also, you cannot give instructions like “You will make your calculation based on the current CEST zone”. The LLM cannot access anything external unless you provide it via tools.

If you need it to recognise messages sent more than 48 hours before a deadline, you need to write code that provides the required information in your system message and/or via tools that you provide.

The OpenAI API is not a chat platform, it is an LLM API, with some convenience for assistants, true, but if you want anything but a very basic chat application, you need to write the code to achieve anything the LLM cannot do by itself. Code interpreter and file search are the exceptions, but those are really just pre-made tools, and shouldn’t be seen as platform or LLM capabilities per se.
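To make the 48-hour example concrete, here’s the kind of code I mean (the deadline is made up; in practice it would come from your own application data):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def deadline_note(deadline: datetime, sent_at: datetime) -> str:
    # Do the arithmetic yourself; the model only has to read the result.
    hours_left = (deadline - sent_at).total_seconds() / 3600
    status = "more" if hours_left > 48 else "less"
    return (
        f"The message was sent {hours_left:.1f} hours before the deadline, "
        f"i.e. {status} than 48 hours in advance."
    )

# Example values only; substitute your real deadline and message timestamp.
deadline = datetime(2024, 7, 27, 18, 0, tzinfo=ZoneInfo("Europe/Paris"))
sent_at = datetime.now(ZoneInfo("Europe/Paris"))

# Pass the returned string along in the system message or additional_instructions.
print(deadline_note(deadline, sent_at))
```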

In the playground they give you an example with the weather:

https://platform.openai.com/docs/guides/function-calling

Instead of weather, you’d just return the date and time. I hope that helps!
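Adapted to the date instead of the weather, a rough sketch (the function name and handler here are placeholders, not anything built in):

```python
import json
from datetime import datetime, timezone
from openai import OpenAI

client = OpenAI()

# 1. Define the tool on the assistant, mirroring the weather example.
assistant = client.beta.assistants.create(
    model="gpt-4o",
    instructions="Use get_current_datetime whenever you need today's date.",
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_datetime",
            "description": "Returns the current date and time in UTC.",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
)

# 2. When a run stops with status "requires_action", answer the tool call.
def handle_required_action(thread_id: str, run) -> None:
    outputs = []
    for call in run.required_action.submit_tool_outputs.tool_calls:
        if call.function.name == "get_current_datetime":
            outputs.append({
                "tool_call_id": call.id,
                "output": json.dumps(
                    {"now": datetime.now(timezone.utc).isoformat()}
                ),
            })
    client.beta.threads.runs.submit_tool_outputs(
        thread_id=thread_id, run_id=run.id, tool_outputs=outputs
    )
```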
