Problem with the new gpt-4-turbo-2024-04-09

That’s not how it works. The AI doesn’t simply become a different model if you ask the wrong question.

  • You go to the models page

  • You look at GPT-4-Turbo and observe that it has knowledge from training up to Dec 2023 (the last addition of pretraining data; understand, though, that the AI is not a search engine).

  • You provide a system message in chat format to inform the AI of this cutoff date, something it cannot state correctly from its pre-training alone, for example:

system: You are an expert AI system with an extensive world knowledge of facts and figures from which you can answer, most recently updated in December 2023.
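As a minimal sketch of that step (using the Python openai library; the user question, temperature, and token limit here are just placeholders I chose), the system message goes first in the messages list of a chat completions call:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4-turbo-2024-04-09",
    messages=[
        {
            "role": "system",
            "content": (
                "You are an expert AI system with an extensive world knowledge "
                "of facts and figures from which you can answer, most recently "
                "updated in December 2023."
            ),
        },
        # Placeholder question about the late-2023 period the model was trained on
        {"role": "user", "content": "What notable world events happened in November 2023?"},
    ],
    temperature=0.5,
    max_tokens=500,
)

print(response.choices[0].message.content)
```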

(I accidentally asked gpt-3.5-turbo - and the Middle East is the same story as always.) gpt-4-turbo:


A prompt where the AI just produces documents, not chat

Here is an AI assistant with an extensive world knowledge of facts and figures, most recently updated in December 2023. The assistant uses this knowledge and logical reasoning to provide truthful and helpful information that satisfies and fulfills the user input with expertise.

Responses from assistant are writing compositions, not chat that addresses a user. Assistant is not an individual entity that can talk; it just produces the requested data.

The knowledge cutoff 2023-12 is absolute, with more solid grounding in information from before 2023-04.
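To try that framing, here is a minimal sketch (Python again; the document request in the user turn is a placeholder of my own): the whole passage above is placed in the system role, and the user turn names only the document to be produced rather than asking a conversational question.

```python
from openai import OpenAI

client = OpenAI()

# The "documents, not chat" prompt from above, used as the system message
document_prompt = (
    "Here is an AI assistant with an extensive world knowledge of facts and figures, "
    "most recently updated in December 2023. The assistant uses this knowledge and "
    "logical reasoning to provide truthful and helpful information that satisfies and "
    "fulfills the user input with expertise.\n\n"
    "Responses from assistant are writing compositions, not chat that addresses a user. "
    "Assistant is not an individual entity that can talk; it just produces the requested data.\n\n"
    "The knowledge cutoff 2023-12 is absolute, with more solid grounding in information "
    "from before 2023-04."
)

response = client.chat.completions.create(
    model="gpt-4-turbo-2024-04-09",
    messages=[
        {"role": "system", "content": document_prompt},
        # Placeholder: state the document to produce, not a question to answer
        {"role": "user", "content": "A brief timeline of major AI model releases in 2023."},
    ],
    temperature=0.4,
)

print(response.choices[0].message.content)
```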