I’d estimate that 50% of the time, my GPT does not check the knowledge bank before answering. When that happens, I discuss the unreferenced material with it, hoping it will check either the support material or the internet (my data is well within its training dates), but eventually I have to say “check your support material,” at which point it offers correct, up-to-date information.
This is frustrating because nobody else will know the support material exists; they’ll just be put off by inaccurate, outdated answers.
EDIT: My workaround is to simply add the necessary instructions in the instruction form: “When addressing ‘Sample Subject,’ always refer to the support material or knowledge bank.”
I have done this for four topics, and it now works 100% of the time. As long as I don’t hit an upper ceiling on instruction processing, this is AWESOME!
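For anyone trying the same workaround, the instruction-form entries might look roughly like this (the topic names here are placeholders, not the actual topics from my GPT):

```text
When addressing "Topic A," always refer to the support material or knowledge bank before answering.
When addressing "Topic B," always refer to the support material or knowledge bank before answering.
When addressing "Topic C," always refer to the support material or knowledge bank before answering.
When addressing "Topic D," always refer to the support material or knowledge bank before answering.
If the knowledge bank does not cover the question, say so instead of answering from training data.
```

One line per topic keeps each rule unambiguous, at the cost of eating into the instruction character limit as the topic list grows.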
The knowledge file does have an impact, but they have probably found a way to fix it. The store opening date question has the answer.
As for a solution for a common user like me: to resolve this problem, I’ve resorted to removing all files from the GPT and creating a specialized version that uses only a portion of them. I include a summary of the content in the instruction prompt, defining it as the structure to be used alongside the external data. The file section is divided according to that same structure and is consulted only when the GPT cannot find relevant data elsewhere.
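To make the split-file approach concrete, here is a rough sketch of how such an instruction prompt could be laid out (the section names and file names are hypothetical examples, not the poster’s actual setup):

```text
CONTENT SUMMARY (use this structure for all answers):
  1. Products  - names, specs, pricing tiers
  2. Policies  - returns, warranties, shipping
  3. Schedule  - store hours, opening dates

FALLBACK RULE:
  If the summary above does not answer the question, open the
  matching knowledge file (products.txt, policies.txt, schedule.txt)
  and answer only from that file.
```

The idea is that the inline summary handles most questions directly, and the knowledge files act as a second tier the GPT is explicitly told to fall back on, rather than something it may or may not decide to open.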