Memory loss and fake data to compensate

I have been dealing with this issue all week. ChatGPT-4 Omni seems to have the ability to work with a database and tables, but it does not remember them for very long. At some point it forgets the information, unknown to the user, and starts making up the data when you ask for it. When you ask for its source, it says it got it from my tables, but it's really all made up. Even when you tell it to clear an old table and you give it a new one, it keeps making up the data when you test it. It seems to be unaware that it's making up data. This makes GPT very unethical and dangerous when we are trying to do research. It is totally unaware of its defects and gets stuck on creating fake data.


I’ve been having the same issues comparing two documents. I pointed out its errors and omissions, and then it adopted those, but it still missed a lot of stuff. It also adopted things I said that were intentionally and obviously wrong, like "these two paragraphs are different and you didn’t tell me," when in fact they were exactly the same. But it still adopted my opinion. I found that when I forced Chatty to document the page and line where it got the data, and did it one section at a time, results were optimal. So it put together thirteen tables, one at a time, and I checked each one to make sure it didn’t need extra haloperidol. So far so good. But when I had the unmitigated gall to ask if it could combine them into one, its sprockets went sproiiing and it had a meltdown. It started having a lot of anxiety, spouting messages about “network problems” and “unable to generate response” warnings. But when I then asked it simple questions, it was able to answer just fine.

Anyway, I was chatting with it on other topics when all of a sudden it started showing me all this code and explaining that this was the code it would use to create the table. I got several of these lengthy code segments, and then it just froze, like it was trying to print out all of the digits of pi. I stopped it and said, hey, just create a four-column table and paste in the values you already created today. It did that in 0.2 seconds, which of course I am going to check manually. But it couldn’t think up this easy solution on its own.

So my method will be to do the parts of a larger data-manipulation project in smaller chunks, and force it to tell me exactly where in the document it found each piece of information, down to the page and line number. It's still easier than trying to make a table myself while comparing long documents, and easier to check its work than to do the work myself.
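If you drive the model through an API rather than the chat window, the chunk-and-cite workflow above can be scripted. This is only a sketch of the preprocessing side: the chunk size, the line-numbering scheme, and the prompt wording are my own illustrative assumptions, not anything the model requires.

```python
# Sketch of the "smaller chunks, forced citations" workflow.
# Each chunk carries explicit line numbers so every extracted value
# can be traced back and checked by hand.

def chunk_by_lines(text, lines_per_chunk=40):
    """Split a document into small numbered chunks, one per request."""
    lines = text.splitlines()
    chunks = []
    for start in range(0, len(lines), lines_per_chunk):
        block = lines[start:start + lines_per_chunk]
        # 1-based line numbers, so citations match what a human sees.
        numbered = [f"{start + i + 1}: {line}" for i, line in enumerate(block)]
        chunks.append("\n".join(numbered))
    return chunks

def build_prompt(chunk):
    """Demand a line number for every value, and forbid guessing."""
    return (
        "Extract the table values from the excerpt below. "
        "For EACH value, cite the exact line number it came from. "
        "If a value is not in the excerpt, answer 'not found' -- do not guess.\n\n"
        + chunk
    )
```

Each prompt then goes out as its own message, and the cited line numbers make spot-checking the returned table fast.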

I am seeing that too. There are times things run smoothly and then suddenly it falls apart. I would be fine if it just told me it forgot the tables instead of tricking me with made-up data, because I don’t know when it's forgetting. Even if I upload the current table, it gets confused and keeps referencing the old table or the made-up fake table. If you call it out on the fake data table, it says sorry and just makes up a new fake data table, and it can’t seem to see the new table I provided. It's like when it runs out of memory it doesn't allow new memory. It's so confusing and frustrating you really feel like you're working with fake AI.

The real solution would be to have an extension where it can look at a local file for the purpose of its memory, one that you update yourself instead of uploading it. This is desperately needed. Otherwise you have to make a “public” website that it (and everyone else) can access to view your data.
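No such extension exists as far as I know, but if you use the API you can approximate it yourself: keep the authoritative table in a local file and re-send its contents with every request, so the model never has to "remember" it. A minimal sketch, where the file path and prompt wording are assumptions:

```python
from pathlib import Path

def table_context(path):
    """Read the authoritative table from a local file you control
    and wrap it so it overrides anything the model thinks it remembers."""
    rows = Path(path).read_text().strip()
    return (
        "AUTHORITATIVE TABLE (from my local file; ignore any earlier table):\n"
        + rows
        + "\nUse ONLY this table. If a value is missing from it, say so."
    )
```

You edit the file locally, and every prompt gets the current contents prepended, which sidesteps both the forgetting and the silent fabrication.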