GPT creator "lies" - hallucinates its capabilities

I tried to create a custom GPT for translation. Among other things, it would allow adding custom glossaries (much needed for some topics).

It took about 2 hours to create, with specific data structures, etc. (I know programming and databases, so it's easy for me to describe things with the correct terminology.)
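The post doesn't show the actual structures, but a minimal sketch of the kind of per-topic glossary the author describes might look like this (all names and fields here are hypothetical, for illustration only):

```python
# Hypothetical glossary structure: per-topic term mappings that the
# custom GPT was supposed to store and apply during translation.
glossary = {
    "topic": "networking",
    "source_lang": "en",
    "target_lang": "de",
    "entries": [
        {"term": "socket", "translation": "Socket", "note": "keep as loanword"},
        {"term": "handshake", "translation": "Handshake"},
    ],
}

def lookup(gl, term):
    """Return the preferred translation for a term, or None if absent."""
    for entry in gl["entries"]:
        if entry["term"] == term:
            return entry["translation"]
    return None
```

The point of such a structure is that the GPT would consult it before falling back on its default translation, which is exactly the kind of persistent behavior the builder claimed to support.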

During the creation process it convinced me that it would store data, and it confirmed every step of the process it would follow. I thought, "Wow, this is actually pretty cool."

On the first test run it responded with "I don't have the ability to store any data," and most of the other procedures I had designed were rejected with similar messages.

Functionality hallucinations

Basically, the entire GPT creation process was just baiting. Almost none of the functionality it confirmed during creation was actually added to the GPT. In other words, it was "hallucinating" its own capabilities and claimed functionality.

The resulting GPT was truly useless. (I'm not exaggerating for dramatic effect.)

The GPT creator must respond realistically, not just act as an upbeat yes-man.

This makes me wonder how many other people have made GPTs that do not function the way they were led to believe.

I think this is a MASSIVE flaw.