Alright guys, it can't be like that

Dear ChatGPT Developers,

I would like to address an important issue that I have observed while using ChatGPT. It appears that there are instances when a user asks ChatGPT a question, and instead of admitting its lack of knowledge, the model generates plausible but inaccurate responses. This can lead to misinformation being spread as people trust ChatGPT as a reliable source of knowledge.

For example, today I requested ChatGPT to recite a Ukrainian poem by Ivan Franko, which is beloved by my girlfriend. I intended to surprise her by displaying the poem on her computer. However, ChatGPT generated random rhymes that only related to the title I provided, and it was only half the length of the actual poem. If I had relied on ChatGPT and presented it to her, it would have caused confusion, embarrassment, and possibly amusement.

It is crucial that we address this issue to ensure the accuracy and reliability of ChatGPT. Misleading information can have far-reaching consequences. I kindly request that the developers take appropriate measures to rectify this problem and ensure that ChatGPT provides accurate responses or acknowledges when it doesn’t have the required information.

Thank you for your attention to this matter.

Sincerely, Sadzic Palic Zalegalizowac


Hey Sadzic,

And welcome to the developer community forum! We're the people who develop "stuff" using OpenAI's services; we're not the people developing ChatGPT :laughing:

What you're experiencing is usually called "hallucinations" or "confabulations". ChatGPT doesn't have the ability to discern the credibility of facts; it's trained to generate responses that are most likely to gain positive human feedback.

In your specific case, you asked ChatGPT to recite a poem. ChatGPT doesn't know that poem, so it tries to generate one it hopes you will like.

Think of ChatGPT as a "writing assistant" that doesn't know anything about what you're trying to do. If you want it to quote a specific poem, fact, or any other piece of information, you will have to supply the text yourself, as in the sketch below.
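
The same idea applies if you're working with the API rather than the ChatGPT web UI: paste the poem into the prompt yourself and ask the model to work with it, instead of asking it to recall the poem from memory. A minimal sketch, assuming the openai Python package (v1 client); the model name, system prompt, and placeholder poem are just examples:

```python
# Minimal sketch: supply the source text yourself instead of relying on the
# model's memory. Model name and poem below are placeholders, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

poem = """<paste the actual Ivan Franko poem here>"""

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "Only quote from the text the user provides; do not invent lines."},
        {"role": "user",
         "content": f"Format this poem nicely, keeping every word exactly as given:\n\n{poem}"},
    ],
)

print(response.choices[0].message.content)
```

Because the model only has to reformat text that is already in the prompt, it has nothing to "fill in" from memory, which is where the made-up rhymes came from.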

I hope that helps :laughing:


Quick, send her this:
