ChatGPT does not understand the requirement

I asked ChatGPT: "I'm using Google Docs. I want to add a text box to add text. Is it possible?" And ChatGPT gave me the full walkthrough for adding a text box instead of just saying "possible" or "not possible". Why did ChatGPT give such a long answer?

The answer: human feedback used in training.

Let’s say I start typing into ChatGPT:


"I observed that Python is tending toward strong typing and type hint annotations on functions in newer versions. Can I adapt a function get_weather() that returns a Python list containing two dictionaries to have typing, in a way that not only lets developers use the type data and type checkers check a code base, but is also compatible with previous versions?"

The correct answer, processing the question logically:

"Yes."
Pretty unsatisfactory, though. You can imagine that's not actually what the user wants, even though it's exactly the answer the question asked for. The AI has to see past the literal logic to find the desire to be fulfilled, in order not to get the downvote and the negative reinforcement in training that follows.
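What the user presumably wants is the adapted function itself. A minimal sketch (the field names and values here are made up for illustration): the `typing.List`/`typing.Dict` spellings stay valid on Python versions back to 3.5, whereas the builtin generics like `list[dict[...]]` require 3.9+.

```python
from typing import Dict, List, Union

def get_weather() -> List[Dict[str, Union[str, float]]]:
    """Return a list of two dictionaries of (hypothetical) weather readings.

    typing.List / typing.Dict are used instead of builtin generics so the
    annotation also works on older Python versions.
    """
    return [
        {"city": "London", "temp_c": 12.5},
        {"city": "Paris", "temp_c": 15.0},
    ]

readings = get_weather()
print(len(readings))  # two dictionaries, as described in the question
```

Type checkers such as mypy can verify call sites against this signature, and the annotation imports nothing that older interpreters lack.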

(BTW, a ChatGPT Plus GPT share that doesn't waste a single token on explanation.)
