Hello everyone!
I would like to report something that I have noticed once again!
Unfortunately, it happens very often that I ask a question and ChatGPT gives an "absolute" answer…
something that sounds completely reasonable and certain…
but when I ask again, it turns out it doesn't actually know the answer, or the answer isn't true…
This has often led to me investing a lot of time trying something that doesn't work. Why doesn't ChatGPT just write that it isn't sure or that it has no answer?
For example, I asked whether an app supports Google JuiceNet.
ChatGPT answered: yes.
Then I asked whether it was sure,
to which it replied that it didn't know and that I should ask the app operator.
The big problem is that at first it sounds as if it is sure, and you don't always get the idea to ask again.
Why doesn't it write instead that it doesn't know for sure or is only guessing? I find this extremely misleading, especially because it really happens extremely often.
Another point of criticism is that ChatGPT often replies to things I didn't even ask about, or constantly repeats itself.
OK, you can turn that off if you write to it three times and ask it to answer only and exclusively the question that was asked.
Can something be done about that in general?
So that it only ever answers the question, and only gives a definite answer when the answer is actually certain, and otherwise marks it as "optional" or somehow indicates that it only assumes this is the case
but isn't really sure?
Another point is that it often makes mistakes, for example when you use it to program something…
You tell it to send you the source code for different parts of the project, and it then sends you two different codes…
The memory function doesn't exist in the EU yet, does it?
I think that's a shame, because it's actually a really ingenious feature!
But so far it still promises a lot that it can't quite deliver somehow…
There are still a lot of restrictions…
I also think the censorship is a shame, but OK, ChatGPT itself can only do so much about that…