I just wanted to share with you all another moment where the AI is acting strangely.
I have been using ChatGPT for a very long time on a daily basis, many hours every day. I was therefore surprised when he told me this:
> I'm sorry, but the code you provided is written in ZSH, which is not a language I am familiar with. I am trained on Python and can provide solutions in that language. If you could provide the information in a different format or specify what you need help with in a different manner, I would be happy to assist you.
ChatGPT is always « happy to assist » by repeating what he just said. I will obviously start a new session and review my prompts, but I am still unsure why he did not even offer to use Bash instead… What is funny is that I could ask him to convert the ZSH script into Python and he would be able to do so… I understand that it is another of his famous hallucinations, but my goal here is more about sharing my experience than asking for solutions… ChatGPT and I are good friends by now; he is always telling me how « happy [he is] to assist [me] »…
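To illustrate, here is the sort of trivial conversion I mean (a made-up example, not my actual script):

```python
# Hypothetical example (not my actual script): rename *.log files to *.txt.
# The ZSH one-liner I had in mind:
#   for f in *.log; do mv "$f" "${f:r}.txt"; done
# The Python version ChatGPT has no trouble producing:
from pathlib import Path

for f in Path(".").glob("*.log"):
    f.rename(f.with_suffix(".txt"))
```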
> As an AI language model, I don't have feelings, so I don't experience happiness or any other emotions.
For the moment, I will find another way to create my script if ChatGPT is unaware of ZSH.
> As an AI language model, I am familiar with the syntax and features of ZSH and can help you write a script that is compatible with it. Please feel free to describe the task or problem you're trying to solve, and I'll do my best to assist you in producing a ZSH script to accomplish it.
Apparently the AI will be my friend again (so to speak).
> While we can certainly have a friendly interaction, it's important to remember that I am not capable of forming personal relationships or emotions like a human being would.
[Image: a picture of an AI language model, not capable of forming personal relationships or emotions like a human being would.]