Why Does ChatGPT Think It Can Do Math?

ChatGPT can do simple math, but if you ask it to do something a little more complex, like 300 * 123 * 456, it gets it wrong every time. Why doesn’t it just say, as it does for other tasks it can’t do, “As an AI language model, I don’t …”?

Hi
It doesn’t “know” whether it can or cannot do maths (NZ English spelling). If it “knows” anything, it is which string of characters is most likely to follow another string of characters. From that, you get a result. It’s up to you to judge whether that result is right or wrong; the model itself can’t.

HTH

:upside_down_face:

But if you ask ChatGPT whether it can multiply three 3-digit numbers, it says it can, but it really can’t. My question is more philosophical. I’m trying to better understand how ChatGPT works under the hood. It seems like a mystery to me, and I’d like to understand why ChatGPT behaves like this.

I’m not an expert, but I believe you can think of it as a large autocomplete engine.

It takes your input and determines the most likely word to follow it. Then it determines the most likely word to follow the input plus that word. And so on.
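
Here’s a toy sketch of that loop in Python. Everything in it is made up for illustration: the table is a hypothetical stand-in for a trained model (and only looks at the previous token, where a real model scores every token in its vocabulary against the whole conversation so far), and the probabilities and the “answer” digits are invented.

```python
# Toy sketch of greedy next-token generation. NEXT_TOKEN_PROBS is a
# hypothetical stand-in for a trained model: "after this token, these
# tokens tend to follow, with these probabilities". All values invented.
NEXT_TOKEN_PROBS = {
    "300": {"*": 0.8, "+": 0.2},
    "*": {"123": 0.5, "456": 0.5},
    "123": {"*": 0.7, "=": 0.3},
    "456": {"=": 0.9, "*": 0.1},
    # The digits below are just a statistically plausible continuation;
    # nothing in this sketch computes arithmetic.
    "=": {"16830000": 0.6, "<end>": 0.4},
    "16830000": {"<end>": 1.0},
}

def most_likely_next(last_token: str) -> str:
    """Pick the single most probable continuation."""
    probs = NEXT_TOKEN_PROBS.get(last_token, {"<end>": 1.0})
    return max(probs, key=probs.get)

tokens = ["300", "*", "123", "*", "456", "="]
while tokens[-1] != "<end>":
    tokens.append(most_likely_next(tokens[-1]))

print(" ".join(tokens[:-1]))  # -> 300 * 123 * 456 = 16830000
```

Notice there is no multiplication anywhere in that loop. The true product is 16,826,400, but a plausible-looking wrong number wins if that’s what the learned statistics favor.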

The mistake is to think of it as a calculator executing an exact procedure: it is predicting text, not computing arithmetic.

I hope that helps :upside_down_face:
