ChatGPT has remarkable capabilities but MATH is not one of them!

After some trial and error with prompt wording, I got better results on the math problems I needed help with by asking whether my solution was correct, instead of having the bot try to figure it out on its own. I did the math myself and cross-checked with the bot to confirm I had come up with the correct answer.

Ex.

Me: If I received a 67% on a test with 45 questions, how many questions did I get correct and how many were wrong?

ChatGPT-3: restates the math equation, but before producing the output it throws a regeneration error, repeatedly, and I mean repeatedly, until you ask it to STOP.

Solution: I did the calculation on my calculator, 45 × 0.67 = 30.15 ≈ 30, then I asked in the prompt:

" If I received a 67% on a test with 45 questions, does that mean I got 30 correct and 15 wrong?

ChatGPT:

Yes, that’s correct. If you received a score of 67% on a test with 45 questions, it means you got 67% of the questions correct.

Here’s the breakdown:"

and so on…
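Since the workaround here is to do the arithmetic yourself before prompting, a minimal Python sketch of that cross-check (using the 67%-of-45 example above) might look like this; the variable names are just for illustration:

```python
# Sanity-check the example above: a 67% score on a 45-question test.
score = 0.67            # test score as a fraction
total_questions = 45

# 45 * 0.67 = 30.15, rounded to the nearest whole question
correct = round(score * total_questions)
wrong = total_questions - correct

print(f"{correct} correct, {wrong} wrong")  # 30 correct, 15 wrong
```

Running a quick check like this first gives you a trusted answer to verify the bot against, rather than trusting the bot's own arithmetic.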

GPT-3 lacks symbolic understanding. Equations involve specific symbols and syntax that may not conform to standard linguistic patterns, which impacts GPT’s ability to comprehend them accurately. GPT’s training data consists predominantly of natural-language text, and it may lack the diverse, extensive mathematical content necessary for robust equation understanding. The model’s performance depends heavily on the quality and variety of its training data.

Conclusion: The more we correct ChatGPT and help it understand math, the more efficiently it will turn our inputs into useful outputs. Keep communicating and adding to the community.

Best,

Vanessa