Why can't GPT-3 learn a new word and use it?

I have noticed the engine does not have a definition of the word “penultimate”. I tried to teach it; here is how it went.

You’re abstracting it too far.

It gets the core idea, but as soon as you ask it to do math (i.e., “the xth word in the sentence”), it breaks apart. GPT-3 cannot do math! It just repeats things it has seen. Since it has seen no sentences with “penultimate” (or very few), it doesn’t know the correlation between the xth word in a sentence and “penultimate”.

Also, the davinci series has a limit of 4K tokens, and the longer you speak with it, the more your earlier content gets summarized to fit, potentially losing context and explanations you gave at an earlier stage.


Here is the last attempt. Apparently it can count somewhat (I tried) and it can also learn.

Moreover, the current ChatGPT model is not based on the DaVinci model but on something different (I asked), with a higher token limit than DaVinci's.

Sometimes a bit of prompt engineering is useful. See, for example, where ChatGPT even blew past my typo to get the correct answer:

Since ChatGPT is a language model and not a calculator, I find it helps to engineer prompts as if we are talking to a language-model AI and not a calculator.

HTH

Note: When I need ChatGPT to behave “as a calculator”, I generally ask it to write a method (in my favorite programming language), explaining that I want a method to calculate given “this and that” (what I want to calculate and how), and ChatGPT consistently writes a nice, working program to do the calculation.
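As a sketch of what that looks like for the “penultimate word” task from this thread (the function name, language, and punctuation handling are my own choices, not something ChatGPT produced here):

```python
def penultimate_word(sentence: str) -> str:
    """Return the second-to-last word of a sentence.

    Punctuation handling is minimal: a "word" is whatever
    str.split() produces when splitting on whitespace.
    """
    words = sentence.split()
    if len(words) < 2:
        raise ValueError("sentence must contain at least two words")
    return words[-2]

# The model struggles to answer this reliably, but the code is trivial:
print(penultimate_word("GPT-3 is a language model, not a calculator."))  # a
```

The point is exactly the one made above: counting positions in a sentence is a deterministic computation, so it's easier to get the model to write the computation than to perform it.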