Future of Mathematical Notation

Hi everybody

In the months I've been using GPT and GPT-powered tools, I've encountered one constant: mathematical notation never works 100% of the time. What I mean is that if you prompt a model to use specific delimiters and a notation format such as MathML or KaTeX, it will not stick to them, no matter how you phrase the prompt or which role (system or user) carries the instruction.

I've seen this in my own development work, where mathematical notation is an integral part of the application (education and science), but also on all sorts of other platforms. I'm writing this post now because I've seen an estimated 70% failure rate on Khan Academy's Khanmigo platform for teachers. It's an official partnership with Microsoft or OpenAI - I assume the developers worked closely together - and still this (major) issue exists.

Personally, I use KaTeX to render the math and, over the course of 8 months, have managed to get it to render correctly about 90% of the time - which is still far too little.
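For context, here is a minimal sketch of the kind of client-side normalization I mean: it rewrites the delimiter variants the model tends to emit (`\(...\)`, `\[...\]`, `$$...$$`, `$...$`) into KaTeX render calls before the text hits the page. The regexes and options are illustrative assumptions, not the exact code I run.

```typescript
import katex from "katex";

// Convert the mixed delimiter styles a model may emit into rendered HTML.
// This is a sketch: the patterns below cover the common cases only and can
// misfire on literal dollar signs in prose.
function renderMath(text: string): string {
  // Display math: \[...\] or $$...$$ (handled first so single-$ matching
  // below never splits a $$ pair).
  text = text.replace(/\\\[([\s\S]+?)\\\]|\$\$([\s\S]+?)\$\$/g, (_, a, b) =>
    katex.renderToString(a ?? b, { displayMode: true, throwOnError: false })
  );
  // Inline math: \(...\) or $...$
  text = text.replace(/\\\(([\s\S]+?)\\\)|\$([^$\n]+?)\$/g, (_, a, b) =>
    katex.renderToString(a ?? b, { displayMode: false, throwOnError: false })
  );
  return text;
}
```

Even with a normalization layer like this, the model still occasionally emits bare Unicode symbols or malformed TeX, and that part no amount of client-side post-processing can fix - which is why I think the fix has to happen on the training side.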

I propose that OpenAI decide on a standard for mathematical notation and train and fine-tune their existing and future models on it, so this issue can be dealt with once and for all.

What do you think?
