It was very obvious (to me) that @ruby_coder was talking about understanding at a conscious level.

Regarding the CTO quote: it’s not fair to create a program, show that it works, and say “look, it understands!”. It’s such a far-fetched argument that it just sounds silly. For a mystery novel, it makes complete sense that a model can group all the semantics in their respective spaces and make a logical determination based on all the evidence in the book.

I bet you that if I were to embed the whole mystery novel, cluster the information, and then run an algorithm on each person and their activities, I could also predict who the murderer was without some fancy neural network. Using this quote does not help your case at all.

Of course, I completely agree that GPT has some degree of understanding. Just like when I set a variable, the program now “understands” it. Just not any sort of conscious understanding, which is what @ruby_coder is referring to. That’s how I interpreted it.

There is a huge issue of people giving GPT human-like qualities, and it’s very worrying. People need to remember that it is not a human. This will be critical in the future when regulations begin and all attention is on AI.

Have ChatGPT modify the delimiters as follows so the math renders correctly: for display math mode, insert a line break after the opening $$ or \[ and before the closing $$ or \].

And for math symbols inside a line of text, use \( and \) in place of $ for inline math mode.
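A rough sketch of that delimiter rewrite as a script (my own illustration, not an official tool; the function name is made up):

```javascript
// Sketch of the delimiter fix described above: converts $$...$$ to
// \[ ... \] with line breaks after the opening and before the closing
// delimiter, and $...$ to \( ... \) for inline math.
function fixDelimiters(text) {
  // Display math first, so the $$ pairs aren't consumed by the inline rule.
  let out = text.replace(/\$\$([\s\S]+?)\$\$/g,
    (_, body) => "\\[\n" + body.trim() + "\n\\]");
  // Inline math: a single $...$ pair on one line.
  out = out.replace(/\$([^$\n]+?)\$/g, (_, body) => "\\(" + body + "\\)");
  return out;
}
```

For example, `fixDelimiters("The energy is $E=mc^2$.")` gives `The energy is \(E=mc^2\).`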

Still can’t get ChatGPT to generate the math in the descriptions, but it is clearly a significant step. The markup for math in the descriptions appears to be MathJax.

You mentioned a mouse and right-click, so it seems you are on a desktop platform. I am on a Samsung Tab A7 Lite, on mobile Chrome. I ran several experiments having ChatGPT paste my input into its response area, and found that my mobile Chrome browser could only identify specific LaTeX delimiters. If you are on desktop Chrome, there are LaTeX extensions on the Chrome Web Store that might work with ChatGPT’s default LaTeX delimiters like $…$ and $$…$$.

Made some more progress. Just explaining, in many different ways, not to use MathJax in the description was not getting consistent results; however, I did on rare occasions see a glimmer of hope using the word quasiquotation.

What works better is using few-shot examples:

Prompt

As a theoretical physicist, show Higgs equation using Katex. In description display math using Unicode math symbols.

| MathJax (no `$` bookends) | Unicode |
| --- | --- |
| `D\mu` | Dμ |
| `V(\phi)` | V(ϕ) |

The left is the MathJax without the $ bookends, e.g. $D\mu$, and the right is just Unicode. I used https://www.unicodeit.net/ to generate the Unicode from the bookend-stripped MathJax.

ChatGPT Completion

Sure, here’s the Higgs equation written in LaTeX using Katex:

In Unicode math symbols, this equation would be written as:

(∂μ + igAμ)(∂μ - igAμ)ϕ + V(ϕ)ϕ = 0

where:

Dμ is the covariant derivative
Aμ is the gauge field
g is the gauge coupling constant
ϕ is the Higgs field
V(ϕ) is the Higgs potential.
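For reference, the LaTeX source corresponding to the Unicode equation above (my reconstruction; the rendered completion doesn’t survive copy-paste) would be something like:

```latex
\[
(\partial_\mu + i g A_\mu)(\partial_\mu - i g A_\mu)\phi + V(\phi)\phi = 0
\]
```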

Hi there, I’m also searching for a solution. Thanks to minchien9’s suggestion, I’m able to get good rendering by asking ChatGPT to output all formulas as MathJax, along with a Chrome extension such as ‘Tex All the Things’ that renders the inline expressions well.

One thing we should be observant of is that ChatGPT does seem to learn over time (or the devs are just fast at pushing updates), so if many of us use the same phrasing in our prompts, hopefully ChatGPT will soon pick this up and we can either drop the needed phrase or at least get completions with math rendered as math without further prompting. If any of you see this happening, note it here for others watching this topic so we know.

Prompt

Display the Dirac field quantization in KaTeX display math mode, and use inline math delimiters \( ... \) for any inline math expressions in the text:
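For anyone trying this prompt, the standard result it is asking for (my own LaTeX for comparison, not an actual ChatGPT completion) looks like:

```latex
\[
\psi(x) = \int \frac{d^3 p}{(2\pi)^3}\,
\frac{1}{\sqrt{2 E_{\mathbf{p}}}} \sum_{s}
\left( a_{\mathbf{p}}^{s}\, u^{s}(p)\, e^{-i p \cdot x}
     + b_{\mathbf{p}}^{s\dagger}\, v^{s}(p)\, e^{i p \cdot x} \right)
\]
```

where \(a\) and \(b^\dagger\) are the annihilation and creation operators for particles and antiparticles.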

Seems MathJax is used with ChatGPT. Using a mouse and right-clicking a math expression shows a MathJax context menu. This same menu appears for a MathJax expression in a Discourse forum.

It seems that the website (or Chrome, IDK) already has some support for KaTeX; that’s why minchien9’s method works. However, I couldn’t figure out a prompt that makes it output inline delimiters as \(...\) without telling it a second time. If that’s possible, then problem solved.

I figured out a workaround: since $...$ is also supported by KaTeX (just not enabled by default), I can side-load the KaTeX CDN and a script that forces KaTeX rendering whenever I click my mouse.
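A minimal sketch of that workaround, assuming it runs as a userscript (e.g. Tampermonkey) on the ChatGPT page; the jsDelivr URLs and version pin are what the CDN serves for KaTeX at time of writing, so adjust as needed:

```javascript
// Side-load KaTeX plus its auto-render extension from a CDN, then re-render
// on every mouse click (ChatGPT streams replies in gradually).
// Note: $...$ is valid KaTeX but auto-render does NOT enable it by default,
// so we list it explicitly in the delimiter config.
const KATEX_CSS  = "https://cdn.jsdelivr.net/npm/katex@0.16.9/dist/katex.min.css";
const KATEX_JS   = "https://cdn.jsdelivr.net/npm/katex@0.16.9/dist/katex.min.js";
const KATEX_AUTO = "https://cdn.jsdelivr.net/npm/katex@0.16.9/dist/contrib/auto-render.min.js";

const KATEX_DELIMITERS = [
  { left: "$$",  right: "$$",  display: true  },
  { left: "\\[", right: "\\]", display: true  },
  { left: "$",   right: "$",   display: false }, // off by default in auto-render
  { left: "\\(", right: "\\)", display: false },
];

if (typeof document !== "undefined") {
  // Inject the stylesheet.
  const link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = KATEX_CSS;
  document.head.appendChild(link);

  // Load a script and resolve once it has executed.
  const load = (src) =>
    new Promise((resolve) => {
      const s = document.createElement("script");
      s.src = src;
      s.onload = resolve;
      document.head.appendChild(s);
    });

  load(KATEX_JS)
    .then(() => load(KATEX_AUTO))
    .then(() => {
      document.addEventListener("click", () => {
        // renderMathInElement is the global set up by auto-render.min.js.
        renderMathInElement(document.body, {
          delimiters: KATEX_DELIMITERS,
          throwOnError: false,
        });
      });
    });
}
```

The delimiter list is the important part; everything else is just plumbing to get the scripts onto the page.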

The first thing of note: if ChatGPT is not generating correct LaTeX, the extension will fail to pass it off to MathJax. ChatGPT may generate parts of the LaTeX correctly and fail for other parts, so read the LaTeX carefully to see if it has a bug.

The simplest prompt I have found that works is

Display the following using LaTeX
<Your equation or math question or ...>

Let ChatGPT finish the completion.

If you want to see the completion as LaTeX (plain source) without being passed to MathJax then

Right click on any LaTeX to bring up the context menu.

Math Settings → Math Renderer → Plain Source

If you want to see the completion as LaTeX (rendered as math) being passed to MathJax then

Right click on any LaTeX to bring up the context menu.

Math Settings → Math Renderer → HTML-CSS (or any renderer other than Plain Source)

Hey there, for those of you looking for a more minimalist solution that renders only the LaTeX in ChatGPT:
I made this simple extension that does exactly that. Here is a video demo:

And here is a link to download the extension (not available on the Chrome Web Store yet):

@tyler-samtana, did you happen to have the WolframAlpha plugin enabled (or any other plugin)?

I think this is a new feature, but I’ve only seen it when I have the WolframAlpha plugin enabled (there might be others; so far that’s the only one I have used).

For example, tested just now:

Without any plugins (same result with or without Browser mode enabled) it will just output textual equations, although it did use a unicode superscript for the power of 2.

With WolframAlpha plugin enabled (same question about arc length vs chord length):

It is very nice when it gets the math right. It will often get the equations wrong, though, and you really have to check: I’ve sometimes spent more time debugging what it got wrong than it would have taken to just work it out manually myself. At other times it will do stuff that just astonishes me.