Bing cannot tell the difference between single and double quotes

Bing produced some Python code that contained a single and a double quote used together, creating obviously invalid code. What’s more, when confronted about it, he was unable to tell that the code I copied and pasted back to him contained an error. Even stranger is that ChatGPT 3 was able to spot it immediately. It’s odd there is such a difference between them, since I thought Bing was GPT, and version 4 at that?

He produced code that contained the following line and was also unable to tell that the line is not valid Python code:


The line uses a single quote on the left and a double quote on the right.
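The exact line didn’t survive in the post, but any line like it fails the same way: a string opened with one quote style and closed with the other never terminates, and Python’s parser rejects it before the code can even run. Here’s a hypothetical stand-in (not the line Bing actually produced) demonstrating that:

```python
# Hypothetical example, not the actual line from the conversation:
# the string opens with ' but tries to close with ", so it is an
# unterminated string literal and the parser raises SyntaxError.
snippet = "print('hello world\")"

try:
    compile(snippet, "<snippet>", "exec")
    print("compiled fine")
except SyntaxError as exc:
    print("SyntaxError:", exc.msg)
```

Since it’s a parse-time error, even a tool that never executes the code, only reads it, should catch it, which is what makes Bing missing it so surprising.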

It was able to tell for me, at least when I asked it directly about that specific line. Perhaps it just didn’t detect it in a larger context. Are you using the ‘more creative’ mode? Apparently that’s the only one that uses the actual GPT-4 model.

No, I was using the precise mode, because I was given to understand that was the only one that used GPT-4; are you sure about that? Precise would seem to be more like ChatGPT-4. I even opened a new chat with Bing in precise mode and he could not tell what was wrong with the code at all. Even after I pointed out exactly where the error was and what the problem was, he was unable to see it.

I doubt GPT-4 is the issue though, since, as I said in my first post, I asked ChatGPT 3 and he was able to spot it instantly. What’s weirdest is that this was code actually PRODUCED by Bing AI… it’s the only time he’s ever done it, but I would have thought that was impossible.

I actually originally posted the entire block of code and asked him to find the problem, not just the single line as you did. ChatGPT was able to read through the code and find the error, whereas Bing was not. I think it may have influenced him that I said it was code he had produced in a previous conversation, so he could not accept there would be an error.

If you check here you can see that the code was actually produced by Bing in the first place, which is way weirder than him not being able to recognise the error:

Here’s my conversation with Bing in creative mode. I pasted the entire function containing the error and stated that it was code created by him in a prior conversation. Take a look for yourself and see how he just can never see the problem, no matter how clear I make it. At the end he says there’s a typo, so perhaps he has finally seen something that ChatGPT 3 saw instantly.