Strange GPT-4 Vision behavior

Has anyone else noticed odd behavior from GPT-4 Vision lately? It’s suddenly interpreting all images as sensitive, regardless of content. I’m trying to understand if this is a common issue.

A few days ago with correct answer:

And today with several retries:

And a diagram a few days ago:

And same image this evening:

Test image with kingfisher:

Test image with diagram:

Can anybody explain this behavior?

Sure. GPT-4's machine vision is currently broken in ChatGPT. The images are likely being encoded into garbage before they ever reach the model. The AI then confabulates an answer like "bird on cup" instead of telling you "I'm seeing macroblock nonsense".

Try uploading a 24- or 32-bit PNG and see whether the model can decode and describe it correctly.
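If it helps anyone test this, here is a minimal sketch for re-saving an image as a plain 24-bit RGB PNG before uploading. It assumes the Pillow library (`pip install Pillow`); the file names are just examples.

```python
from PIL import Image

def to_24bit_png(src_path: str, dst_path: str) -> None:
    # Convert any source mode (palette, grayscale, RGBA) to plain
    # 24-bit RGB and save it losslessly as PNG.
    Image.open(src_path).convert("RGB").save(dst_path, format="PNG")

# Example: create a small 32-bit (RGBA) test image, then flatten it to 24-bit.
Image.new("RGBA", (64, 64), (0, 128, 255, 255)).save("test_32bit.png")
to_24bit_png("test_32bit.png", "test_24bit.png")
```

Re-encoding like this rules out exotic formats or color profiles on your end, so if the model still misreads the image, the corruption is happening on their side.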