As we dug into this it became clear most of these new ChatGPT users were trying to use our geocoding API for an entirely different purpose. It seems ChatGPT is wrongly recommending us for “reverse phone number lookup” — i.e. the ability to determine the location of a mobile phone based solely on the number. This is not a service we provide. It is not a service we have ever provided, nor a service we have any plans to provide. Indeed, it is not a service we are technically capable of providing.
And yet ChatGPT has absolutely no problem recommending us for this service (complete with Python code you can cut and paste), as you can see in this screenshot.
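To make the distinction concrete, here is a purely illustrative sketch of what a geocoding API actually does (the function names and data below are made up for illustration, not our real API): it maps place names or addresses to coordinates, and coordinates back to places. Note what is absent from the data model entirely.

```python
# Illustrative sketch only: function names and the tiny dataset are
# hypothetical, but the shape of the lookups mirrors real geocoding.

def forward_geocode(query: str):
    """Forward geocoding: map a place name to (latitude, longitude)."""
    # A real service queries a large places database; this stub
    # just illustrates the shape of the lookup.
    places = {
        "Philadelphia, PA": (39.9526, -75.1652),
        "Berlin, Germany": (52.5200, 13.4050),
    }
    return places.get(query)

def reverse_geocode(lat: float, lng: float) -> str:
    """Reverse geocoding: map coordinates to a place description."""
    # Illustrative only: a crude proximity check against one city.
    if abs(lat - 39.9526) < 0.5 and abs(lng + 75.1652) < 0.5:
        return "Philadelphia, PA"
    return "unknown"

# There is deliberately no phone_number_to_location() here: geocoding
# data relates places and coordinates, and contains nothing that could
# link a phone number to a location.
```

Both directions operate on places and coordinates; a phone number is simply not an input the underlying data can answer for, which is why the recommendation is nonsense no matter how plausible the cut-and-paste code looks.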
Yeah, sadly, most do not use their brains most of the time.
Read some comments that AI Optimization might be the new SEO… getting AI to recommend your product/service/brand… We’ve had a few questions like that come up… i.e. who decides “the best 3 sandwich shops in Philly”…
That’s going to depend on the RLHF and training data. If the org that’s training the model doesn’t cut out the existing SEO content from the data, it will be transferred to the model and shape how the model behaves. So, in my opinion, the SEO of the web will be transferred to the model unless it’s dealt with.
This also makes me wonder: until now we were worried about how disinformation would impact human behavior; now we have to worry about how it might impact LLMs.