Does ChatGPT use the Davinci model that is publicly available, or an internal model that is not available to us?
It uses “GPT-3.5”, which someone on Twitter implied has a larger token window, but I don’t know if it has any other structural differences. It seems to be mostly a finetuned model.
I’ve also heard text-davinci-003 referred to as GPT-3.5. It would be nice to know what the actual technical differences are.
A blog post from Stream.io analyzed the requests, and the model appears to be a variation of davinci-002.
If I send the same prompt to the davinci models in the sandbox and to ChatGPT, ChatGPT gives much longer responses with more detail. I wonder why.
Different settings / models.
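To illustrate the “different settings” point: response length in the sandbox is heavily influenced by sampling parameters. Below is a minimal sketch that just assembles two Completions-style request payloads for the same prompt with different token budgets; the parameter names follow the OpenAI Completions API, but the specific values are illustrative guesses, not ChatGPT’s actual configuration.

```python
# Hypothetical sketch: same prompt, two different sampling configurations.
# Parameter names follow the OpenAI Completions API; the values are
# illustrative assumptions, not ChatGPT's real settings.

def build_request(model: str, prompt: str, **settings) -> dict:
    """Assemble a Completions-style request payload as a plain dict."""
    payload = {"model": model, "prompt": prompt}
    payload.update(settings)
    return payload

prompt = "Explain how attention works in transformers."

# Typical sandbox defaults: a small max_tokens cap truncates long answers.
sandbox = build_request("text-davinci-003", prompt,
                        temperature=0.7, max_tokens=256)

# A longer-form configuration: same model, much larger token budget.
long_form = build_request("text-davinci-003", prompt,
                          temperature=0.7, max_tokens=1024)

print(sandbox["max_tokens"], long_form["max_tokens"])
```

So even with an identical underlying model, raising `max_tokens` (and tuning temperature or penalties) can account for much of the difference in response length.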