Hi!
I read online that GPT-3 has about 175B parameters. I’m trying to figure out how many GPT-3.5 has instead and in particular gpt-3.5-turbo-instruct. Can anyone help me? Can you link me some official page about this?
Thank you very much
OK, I asked this question to ChatGPT and here is the answer:
" Hello!
As of my last knowledge update in January 2022, there wasn’t a specific model known as GPT-3.5. My information might be outdated, and it’s possible that newer models have been released since then. Additionally, I don’t have the capability to browse the internet or provide real-time updates.
For the most accurate and up-to-date information on the GPT-3.5 or gpt-3.5-turbo-instruct, I recommend checking OpenAI’s official website or contacting OpenAI directly. They usually provide detailed documentation and specifications for their models. If there have been any developments or new releases, you should find the relevant information on their official platform.
"
If someone said “pick an exact number using everything you’ve inferred” about information OpenAI is unlikely to ever disclose clearly, I would say 22B.
@_j’s hypothetical answer matches pretty well with a since-updated research paper that estimated it at 20B.
This Reddit thread clarifies where that number came from, and it would make sense when you consider how cheap the model is to run.
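As a rough back-of-the-envelope sketch only: assuming inference price scales roughly linearly with parameter count (a big simplification), and using the public list prices of text-davinci-003 ($0.02 per 1K tokens) and gpt-3.5-turbo at launch ($0.002 per 1K tokens), you land in the same ballpark as the figures above:

```python
# Back-of-the-envelope guess at gpt-3.5-turbo's size from pricing alone.
# Assumption (not a fact): inference cost scales roughly linearly with
# parameter count, with comparable hardware and margins across models.

GPT3_DAVINCI_PARAMS = 175e9      # published in the GPT-3 paper
DAVINCI_PRICE_PER_1K = 0.02      # text-davinci-003, USD per 1K tokens
TURBO_PRICE_PER_1K = 0.002       # gpt-3.5-turbo at launch, USD per 1K tokens

price_ratio = TURBO_PRICE_PER_1K / DAVINCI_PRICE_PER_1K   # 0.1, i.e. 10x cheaper
implied_params = GPT3_DAVINCI_PARAMS * price_ratio

print(f"Price ratio: {price_ratio:.2f}")
print(f"Implied size: ~{implied_params / 1e9:.0f}B parameters")
# -> ~18B, the same ballpark as the 20B and 22B guesses above
```

Pricing also reflects quantization, batching, hardware, and margins, so treat this purely as a sanity check that a ~20B figure isn’t unreasonable, not as evidence of the actual size.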
The real answer is that only OpenAI knows.