How does ChatGPT have such a massive token limit?

The token limit has more to do with compute than with fine-tuning. They could be throwing more hardware at ChatGPT, or they may have other improvements that allow a higher token limit.
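A back-of-the-envelope sketch of why the limit is a compute/hardware question: during generation the model caches a key and a value vector per token, per layer, so memory grows linearly with context length (and attention compute grows quadratically). The model dimensions below are hypothetical, roughly GPT-3-scale defaults for illustration, not ChatGPT's actual configuration.

```python
def kv_cache_bytes(seq_len: int,
                   n_layers: int = 96,        # hypothetical, GPT-3-scale
                   n_heads: int = 96,         # hypothetical
                   head_dim: int = 128,       # hypothetical
                   bytes_per_value: int = 2): # fp16
    """Rough size of the attention KV cache for one sequence."""
    # 2x for keys and values, cached at every layer and head.
    return 2 * n_layers * n_heads * head_dim * seq_len * bytes_per_value

for tokens in (2048, 4096, 8192):
    gib = kv_cache_bytes(tokens) / 2**30
    print(f"{tokens:5d} tokens -> ~{gib:.1f} GiB of KV cache")
```

Under these assumptions, doubling the token limit roughly doubles the per-conversation memory, which is why raising it is mostly a matter of serving hardware.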

Does anyone know whether it actually has a higher token limit, or does it just seem that way?

Good to see you back!

ETA: Similar thread here…although you were first to post! :wink: