Hi.
What is the context length of the GPT-4 model in ChatGPT? The models page in the OpenAI documentation says the chat-optimized GPT-4 has a context length of 8192, while this paper from Azure AI says that GPT-4 in ChatGPT has a context length of 4096.
Hi
Welcome to the community.
I don’t have access to GPT-4, but I have found that documentation to be reliable in that regard. Remember, these are tokens, not characters, and the limit is calculated as the sum of the input and the output. For a rough sketch of what that means in practice (the 8192 figure is the documented GPT-4 context window; the prompt size below is just a hypothetical example):
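```python
# The context window is shared between prompt and completion, so the room
# left for a reply is whatever the prompt didn't already use.
CONTEXT_WINDOW = 8192    # GPT-4 context length per the OpenAI models page

prompt_tokens = 6000     # hypothetical prompt size in tokens
max_completion = CONTEXT_WINDOW - prompt_tokens
print(max_completion)    # 2192 tokens left for the model's answer
```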
I hope this helps!
I did find out that the context length (max tokens) of GPT-4 in ChatGPT is 4095. I don’t believe this is public info, and I am not sure why.
It is public. It's right there in the docs where they list all the models and their context lengths.
GPT-4 is 8k, GPT-3.5-Turbo is 4k. They are also planning to release a GPT-4-32k, but that’s not available right now.
You can see how many tokens a string uses with the tiktoken library. That’s also in the docs. A minimal sketch (the example string is arbitrary; gpt-4 and gpt-3.5-turbo both use the cl100k_base encoding):
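```python
import tiktoken

# encoding_for_model maps a model name to its tokenizer
enc = tiktoken.encoding_for_model("gpt-4")

text = "What is the context length of the GPT-4 model in ChatGPT?"
tokens = enc.encode(text)
print(len(tokens))  # number of tokens this string consumes
```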
I am referring to the GPT-4 model available on chat.openai.com for Plus members. It seems like this version of GPT-4 only supports 4095 tokens, not the 8k/32k available via the API endpoints.