ChatGPT GPT-4 context length


What is the context length of the GPT-4 model in ChatGPT? The models page in the OpenAI documentation says chat-optimized GPT-4 has a context length of 8,192 tokens, while this paper from Azure AI says that GPT-4 in ChatGPT has a context length of 4,096.


Welcome to the community.

I don’t have access to GPT-4, but I have found that documentation to be reliable in that regard. Remember, these are tokens, not characters, and the limit applies to the sum of the input and output tokens.
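To illustrate that point, the budget arithmetic looks like this. A minimal sketch, assuming the documented 8,192-token limit; the 3,000-token prompt is a made-up example:

```python
# The context window covers prompt tokens AND completion tokens combined.
CONTEXT_LENGTH = 8192  # documented limit for chat-optimized GPT-4

prompt_tokens = 3000   # hypothetical size of the conversation so far
max_completion = CONTEXT_LENGTH - prompt_tokens

print(f"Room left for the reply: {max_completion} tokens")
```

So the longer your conversation grows, the less room is left for the model's answer.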
I hope this helps :upside_down_face:


Thanks for your response.

I was able to find an API endpoint describing the different models thanks to this repository.

Turns out that the GPT-4 backend for ChatGPT only supports 4,095 tokens :slightly_frowning_face:

Full response from backend-api/models:

{'models': [{'slug': 'text-davinci-002-render-sha',
   'max_tokens': 4097,
   'title': 'Default (GPT-3.5)',
   'description': 'Optimized for speed, currently available to Plus users',
   'tags': [],
   'qualitative_properties': {'reasoning': [3, 5],
    'speed': [5, 5],
    'conciseness': [2, 5]}},
  {'slug': 'text-davinci-002-render-paid',
   'max_tokens': 4097,
   'title': 'Legacy (GPT-3.5)',
   'description': 'The previous ChatGPT Plus model',
   'tags': [],
   'qualitative_properties': {'reasoning': [3, 5],
    'speed': [2, 5],
    'conciseness': [1, 5]}},
  {'slug': 'gpt-4',
   'max_tokens': 4095,
   'title': 'GPT-4',
   'description': 'Our most advanced model, available to Plus subscribers.\n\nGPT-4 excels at tasks that require advanced reasoning, complex instruction understanding, and more creativity.',
   'tags': [],
   'qualitative_properties': {'reasoning': [5, 5],
    'speed': [2, 5],
    'conciseness': [4, 5]}}]}
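For anyone who wants to pull the limits out of that response programmatically, here is a small sketch. It uses the payload above, trimmed to the two fields of interest:

```python
# Trimmed copy of the backend-api/models payload shown above.
response = {
    "models": [
        {"slug": "text-davinci-002-render-sha", "max_tokens": 4097},
        {"slug": "text-davinci-002-render-paid", "max_tokens": 4097},
        {"slug": "gpt-4", "max_tokens": 4095},
    ]
}

# Build a slug -> max_tokens map.
limits = {m["slug"]: m["max_tokens"] for m in response["models"]}
print(limits["gpt-4"])  # 4095
```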

I did find out that the context length (max tokens) of GPT-4 in ChatGPT is 4,095. I don’t believe this is public info, and I am not sure why.


It is public. It’s right there in the docs, where they list all the models and their context lengths.
GPT-4 is 8k; GPT-3.5-Turbo is 4k. They are also planning to release a GPT-4-32k, but that’s not available right now.
You can see how many tokens a string uses with the tiktoken library. That’s also in the docs.
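A short sketch of counting tokens with tiktoken. The cl100k_base encoding is the one GPT-4 uses; the character-based fallback is only a rough approximation for when the library isn’t installed, not an official rule:

```python
try:
    import tiktoken  # pip install tiktoken
    _enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4

    def count_tokens(text: str) -> int:
        """Exact token count for GPT-4-style models."""
        return len(_enc.encode(text))
except ImportError:
    # Rough fallback: English text averages ~4 characters per token.
    def count_tokens(text: str) -> int:
        return max(1, len(text) // 4)

print(count_tokens("How long is the GPT-4 context window?"))
```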

I am referring to the GPT-4 model available to Plus members. It seems this version of GPT-4 only supports 4,095 tokens, not the 8k/32k offered by the API endpoints.


Indeed, it seems it is not the full 8k context, and 4k sounds about right. After a long conversation, GPT-4 forgot some earlier details. I counted the tokens using the transformers library in Python, from the beginning to the end of the conversation, and the total was ~6k.
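That forgetting is consistent with a sliding-window truncation: once the running total exceeds the limit, the oldest messages fall out of the window. A hypothetical sketch of that behavior; the 4,095 limit comes from the backend response above, but the drop-oldest strategy itself is an assumption, not OpenAI’s published implementation:

```python
MAX_CONTEXT = 4095  # limit reported by backend-api/models for gpt-4

def fit_to_window(messages, token_len, limit=MAX_CONTEXT):
    """Drop the oldest messages until the conversation fits the window."""
    kept = list(messages)
    while kept and sum(token_len(m) for m in kept) > limit:
        kept.pop(0)  # the earliest message is "forgotten" first
    return kept

# Toy example: pretend each word is one token.
history = ["a " * 2000, "b " * 2000, "c " * 2000]  # ~6k "tokens" total
visible = fit_to_window(history, token_len=lambda m: len(m.split()))
print(len(visible))  # the earliest message no longer fits
```

With a ~6k-token conversation and a ~4k window, roughly the first third of the dialogue would be invisible to the model, which matches the forgotten details.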