The Difference in Token Limits: GPT-3.5 1106 vs 0613

Token limits depend on the model you select. For gpt-3.5-turbo-1106, the maximum context length is 16,385 tokens, so each training example is also limited to 16,385 tokens. For gpt-3.5-turbo-0613, each training example is limited to 4,096 tokens.
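
To sanity-check an example against these limits before uploading a fine-tuning file, here is a rough sketch using tiktoken with the cl100k_base encoding (the one used by gpt-3.5-turbo models). The ~4-token per-message overhead is an approximation, not an exact figure from the docs:

```python
# Approximate token count for one fine-tuning example.
# Assumes cl100k_base encoding; per-message formatting overhead is estimated.
import tiktoken

MAX_TOKENS = {"gpt-3.5-turbo-1106": 16385, "gpt-3.5-turbo-0613": 4096}

def count_example_tokens(example: dict, tokens_per_message: int = 4) -> int:
    enc = tiktoken.get_encoding("cl100k_base")
    total = 0
    for message in example["messages"]:
        total += tokens_per_message                     # role / formatting overhead (approx.)
        total += len(enc.encode(message.get("content", "")))
    return total + 3                                    # priming for the assistant reply (approx.)

example = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
        {"role": "assistant", "content": "Paris."},
    ]
}

n = count_example_tokens(example)
for model, limit in MAX_TOKENS.items():
    print(f"{model}: {n} tokens, fits={n <= limit}")
```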

Used AI to change the title. It was just too long. :slightly_smiling_face:

Used AI to correct the syntax… to 2 too :wink:

Used my knowledge of the documentation to link to the current state of affairs:

https://platform.openai.com/docs/guides/fine-tuning/token-limits
