Token limits depend on the model you select. For gpt-3.5-turbo-1106, the maximum context length is 16,385 tokens, so each training example is also limited to 16,385 tokens. For gpt-3.5-turbo-0613, each training example is limited to 4,096 tokens.
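In practice you can sanity-check each training example against these limits before uploading. Here is a minimal sketch using the tiktoken library; the `MAX_TOKENS` table and `num_tokens` helper are illustrative, and the per-message overhead of 3 tokens is the approximation from the OpenAI cookbook, which may drift across model versions:

```python
import tiktoken

# Illustrative limits taken from the docs quoted above.
MAX_TOKENS = {
    "gpt-3.5-turbo-1106": 16_385,
    "gpt-3.5-turbo-0613": 4_096,
}

def num_tokens(example: dict) -> int:
    """Approximate token count of one fine-tuning example (a dict
    with a "messages" list), per the OpenAI cookbook's estimate."""
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    n = 0
    for message in example["messages"]:
        n += 3  # per-message overhead (cookbook estimate)
        for value in message.values():
            n += len(enc.encode(value))
    return n + 3  # replies are primed with <|start|>assistant<|message|>

example = {"messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]}
model = "gpt-3.5-turbo-1106"
assert num_tokens(example) <= MAX_TOKENS[model], "example exceeds the token limit"
```

Running this over every line of your JSONL file before uploading avoids a failed fine-tuning job due to over-long examples.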
Used AI to change the title. It was just too long.
Used AI to correct the syntax… to 2 too
Used my knowledge of the documentation to link to the current state of affairs:
https://platform.openai.com/docs/guides/fine-tuning/token-limits