I am trying to fine-tune gpt-4o-2024-08-06 following the guidelines at:
https://platform.openai.com/docs/guides/vision-fine-tuning
I have created JSONL files for training and validation with embedded base64-encoded images. train.jsonl is around 610 MB.
When the fine-tuning program starts, it tries to upload train.jsonl, but it exits with the following error:
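For reference, each training line embeds the image as a base64 data URL inside the message content, roughly like this sketch (the function name and the prompt/label fields are illustrative; the record shape follows the vision fine-tuning format from the guide above):

```python
import base64
import json

def make_example(image_path: str, prompt: str, label: str) -> str:
    # Read the image and embed it as a base64 data URL directly in the JSONL line.
    # This is what makes the resulting file so large.
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    record = {
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
                ],
            },
            {"role": "assistant", "content": label},
        ]
    }
    # One JSON object per line, per the JSONL convention
    return json.dumps(record)
```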
```
[INFO] OpenAI client initialized with extended timeout.
[INFO] Uploading Training JSONL: data/train_base64.jsonl (610.9 MB)
Starting upload of 610.9MB…
Uploading: 610.9MB / 610.9MB (100.0%)
[ERROR] Failed to upload Training after 1 attempts: Error code: 400 - {'error': {'message': 'File is too large.', 'type': 'invalid_request_error', 'param': None, 'code': None}}
[ERROR] Aborting due to upload failure.
```
Is there a limit on the size of a JSONL file? If yes, what is the limit?
How can we train a model if our dataset has more than 1000 images?
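One workaround I am considering is to split the large JSONL into several smaller shards and upload each one separately. A minimal sketch of the splitter (the 400 MB cap is a placeholder I picked, not a documented limit; it keeps each example's JSON line intact rather than cutting mid-record):

```python
import os

def shard_jsonl(path: str, max_bytes: int = 400 * 1024 * 1024) -> list:
    """Split a JSONL file into shards, each at most max_bytes.

    max_bytes is a placeholder cap to stay under whatever per-file
    upload limit applies; adjust once the real limit is known.
    Shard boundaries always fall on line boundaries, so every shard
    is itself valid JSONL. A single line larger than max_bytes still
    gets its own shard rather than being truncated.
    """
    shards = []
    out = None
    written = 0
    with open(path, "rb") as src:
        for line in src:
            # Start a new shard when the next line would overflow the cap
            if out is None or written + len(line) > max_bytes:
                if out is not None:
                    out.close()
                shard_path = "%s.part%d" % (path, len(shards))
                shards.append(shard_path)
                out = open(shard_path, "wb")
                written = 0
            out.write(line)
            written += len(line)
    if out is not None:
        out.close()
    return shards
```

Concatenating the shards back together reproduces the original file byte for byte, so no examples are lost or altered.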