Ensuring a Single Model Is Trained Across Multiple Fine-Tuning Data Files

How can I make sure that exactly one model is trained when the training data for fine-tuning is split across multiple files?

Concatenating multiple training files into one file is not an option.

Please excuse me if this is a stupid question. I could not find an answer either here in the forum or in the documentation.

Currently you cannot re-tune an already fine-tuned GPT-3.5-Turbo model, but you can with the other base models.
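
For those older base models, continued training worked by passing an existing fine-tuned model as the base of a new job, so each additional training file could be applied in a successive run. A minimal sketch using the legacy fine-tunes endpoint (openai-python < 1.0); the training-file ID and model name here are made-up placeholders:

```python
import openai  # openai-python < 1.0 (legacy /v1/fine-tunes endpoint)

# Continue training from a previously fine-tuned base model by passing
# its name as `model`. Both values below are hypothetical placeholders.
resp = openai.FineTune.create(
    training_file="file-XXXXXXXX",                  # hypothetical uploaded file ID
    model="curie:ft-your-org-2023-01-01-00-00-00",  # existing fine-tuned model
)
print(resp["id"])
```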

So for gpt-3.5-turbo the only option would be to concatenate the files into one and run a single fine-tuning job on the merged file, as sketched below.
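
A minimal sketch of that approach, assuming each data file is already in the JSONL format that GPT-3.5-Turbo fine-tuning expects; the file names are hypothetical, and the upload/job calls use the openai-python >= 1.0 client:

```python
import json
from openai import OpenAI

parts = ["train_part1.jsonl", "train_part2.jsonl"]  # hypothetical file names
merged = "train_merged.jsonl"

# Merge the files line by line; each line of a JSONL training file is one record.
with open(merged, "w", encoding="utf-8") as out:
    for path in parts:
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue          # skip blank lines between records
                json.loads(line)      # fail early on malformed records
                out.write(line + "\n")

# Upload the merged file once and start exactly one fine-tuning job on it.
client = OpenAI()
uploaded = client.files.create(file=open(merged, "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(
    training_file=uploaded.id,
    model="gpt-3.5-turbo",
)
print(job.id)
```

Because only one training file is uploaded and only one job is created, exactly one model comes out of the run, regardless of how many files the data was originally split across.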