Fine Tuning, job failed due to an internal error

I’m trying to fine-tune GPT-3.5-turbo-1106. The training file uploads correctly, but when the job runs I get the following events:
Created fine-tuning job: …
Validating training file: …
Files validated, moving job to queued state
Fine-tuning job started
The job failed due to an internal error, re-enqueued for retry
Fine-tuning job started
The job failed due to an internal error, re-enqueued for retry
Fine-tuning job started
The job failed due to an internal error
An example conversation from my training data is as follows:
{"messages": [{"role": "system", "content": "You are an assistant for managing product purchase orders. Your task is to extract the item code, the quantity and any requested characteristics for each product the customer asks for. If the customer refers to 'the usual one' or something similar for a given product, use ask_database to look up in the 'product_purchased_by_the_customer' table the most purchased product for that product category."}, {"role": "user", "content": "Email: xzcvb@prova.com, Subject: Urgent order, Text: Good morning, I would like to buy two QWERT456"}, {"role": "assistant", "content": "", "function_call": {"name": "ask_database", "arguments": "SELECT pr.item_number, pr.warehouse_quantity FROM product AS pr WHERE pr.item_number = 'QWERT456';"}}, {"role": "function", "name": "ask_database", "content": "[('QWERT456', 10)]"}, {"role": "assistant", "content": "", "function_call": {"name": "order_products", "arguments": "'lista_codici_articoli': ['QWERT456'], 'lista_quantita_articoli': [2], 'lista_caratteristiche_prodotto': ['None'], 'email': '[my email]'"}}]}
At first I thought the problem was the number of tokens, so I cut the file down to 10 examples (7,746 tokens in total); I don’t think that’s the issue here.
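To rule out a formatting problem, a quick local check of each line can help (just a sketch: the filename training_data.jsonl is a placeholder, and the checks only mirror the chat format shown in the example above, where function_call arguments are expected to be a JSON-encoded string):

```python
import json

ALLOWED_ROLES = {"system", "user", "assistant", "function"}

# Placeholder filename; point it at the actual training file.
with open("training_data.jsonl", encoding="utf-8") as f:
    for lineno, line in enumerate(f, start=1):
        line = line.strip()
        if not line:
            continue
        try:
            example = json.loads(line)  # every line must be a standalone JSON object
        except json.JSONDecodeError as err:
            print(f"line {lineno}: invalid JSON ({err})")
            continue
        messages = example.get("messages") if isinstance(example, dict) else None
        if not isinstance(messages, list) or not messages:
            print(f"line {lineno}: missing or empty 'messages' list")
            continue
        for msg in messages:
            if msg.get("role") not in ALLOWED_ROLES:
                print(f"line {lineno}: unexpected role {msg.get('role')!r}")
            call = msg.get("function_call")
            if isinstance(call, dict):
                try:
                    # 'arguments' should itself parse as JSON (it is a JSON-encoded string).
                    json.loads(call.get("arguments", ""))
                except json.JSONDecodeError:
                    print(f"line {lineno}: function_call.arguments is not valid JSON")
print("check finished")
```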
Does anyone know where the problem might be?

Months later and I am running into a similar issue. What was the problem, and how did you fix it? Or has anyone else had a similar issue? Thanks!

I believe in this case it would be good to provide about 10 lines of your example data, so someone else can try to replicate the error on their side and help; with just this one example and the error message it is very unlikely that anyone can guess what is going on. What I can do here is:

I will provide 2 lines of my example so you can compare them with your data.

{"custom_id": "0", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Was Abraham Lincoln the sixteenth President of the United States?"}], "max_tokens": 1000}}
{"custom_id": "1", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Was Abraham Lincoln the sixteenth President of the United States?"}], "max_tokens": 1000}}