Hello,
I have a fairly large dataset, so I want to use the Batch API with my fine-tuned model. How can I do this, and which endpoint should I call?
I am following the Batch API tutorial; the examples there use gpt-3.5-turbo-0125 as the model. How do I use my fine-tuned model's ID instead?
Here is an example request from the tutorial:
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo-0125", "messages": [{"role": "system", "content": "You are a helpful assistant."},{"role": "user", "content": "Hello world!"}],"max_tokens": 1000}}