How to find the best combination of batch size, learning rate multiplier (LRM), and epochs

Hi, does anyone know how to find the best batch size, learning rate multiplier (LRM), and number of epochs for the fine-tuning process?

If anyone has found a reliable way to choose these fine-tuning parameters, please share it; we are losing a lot of money on trial and error.

Hi there! Because the approach to fine-tuning is very use-case specific, there is no generic best-practice combination of these parameters.

If you can share more details about what you are looking to achieve with fine-tuning and the nature of the dataset you are using, fellow Forum members with similar use cases will be more likely to share the batch sizes and other parameter values that led to successful outcomes for them.
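In the meantime, for reference: the three values you mention can be set explicitly when creating a fine-tuning job rather than left to the API defaults. Here is a minimal sketch assuming the current openai Python SDK; the file ID, base model, and hyperparameter values are placeholders, not recommendations.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder file ID and model; substitute your own uploaded
# training file and the base model you are fine-tuning.
job = client.fine_tuning.jobs.create(
    training_file="file-abc123",
    model="gpt-3.5-turbo",
    hyperparameters={
        "n_epochs": 3,                  # passes over the training set
        "batch_size": 8,                # examples per gradient update
        "learning_rate_multiplier": 2,  # scales the base learning rate
    },
)
print(job.id, job.status)
```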

While it sounds like you have already spent quite a bit of time on the process, I am including the link to the documentation again just in case. OpenAI has been expanding it with specific guidance on data quantity, data quality, and the other parameters, so it may be worth a second look if you have not revisited it recently.
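One way to cut down on the cost of trial and error is to run a small sweep on a subsample of your data with a validation file, compare the validation loss each job reports, and only then commit the best setting to a full-data run. A rough sketch under the same assumptions as above (placeholder file IDs; the candidate values are illustrative, not recommendations):

```python
from openai import OpenAI

client = OpenAI()

# Illustrative candidate settings; size the grid to your budget.
candidates = [
    {"n_epochs": 2, "batch_size": 4,  "learning_rate_multiplier": 0.5},
    {"n_epochs": 3, "batch_size": 8,  "learning_rate_multiplier": 2},
    {"n_epochs": 4, "batch_size": 16, "learning_rate_multiplier": 4},
]

job_ids = []
for hp in candidates:
    # Placeholder file IDs; use a small subsample of your data
    # for the sweep to keep training costs down.
    job = client.fine_tuning.jobs.create(
        training_file="file-train-subset",
        validation_file="file-validation",
        model="gpt-3.5-turbo",
        hyperparameters=hp,
        suffix=f"sweep-e{hp['n_epochs']}-b{hp['batch_size']}",
    )
    job_ids.append(job.id)

# Once the jobs finish, compare the validation loss in each job's
# result files and rerun the best setting on the full dataset.
print(job_ids)
```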