Batch API "in-progress" more than 12H for GPT-4.1-mini

Since this morning, all my batch requests have been frozen in the "in-progress" state. Would it be possible to add future features like an estimated finish time for each batch, and a status page showing the latest state (availability, queue length) of the batch service?

Batches are supposed to have a 24-hour turnaround window. It’s only really concerning if a batch takes longer than 24 hours.

The estimated finish time is always 24 hours after you created the batch. (But this feature would still be cool.)
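In the meantime, you can poll a batch's status yourself with the official `openai` Python SDK via `client.batches.retrieve`, and derive the "estimated finish" as creation time plus 24 hours. A minimal sketch (the batch id shown is hypothetical, and `OPENAI_API_KEY` is assumed to be set in your environment):

```python
from datetime import datetime, timedelta, timezone


def deadline_from_created(created_at: int) -> datetime:
    """Batches target completion within 24 hours of creation,
    so the expected finish time is created_at + 24h."""
    return datetime.fromtimestamp(created_at, tz=timezone.utc) + timedelta(hours=24)


def check_batch(batch_id: str) -> None:
    # Requires OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()
    batch = client.batches.retrieve(batch_id)
    # status is e.g. "validating", "in_progress", "completed", "failed", "expired"
    print("status:", batch.status)
    print("expected finish by:", deadline_from_created(batch.created_at))


# Usage (hypothetical id):
# check_batch("batch_abc123")
```

If the status is still `in_progress` past the 24-hour mark, that's when it's worth escalating to support.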