Why is o3 no longer responding via the Batch API?

I have been submitting my prompts in batch mode using the `/v1/chat/completions` endpoint. It worked fine for some time, but it has now stopped responding: submitted batches get stuck in `in_progress` status and never move forward, even after several hours of waiting. I even tested a simple prompt:

```python
test_o3 = [{
    'custom_id': 'test_o3_full',
    'method': 'POST',
    'url': '/v1/chat/completions',
    'body': {
        'model': 'o3',
        'messages': [{'role': 'user', 'content': 'Say hi'}],
        'reasoning_effort': 'medium'
    }
}]
```

However, even this minimal request never returns an output.
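For completeness, here is roughly how I am creating and polling the batch. This is a simplified sketch assuming the official `openai` Python SDK; the file name `batch_input.jsonl` and the 60-second poll interval are just illustrative choices on my side:

```python
import json
import time

# The same single test request as above.
test_o3 = [{
    'custom_id': 'test_o3_full',
    'method': 'POST',
    'url': '/v1/chat/completions',
    'body': {
        'model': 'o3',
        'messages': [{'role': 'user', 'content': 'Say hi'}],
        'reasoning_effort': 'medium'
    }
}]

def write_batch_file(requests, path='batch_input.jsonl'):
    """Serialize the requests to the JSONL format the Batch API expects."""
    with open(path, 'w') as f:
        for req in requests:
            f.write(json.dumps(req) + '\n')
    return path

def submit_and_poll(path):
    """Upload the JSONL file, create the batch, and poll until a terminal status."""
    from openai import OpenAI
    client = OpenAI()
    batch_file = client.files.create(file=open(path, 'rb'), purpose='batch')
    batch = client.batches.create(
        input_file_id=batch_file.id,
        endpoint='/v1/chat/completions',
        completion_window='24h',
    )
    # With o3 this loop never exits: status stays 'in_progress'.
    while batch.status not in ('completed', 'failed', 'expired', 'cancelled'):
        time.sleep(60)
        batch = client.batches.retrieve(batch.id)
    return batch

if __name__ == '__main__':
    batch = submit_and_poll(write_batch_file(test_o3))
    print(batch.status)
```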

o3-mini, however, works perfectly fine with the same setup. Has something changed on OpenAI's end? Is it possible they have removed batch support for the full o3 model?