Batch API returns error "Project does not have access to model `gpt-5-mini-2025-08-07-batch`"

Hello everyone!
When trying to use the batch API with gpt-5-mini, I always receive an error:

{"id": "batch_req_68bec586756c8190b11eb8b98a9eb0f5", "custom_id": "test-2", "response": {"status_code": 403, "request_id": "95350754f38164ef02380670390eb213", "body": {"error": {"message": "Project `——` does not have access to model `gpt-5-mini-2025-08-07-batch`", "type": "invalid_request_error", "param": null, "code": "model_not_found"}}}, "error": null}


Welcome, @Max_K. Sorry to hear you’re having problems!

Project `——` does not have access to model `

That makes it sound like it’s a permissions issue.

Maybe try minting a new key and making sure it has all the relevant permissions?

Good luck, and let us know how it goes!

Thanks for your input. I started with that because I found the same advice here in chat: I created a new project and new API keys and double-checked that the models are allowed under my limits, but it doesn't work. The issue was partially resolved by putting gpt-5-mini-2025-08-07 instead of gpt-5-mini in batch requests via the Responses endpoint. But this still isn't a normal workflow.
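For reference, the workaround above amounts to pinning the dated snapshot name in each line of the batch input file. A minimal sketch of one such JSONL request line (the `custom_id` and `input` values are placeholders, not from the thread):

```python
import json

# One request line of a batch .jsonl input file targeting /v1/responses.
# The workaround: pin the dated snapshot "gpt-5-mini-2025-08-07" instead of
# the "gpt-5-mini" alias, which was resolving to a non-provisioned
# "-batch" model name.
request_line = {
    "custom_id": "test-2",          # placeholder id for matching results
    "method": "POST",
    "url": "/v1/responses",
    "body": {
        "model": "gpt-5-mini-2025-08-07",  # dated snapshot, not the alias
        "input": "Hello, world",            # placeholder input
    },
}

# Each request is serialized as a single line of the .jsonl file.
jsonl_line = json.dumps(request_line)
print(jsonl_line)
```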


Thanks for the additional details.

I’ll send this along, but no promises. :slight_smile:


There is a persistent issue that OpenAI has had with project provisioning, affecting the batch endpoint.

If you see "-batch" appended to the model name, the likely self-service fix is to create a brand-new project to run your batches in, with its own API key.

Reminder that server-side artifacts are unexpectedly and incompletely scoped to a project ID, so moving Responses calls that reference existing asset IDs over to a new project can be problematic if you use any of the offered services that collect, contain, and imprison your application data.


Hi everyone!

Jumping in here quickly to see if I can offer any help.

For anyone blocked, can you try using the snapshot model name first (e.g., gpt-5-mini-2025-08-07) when calling the batch/Responses endpoints? If that doesn't work, create a new project and a fresh API key and retry. If you still get an error, please write into support@openai.com with the failing request_id, the project id, and exact timestamps so we can escalate with logs and the batch team. If you have any issues reaching support, let us know here and I'll follow up.
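The submission side of the steps above can be sketched with the official openai Python SDK. This is a hypothetical helper, not code from the thread; the `files.create`/`batches.create` calls follow the SDK's documented batch workflow, and `/v1/responses` is the endpoint the thread is using:

```python
# Sketch: upload a JSONL batch file and start a batch job, assuming the
# openai v1 Python SDK surface (client.files.create / client.batches.create).
from typing import Any


def submit_batch(client: Any, jsonl_path: str) -> Any:
    """Upload a prepared .jsonl file and create a batch against /v1/responses."""
    # Upload the request file; batch inputs use purpose="batch".
    with open(jsonl_path, "rb") as f:
        batch_file = client.files.create(file=f, purpose="batch")

    # Create the batch job referencing the uploaded file.
    return client.batches.create(
        input_file_id=batch_file.id,
        endpoint="/v1/responses",
        completion_window="24h",
    )
```

If the batch still fails with `model_not_found`, the `request_id` values in the error file are what support needs for escalation.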
