I am facing two issues with OpenAI’s Batch API and file storage:
- Batch processing stuck: I have a batch job that has been stuck in the “in_progress” state for over four days. It neither completes nor expires.
- File storage limit exceeded: To free up space, I used `client.batches.list()` to retrieve all batch jobs and deleted them with `client.files.delete(file_id)`. However, when I try to upload a file and create a new batch, I still receive the following error:

  ```
  Error code: 400 - {'error': {'message': 'You have exceeded your file storage quota. Organizations are limited to 100 GB of files. Please delete old files or attempt with a smaller file size.', 'type': 'invalid_request_error', 'param': None, 'code': None}}
  ```
Despite deleting all existing batch jobs and their files, my storage usage still exceeds the 100 GB limit, so I cannot upload new files. Am I using the wrong method to delete the files and free up storage?
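For context, what I would expect to work: since `client.files.delete` only removes the specific files passed to it, any files not surfaced through the batch list (for example, batch output and error files, which I believe persist even after the batch itself is gone) may still count against the quota. Below is a minimal sketch of purging storage via the Files API directly instead of going through batches, assuming the official `openai` Python SDK (v1.x), where `client.files.list()` returns an iterable page of file objects with `id` and `bytes` attributes:

```python
def purge_files(client):
    """Delete every stored file and return the total bytes freed.

    Unlike deleting batch jobs, this targets the files themselves,
    which are what count against the 100 GB organization quota --
    including batch output/error files that remain after a batch
    is cancelled or deleted.
    """
    freed = 0
    for f in client.files.list():  # iterates over all stored files
        client.files.delete(f.id)
        freed += f.bytes or 0  # `bytes` may be None for some entries
    return freed
```

Usage would be something like `purge_files(OpenAI())` after `from openai import OpenAI`. If some files must be kept, the loop could filter on attributes such as `f.purpose` or `f.created_at` before deleting.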