"Usage is temporarily unavailable, please try again soon."

I'm getting the following response from the https://api.openai.com/v1/usage endpoint:

{
	"error": {
		"message": "Usage is temporarily unavailable, please try again soon.",
		"type": "invalid_request_error",
		"param": null,
		"code": "usage_unavailable"
	}
}

I am accessing the endpoint with the proper Authorization headers, like this: https://api.openai.com/v1/usage?date=2023-05-15
It says it is temporarily unavailable, yet at the same time I get a 400 Bad Request error from Insomnia.
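
For reference, here's a rough sketch of the request in TypeScript, equivalent to what Insomnia sends (reading the key from the environment and the Bearer header are just how I happen to do it):

// Rough sketch of the request; the key comes from the environment and the
// Bearer header is the standard OpenAI auth scheme.
const apiKey = process.env.OPENAI_API_KEY;

const res = await fetch("https://api.openai.com/v1/usage?date=2023-05-15", {
	headers: { Authorization: `Bearer ${apiKey}` },
});

console.log(res.status);        // currently 400
console.log(await res.json());  // the "usage_unavailable" error body above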

Anyone else having trouble?

Yes—you can keep an eye on the status here:

It's currently reporting “Elevated error rates across all services”, though it also says a fix has been implemented, so hopefully everything returns to normal shortly.

There are currently elevated error rates across all services, making the usage endpoint temporarily unavailable. It should be fixed soon :laughing:

If you want to check the current status you can do that here:

Status is “All Systems Operational” yet I still have the issue.

Thinking back, I might have had the same problem yesterday.

Can anyone else confirm that the endpoint is currently dead?

As a backup solution you can always check your usage here:

Thank you @N2U. I am aware of that, but I am trying to build a monitor on my end; it's for a project I am working on.

I can use https://api.openai.com/dashboard/billing/usage, for instance, with no errors, but https://api.openai.com/v1/usage is giving me problems.

Afaik you only need the date parameter for a successful request, correct?

It's not terribly important for now, but I'd like to know whether it's a problem with the endpoint or a problem on my end.
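
In case it's useful, here's roughly what my monitor does; it just retries when the endpoint reports usage_unavailable (the function name and the 30-second delay are my own choices, nothing official):

// Poll /v1/usage for one day and retry while the endpoint reports
// "usage_unavailable". Names and the retry delay are placeholders.
async function fetchUsage(date: string, apiKey: string, retries = 3): Promise<unknown> {
	for (let attempt = 0; attempt <= retries; attempt++) {
		const res = await fetch(`https://api.openai.com/v1/usage?date=${date}`, {
			headers: { Authorization: `Bearer ${apiKey}` },
		});
		const body = await res.json();
		if (res.ok) return body;
		if (body?.error?.code !== "usage_unavailable") {
			throw new Error(body?.error?.message ?? `HTTP ${res.status}`);
		}
		// "temporarily unavailable" -- wait 30 seconds and try again
		await new Promise((resolve) => setTimeout(resolve, 30_000));
	}
	throw new Error("Usage endpoint still unavailable after retries");
}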


The note on the bottom left was a nice addition, I think, to let us know…

noice

Has that ever been a valid endpoint? I’ve always just calculated usage on my end…


It is a valid endpoint. You can try to access it yourself: leave the user blank and use your OPENAI_API_KEY as the password. You will notice that it tells you something like this:

{
	"error": {
		"message": "Missing query parameter 'date'",
		"type": "invalid_request_error",
		"param": null,
		"code": null
	}
}

Endpoints that aren't valid don't tell you what parameters they need to return a valid response.
If you add the date parameter, like this: https://api.openai.com/v1/usage?date=2023-05-15, it will give you an error similar to the one in my original post.

When it is working, it will give you a response in this format:

{
  aggregation_timestamp: number;
  n_requests: number;
  operation: string;
  snapshot_id: string;
  n_context: number;
  n_context_tokens_total: number;
  n_generated: number;
  n_generated_tokens_total: number;
}
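
If it helps, this is roughly how I type and fetch it in TypeScript. I'm assuming the records come back wrapped in a data array (that's what I've seen; none of this is documented), and the basic-auth bit is just the blank-user / key-as-password trick from above:

// Typed sketch of the call. The `data` wrapper is an assumption based on the
// responses I've seen; the endpoint itself is undocumented.
interface UsageRecord {
	aggregation_timestamp: number;
	n_requests: number;
	operation: string;
	snapshot_id: string;
	n_context: number;
	n_context_tokens_total: number;
	n_generated: number;
	n_generated_tokens_total: number;
}

async function getUsage(date: string, apiKey: string): Promise<UsageRecord[]> {
	// Blank user, API key as the password (Buffer requires Node).
	const auth = Buffer.from(`:${apiKey}`).toString("base64");
	const res = await fetch(`https://api.openai.com/v1/usage?date=${date}`, {
		headers: { Authorization: `Basic ${auth}` },
	});
	const body = await res.json();
	if (!res.ok) throw new Error(body?.error?.message ?? `HTTP ${res.status}`);
	return body.data as UsageRecord[];
}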

Nice. Is it in the docs, or is it an undocumented feature? That might be why they halted access. I imagine the server holding the stats isn't the same as the one serving the completion/chat endpoints, but they're presumably trying to keep all their servers up with so many people…

Thanks for explaining!

No problem!

It is not well documented, but you will find mentions of it here on this blog if you search.

I also posted this issue (after @grudges - I missed it).

This doesn't seem to be related to yesterday's downtime, and as @grudges mentioned, it's not really documented. Not sure if it's going away or if it's just growing pains, but I have also been using it as part of a project. Hoping it doesn't go away.

I'm pretty sure the issue precedes yesterday's downtime.

I really hope it doesn’t go away or at least that we get another endpoint. Although I don’t see why this one would get discontinued but not the other “unofficial” ones…

Let’s just hope we hear something about it soon!

It's been back since yesterday, but sometimes requests take more than a minute to return a response…

Yeah, my project stores the values locally after the day is over. That way, history is super fast. Glad it’s back.
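
In case anyone wants to do the same, the local storage part is nothing fancy; something along these lines, with the file name and layout being entirely my own choices (UsageRecord is the shape @grudges posted above):

// Cache a finished day's records locally so history lookups skip the API.
// File name and JSON layout are my own choices, nothing official.
import { promises as fs } from "node:fs";

const CACHE_FILE = "usage-history.json";

async function cacheDay(date: string, records: UsageRecord[]): Promise<void> {
	let history: Record<string, UsageRecord[]> = {};
	try {
		history = JSON.parse(await fs.readFile(CACHE_FILE, "utf8"));
	} catch {
		// no cache file yet -- start fresh
	}
	history[date] = records;
	await fs.writeFile(CACHE_FILE, JSON.stringify(history, null, 2));
}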