openai.error.InvalidRequestError: Please set an "HTTP-Referer" header with the URL of your app

I have an application that allows people to make requests to ChatGPT by bringing in their own key. It's a pass-through method: we never store the key, we just pass it along to the API and provide an interface that is useful for authors. It was working for weeks, but around 3-4 days ago, after some activity, it started erroring out with the messages below in the logs:

openai.error.InvalidRequestError: Please set an "HTTP-Referer" header with the URL of your app

INFO:openai:error_code=400 error_message='Please set an "HTTP-Referer" header with the URL of your app' error_param=None error_type=None message='OpenAI API error received' stream_error=False

I am using the Python OpenAI library, and I upgraded to the latest version, which didn't fix it.

When I look at the documentation, I don't see examples in the Python library where headers are included, so I'm a bit at a loss as to what to do.
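For what it's worth, here is a minimal sketch of one way to attach custom headers with the pre-1.0 `openai` Python package (the generation that raises `openai.error.InvalidRequestError`): it lets you supply your own `requests.Session` via `openai.requestssession`, and any headers on that session ride along on every outbound call. The referer URL below is a placeholder, not your actual app URL:

```python
# Sketch, assuming the pre-1.0 openai Python library.
# That library accepts a custom requests.Session via openai.requestssession;
# the session's headers are sent with every API request it makes.
# "https://example.com/your-app" is a placeholder -- use your app's URL.
import requests

session = requests.Session()
session.headers.update({"HTTP-Referer": "https://example.com/your-app"})

# With the openai package installed, you would then wire it up like this:
#   import openai
#   openai.requestssession = session
#   openai.api_key = user_supplied_key  # the BYOK key, never stored
#   openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[...])

print(session.headers["HTTP-Referer"])
```

If you later migrate to the 1.x client, the equivalent is the `default_headers` argument on the `OpenAI(...)` constructor (or `extra_headers` on an individual request).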

Hi @amazingjoe
Does your BYOK app route its API calls through your servers?

The BYOK app (which is on a cloud server we manage) makes the API calls directly to OpenAI. An update: I have recently added an HTTP-Referer header to the outbound calls, so hopefully that resolves the issue (fingers crossed — I thought I had it solved by upgrading to the latest version of the Python OpenAI library, and that turned out not to be the solution).

But it's a mystery why these errors pop up after a certain amount of time. I wonder if some sort of flag is tripped when multiple keys come from the same IP, or perhaps it's the volume of calls from a single IP address when several users are using the app at the same time. I'm unsure of the best way to isolate why the calls go from working to eventually being denied for lack of a referrer.

That's probably why this error is occurring: the endpoint is seeing too many API keys coming from the same IP.

Pretty interesting that they're monitoring BYOK apps, and a good thing too.

Ideally, though, BYOK apps should be making calls from the client side. That way they won't encounter this error.

Also, do you store the API keys on your servers?

You could test your hypothesis about the number of API keys by sending many requests from a single API key during a low-activity period.
Admittedly, I would dislike paying for a bug hunt, but it could help validate the claim that this is caused by the number of API keys used from a single IP.
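A minimal sketch of that experiment, assuming the standard chat completions endpoint and using plain `requests` to sidestep library-version differences (the key and request count are placeholders): fire a burst of tiny requests with one key from one IP, and collect any error messages. If the HTTP-Referer error never shows up, that would point toward a many-keys-per-IP trigger rather than a plain per-IP volume limit.

```python
# Hypothetical probe: burst of minimal requests with a SINGLE key from one IP.
# Collects error messages from non-200 responses so you can see whether the
# "HTTP-Referer" error ever appears with only one key in play.
# api_key and n are placeholders; each request costs a small number of tokens.
import requests

def probe(api_key: str, n: int = 20):
    errors = []
    for _ in range(n):
        r = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {api_key}"},
            json={
                "model": "gpt-3.5-turbo",
                "messages": [{"role": "user", "content": "ping"}],
                "max_tokens": 1,
            },
            timeout=30,
        )
        if r.status_code != 200:
            # Surface the API's error message, if any, for inspection
            errors.append(r.json().get("error", {}).get("message"))
    return errors
```

Running it during a quiet window keeps other users' keys out of the picture, so any error you see is attributable to the single-key traffic alone.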

OpenAI doesn't exactly approve of the pass-through BYOK method you're describing.
From the docs:

Production requests must be routed through your own backend server where your API key can be securely loaded from an environment variable or key management service.

Link: OpenAI Platform

I suppose the suggested method should solve your issue if that’s the root cause.

That's for non-OpenAI users using an app that runs on the dev's API key.

Who is a non-OpenAI user in this case? The users bring their own keys and the dev offers the service for them to use.

And what are you, then: the hosting provider of a backend server for clients' production requests?

We are like a beefed-up version of the OpenAI Playground with some automations thrown in for authors. They bring their keys, they get access to a prompt cookbook with all sorts of author-specific prompts, and when they get their completions they can have them automatically saved to Notion. We provide a number of convenience functions and act as the middleman tying these systems together to help our club members be more efficient.

One only has to read as far as the second paragraph of the terms of use:

You may not make your access credentials or account available to others outside your organization.

So how do you reconcile that it's OK for a developer to use a service like Heroku to store an OpenAI key in an environment variable that is used to call OpenAI? Are "Salesforce/Heroku" and "Amazon/AWS" the "others" the key can't be shared with?

Perhaps "others" doesn't mean machines, when organizations set up safeguards so that the data is not accessible to staff; perhaps it means don't share with other people. Do you disagree?