I am making an iOS app that uses ChatGPT.
To protect my OpenAI API key, I don't want the app to send requests directly to OpenAI (because requests that contain the API key can be intercepted by bad actors).
Is there open-source proxy code that I can run on my own server (self-hosted) for this purpose? I would like streaming support as well, if possible.
I suppose if you only make the requests on the server side, with the OpenAI key stored as an environment variable, the key will be as secure as any other secret environment variable on your server. Building an OpenAI proxy that also enforces per-user usage and cost limits on that one key is another issue, and an entirely valid concern.
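To illustrate the point above, here is a minimal sketch of a server-side call where the key only ever comes from the environment, so it never ships in the iOS app bundle or appears in client traffic. The function name and the model string are illustrative assumptions, not from the thread:

```javascript
// Hypothetical server-side helper: the key is read from the environment,
// so clients never see it. Requires Node 18+ for the global fetch.
const API_KEY = process.env.OPENAI_API_KEY; // set on the server only

async function completeChat(messages) {
  // The model name here is just an example.
  const resp = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({ model: "gpt-4o-mini", messages }),
  });
  return resp.json();
}
```

The app then talks to your endpoint, authenticated however you authenticate your own users, and your server calls OpenAI on its behalf.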
We are building an LLM-specific gateway with a user auth flow that generates an API key for each user, then uses those keys to monitor and control costs and access: https://llmetrics.app
I have DMed you with more info.
It is true that one should not expose the OpenAI API key in the client. However, if you don't want to spin up a backend or cloud function, there are services that can use a JWT to safely proxy to the OpenAI API. ServerlessAI and Backmesh are two examples. Full disclosure: I built Backmesh and open-sourced it because I was tired of spinning up backends.
Hello,
I solved this problem with an Amazon EC2 virtual machine running a Node.js Express server, where I maintain several routes that access OpenAI services.