Kubernetes connecting to OpenAI

Hello OpenAI Community,

I’m developing a backend on Kubernetes that connects to OpenAI’s DaVinci engine endpoint. Do I need to use Flask as my framework if I’m using Python?

Kind regards,
Julian


First off, welcome to the OpenAI community @julian2001!

You can use any framework that allows you to call the API endpoints, including Flask or Django. I personally have a project going that uses Django as the webserver and Celery to run tasks that call the OpenAI API endpoints on behalf of each individual user.
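For reference, here is a rough sketch of that pattern. It assumes the `openai` Python package and a Redis broker for Celery; both are illustrative choices, not details of my actual project.

```python
# Sketch: a Celery worker task that calls the OpenAI completions endpoint.
# Assumes a Redis broker and the `openai` Python package (illustrative only).
import os

import openai
from celery import Celery

openai.api_key = os.environ["OPENAI_API_KEY"]

app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)


@app.task
def complete_prompt(prompt: str, user_id: str) -> str:
    """Run a completion against the DaVinci engine on behalf of one user."""
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=64,
        user=user_id,  # lets OpenAI attribute traffic to individual end users
    )
    return response["choices"][0]["text"]
```

From a Django view you would then queue the work with `complete_prompt.delay(prompt, request.user.username)` and fetch the result later, which keeps the long-running API call off the request thread.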

I had planned to make it a full-stack, business-grade OpenAI instance for everyone to use for free once I got it working, and maybe I still can soon, but I’ve unfortunately had to put the project on hold for now because personal matters have taken precedence.

Feel free to reach out to me if you (or anyone else here) plan to use Django for your project. I can certainly help!


Hi Nicholas,

Thank you, that’s kind of you; will do. I hope everything works out for you.

Kind regards,
Julian Reyes


Hi Nicholas,

Does OpenAI support the CoDel (Controlled Delay) queue management algorithm for networking?

Kind regards,
Julian Reyes


Good question!

If you’re asking about the specifics of how OpenAI has designed its infrastructure to handle its large volume of network traffic, you’ll get a more authoritative answer by emailing support@openai.com and asking there. I unfortunately don’t know the answer myself, and I don’t believe OpenAI staff or community members have discussed it yet.