eren
1
For anyone looking to deploy scalable and cost-efficient microservices using AWS Lambda for your OpenAI project, check out my repo:
You can also use Lambda concurrency to implement rate-limiting required by OpenAI!
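As a minimal sketch of that idea: reserved concurrency caps how many invocations run in parallel, which in turn bounds how many simultaneous OpenAI requests the function can make. The function name below is a placeholder.

```shell
# Cap the function at 5 concurrent executions, so at most 5
# OpenAI API calls can be in flight at once.
# "my-openai-fn" is a placeholder for your actual function name.
aws lambda put-function-concurrency \
  --function-name my-openai-fn \
  --reserved-concurrent-executions 5
```

Invocations beyond the cap are throttled by Lambda itself, so the rate limit is enforced before your code ever calls OpenAI.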
4 Likes
Hello, I modified your code to use Python 3.10, but it doesn’t work. Why don’t you use an official AWS Amazon Linux base image?
lpm0073
3
Nicely done @eren! I’ve also published something somewhat similar. See FullStackWithLawrence/aws-openai.
This creates a REST API implementing each of the 30 example applications from the official OpenAI API Documentation. Implemented as a serverless microservice using AWS API Gateway, Lambda and the OpenAI Python Library. Leverages OpenAI’s suite of AI models, including GPT-3.5, GPT-4, DALL·E, Whisper, Embeddings, and Moderation.
Cool, but why wouldn’t I just create my own lambda layer from the OpenAI SDK?
Is this for people that don’t know how to create their own layers? It only takes a few minutes! What am I missing?
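For reference, rolling your own layer really is only a few commands. The key detail is that the Lambda Python runtime looks for packages under a top-level `python/` directory inside the layer zip. Layer name and runtime below are illustrative.

```shell
# Build a Lambda layer containing the OpenAI SDK.
# Packages must sit under python/ inside the zip so the
# Lambda runtime picks them up on sys.path.
mkdir -p layer/python
pip install openai -t layer/python
cd layer && zip -r ../openai-layer.zip python && cd ..

# Publish the layer ("openai-sdk" is a placeholder name):
aws lambda publish-layer-version \
  --layer-name openai-sdk \
  --zip-file fileb://openai-layer.zip \
  --compatible-runtimes python3.10
```

Attach the published layer ARN to your function and `import openai` works in the handler.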
1 Like
Same question here, but I get “Service Unavailable” messages when I try to call e.g. openai.audio.transcriptions.create with the OpenAI Node.js API from an AWS Lambda.
Do I need some kind of OpenAI Lambda layer if I’m just using the OpenAI Node.js API?
Update: my issue could be Usage Limit related, investigating…
If it helps, when the Whisper API was first announced, I had trouble calling it. But you can call it directly without the SDK.
Here is a post where the Python version of that code lives, and this is what I run from Lambda.
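For the curl equivalent of skipping the SDK, the transcription endpoint accepts a plain multipart POST. `audio.mp3` is a placeholder file, and `OPENAI_API_KEY` must be set in the environment.

```shell
# Call the Whisper transcription endpoint directly; no SDK required.
curl https://api.openai.com/v1/audio/transcriptions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -F model=whisper-1 \
  -F file=@audio.mp3
```

The same request can be reproduced in any language with an HTTP client, which sidesteps SDK packaging entirely inside Lambda.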
1 Like
Hey, I see that the layer was built using OpenAI version 0.27.4. I tried updating it by doing `pip install openai==1.2.1`, but when I create the layer in Lambda and test the code I get the error:
errorMessage: "Unable to import module 'lambda_function': No module named 'openai'"
Anyone mind helping out?
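Not the original poster, but a common cause of that error is the zip layout: the package has to end up under a top-level `python/` directory in the layer. A second gotcha with `openai>=1.0` is that it depends on pydantic v2, which ships compiled wheels, so a layer built on macOS or Windows can fail at import time on Lambda. A rebuild along these lines (assuming an x86_64 Python 3.10 function) may help:

```shell
# Rebuild the layer with the package under python/ and with
# Linux (manylinux) wheels so openai 1.x's compiled dependency
# pydantic-core matches the Lambda runtime.
mkdir -p layer/python
pip install openai==1.2.1 -t layer/python \
  --platform manylinux2014_x86_64 --only-binary=:all: \
  --implementation cp --python-version 3.10
cd layer && zip -r ../openai-layer.zip python && cd ..
```

If your function runs on arm64 or a different Python version, adjust `--platform` and `--python-version` to match.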