For anyone looking to deploy scalable and cost-efficient microservices using AWS Lambda for your OpenAI project, check out my repo:
You can also use Lambda concurrency to implement rate-limiting required by OpenAI!
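For illustration, here is a minimal sketch of capping a function's parallel executions with reserved concurrency via boto3. The function name and limit are placeholders, and note that this bounds concurrent executions rather than requests per minute, so it is only a coarse way to stay under OpenAI's rate limits.

```python
# Rough sketch: cap parallel invocations of a Lambda function with reserved concurrency.
# "openai-proxy" and the limit of 5 are placeholder values.
import boto3

lambda_client = boto3.client("lambda")

# At most 5 copies of this function can run at once, which bounds how many
# OpenAI requests are in flight simultaneously (not requests per minute).
lambda_client.put_function_concurrency(
    FunctionName="openai-proxy",
    ReservedConcurrentExecutions=5,
)
```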
Hello, I modified your code to use Python 3.10 but it doesn't work. Why don't you use an official AWS Amazon Linux image?
Nicely done @eren! I've also published something somewhat similar. See FullStackWithLawrence/aws-openai.
This creates a REST API implementing each of the 30 example applications from the official OpenAI API documentation, built as a serverless microservice using AWS API Gateway, Lambda, and the OpenAI Python Library. It leverages OpenAI's suite of AI models, including GPT-3.5, GPT-4, DALL·E, Whisper, Embeddings, and Moderation.
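For readers unfamiliar with the pattern, here is a rough sketch of what an API Gateway + Lambda proxy to the OpenAI API can look like. The handler, field names, and default model below are illustrative and not taken from either repo, and they assume the openai>=1.x Python SDK is available in the deployment package or a layer.

```python
# Sketch of a Lambda handler behind an API Gateway proxy integration.
# Assumes the openai>=1.x SDK is packaged with the function or provided by a layer.
import json
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def lambda_handler(event, context):
    # API Gateway proxy integration delivers the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")

    completion = client.chat.completions.create(
        model=body.get("model", "gpt-3.5-turbo"),  # default model is just an example
        messages=body.get("messages", [{"role": "user", "content": "Hello"}]),
    )

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"reply": completion.choices[0].message.content}),
    }
```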
Cool, but why wouldn't I just create my own Lambda layer from the OpenAI SDK?
Is this for people that don't know how to create their own layers? It only takes a few minutes! What am I missing?
Same question here, but I get "Service Unavailable" messages when I try to call e.g. openai.audio.transcriptions.create with the OpenAI Node.js API from an AWS Lambda.
Do I need some kind of OpenAI Lambda layer if I'm just using the OpenAI Node.js API?
Update: my issue could be Usage Limit related, investigating…
If it helps, when the Whisper API was first announced, I had trouble calling it. But you can call it directly without the SDK.
Here is a post where the Python version of that code lives, and this is what I run from Lambda.
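For anyone who wants to skip the SDK entirely, something along these lines works as a plain-HTTP call to the transcription endpoint. It assumes the requests library is bundled with the function, and the file path and model name are just examples.

```python
# Sketch: call the Whisper transcription endpoint directly over HTTP, no OpenAI SDK.
# Assumes `requests` is bundled with the deployment package; the path is an example.
import os

import requests

def transcribe(path="/tmp/audio.mp3"):
    with open(path, "rb") as audio:
        resp = requests.post(
            "https://api.openai.com/v1/audio/transcriptions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            files={"file": audio},          # multipart upload of the audio file
            data={"model": "whisper-1"},
        )
    resp.raise_for_status()
    return resp.json()["text"]
```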
Hey, I see that the layer was built using OpenAI version 0.27.4. I tried updating it by doing pip install openai==1.2.1, but when I create the layer in Lambda and test the code I get the error:
errorMessage": āUnable to import module ālambda_functionā: No module named āopenaiāā
Anyone mind helping out?
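A common cause of that ImportError is the layer zip missing the top-level python/ folder that Lambda expects for Python layers. For reference, a build roughly like the sketch below (version pinned as in the question, run on a Python version matching the Lambda runtime) produces a correctly structured archive. If the SDK pulls in compiled dependencies, build on Amazon Linux or use pip's --platform/--only-binary options so the wheels match the Lambda environment.

```python
# Rough sketch of building a Lambda layer zip with the required "python/" prefix.
# The package version is taken from the question; adjust to match your runtime.
import pathlib
import shutil
import subprocess

build_dir = pathlib.Path("layer-build")
target = build_dir / "python"            # Lambda looks for Python packages under python/
target.mkdir(parents=True, exist_ok=True)

# Install the SDK into the layer directory with a pip matching the Lambda runtime.
subprocess.run(
    ["pip", "install", "openai==1.2.1", "--target", str(target)],
    check=True,
)

# Zip it so the archive root contains the python/ folder, then upload as a layer.
shutil.make_archive("openai-layer", "zip", root_dir=build_dir)
print("created openai-layer.zip; attach it to the function as a layer")
```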