Need help with Lambda Layer and Python 3.11

I am trying to get a simple Python script to work in AWS Lambda. I created a Lambda layer and am getting an error:

Response
{
"errorMessage": "Unable to import module 'lambda_function': cannot import name 'OpenAI' from 'openai' (/opt/python/lib/python3.11/site-packages/openai/__init__.py)",
"errorType": "Runtime.ImportModuleError",
"requestId": "832925f8-9ae8-4151-8dbc-fceac5ded282",
"stackTrace":
}

Function Logs
START RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282 Version: $LATEST
[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': cannot import name 'OpenAI' from 'openai' (/opt/python/lib/python3.11/site-packages/openai/__init__.py)
Traceback (most recent call last):
END RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282
REPORT RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282 Duration: 1.92 ms Billed Duration: 2 ms Memory Size: 128 MB Max Memory Used: 103 MB Init Duration: 1351.37 ms

Request ID
832925f8-9ae8-4151-8dbc-fceac5ded282

Here is the script I am testing:

import json
import os
from openai import OpenAI

api_key = "sk-xxxx"
client = OpenAI(api_key=api_key)

def lambda_handler(event, context):
    # Initialize OpenAI client
    client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

    # Parse the input data from the event
    try:
        body = json.loads(event['body'])
        user_text = body['text']
        user_action = body['action']
    except KeyError as e:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': f'Missing key: {str(e)}'})
        }
    except json.JSONDecodeError:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': 'Invalid JSON format'})
        }

    # Process the text
    response_text = process_text_with_openai(client, user_text, user_action)

    # Return the response
    return {
        'statusCode': 200,
        'body': json.dumps({'response': response_text})
    }

def process_text_with_openai(client, user_text, user_action):
    if user_action == 'spell_check':
        prompt = f"Correct the spelling and grammar: {user_text}"
    elif user_action == 'simplify_for_child':
        prompt = f"Explain to a 5-year-old: {user_text}"
    elif user_action == 'translate_to_spanish':
        prompt = f"Translate this to Spanish: {user_text}"
    else:
        return "Invalid action selected."

    response = client.chat.completions.create(
        messages=[{"role": "user", "content": prompt}],
        model="gpt-3.5-turbo"  # Change model as needed
    )
    return response.choices[0].message.content



user_input = "The quick brown fox jumps over the lazy dog."
selected_action = "translate_to_spanish"

response_text = process_text_with_openai(client, user_input, selected_action)
print(response_text)


Any help would be greatly appreciated.


Having the same issue; I'm using the Python 3.9 layer from GitHub:

erenyasarkurt/OpenAI-AWS-Lambda-Layer

Are you on the latest OpenAI library version?

Try this to update the package:

pip install --upgrade openai
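If you do upgrade, the newer package also has to end up inside the layer zip under the python/ prefix that Lambda mounts at /opt/python; upgrading locally alone won't change what the layer contains. A rough sketch, assuming a Linux build machine and the Python 3.11 runtime (directory and zip names are just examples):

# build the layer contents in a python/ folder (names here are examples)
mkdir -p layer/python
pip install --upgrade openai -t layer/python
# zip so that python/ sits at the root of the archive
cd layer && zip -r ../openai-layer.zip python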

I have built my own OpenAI layer that works in AWS Lambda with:
from openai import OpenAI

You can download it from here.


This worked for me! Could you please share how you created your Lambda layer zip file?

I kept getting "No module named 'pydantic_core._pydantic_core'" and tried to resolve it by installing in a Linux environment and copying the files over, which did not work.

I had the same error too; it was because I used Windows to build the layer. Once I switched to Linux it was flawless. The full pip command I use is:

pip install openai -t . --only-binary=:all: --upgrade --platform manylinux2014_x86_64
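Putting that command into a complete layer build, the whole sequence looks roughly like this (a sketch; the extra --python-version/--implementation flags and the zip name are assumptions, pinned here to the Python 3.11 x86_64 runtime):

# install prebuilt manylinux wheels into a python/ folder for the target runtime
mkdir -p python
pip install openai -t ./python --only-binary=:all: --upgrade \
    --platform manylinux2014_x86_64 --python-version 3.11 --implementation cp
# package with python/ at the root of the zip, then upload it as a layer
zip -r openai-layer.zip python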

Thank you m8, that helped a lot.

I just used an EC2 instance connected to a TeamSpeak server to get the zipped layer from the TeamSpeak file explorer, download it to my Windows machine, and include it in Terraform.

Layers over layers, going mad…

Hi all, I was having this same issue and was unable to resolve it by creating an EC2 instance and zipping the openai package into a Lambda layer.

In case anyone is in a similar spot, I was able to resolve it by keeping the openai layer and adding the AWS-provided "AWSLambdaPowertoolsPythonV2" layer. This got it to work perfectly.

Runtime: Python 3.12 (x86_64)
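If you prefer doing that from the CLI, attaching both layers to the function looks roughly like this (the function name and layer ARNs below are placeholders):

aws lambda update-function-configuration \
    --function-name my-function \
    --layers <openai-layer-arn> <powertools-layer-arn>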

To avoid issues, make sure your python and pip versions match, and that the Python version on the machine where you build the zip is the same as the Lambda runtime's Python version.
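One simple way to guarantee that is to invoke pip through the exact interpreter version you are targeting, for example (assuming the 3.11 runtime):

# run pip via the interpreter that matches the Lambda runtime version
python3.11 -m pip install openai -t ./python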

Hi all,

I've created a ready-to-use AWS Lambda layer for the latest version of openai. You can find its .zip in my GitHub repo. I've also shared steps to create a custom Lambda layer for any openai version you need.

GitHub username: syedfaiqueali
Repo name: aws-lambda-layer-openai
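Once you have a layer zip (from the repo above or your own build), publishing it with the AWS CLI looks roughly like this (layer name, zip path, and runtimes are just examples):

aws lambda publish-layer-version \
    --layer-name openai-layer \
    --zip-file fileb://openai-layer.zip \
    --compatible-runtimes python3.11 python3.12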

Thanks!
