Need help with Lambda Layer and Python 3.11

I am trying to get a simple Python script to work in AWS Lambda. I created a Lambda layer and am getting this error:

"errorMessage": "Unable to import module 'lambda_function': cannot import name 'OpenAI' from 'openai' (/opt/python/lib/python3.11/site-packages/openai/",
"errorType": "Runtime.ImportModuleError",
"requestId": "832925f8-9ae8-4151-8dbc-fceac5ded282",

Function Logs
START RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282 Version: $LATEST
[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': cannot import name 'OpenAI' from 'openai' (/opt/python/lib/python3.11/site-packages/openai/
Traceback (most recent call last):
END RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282
REPORT RequestId: 832925f8-9ae8-4151-8dbc-fceac5ded282 Duration: 1.92 ms Billed Duration: 2 ms Memory Size: 128 MB Max Memory Used: 103 MB Init Duration: 1351.37 ms


Here is the script I am testing:

import json
import os
from openai import OpenAI

api_key = "sk-xxxx"
client = OpenAI(api_key=api_key)

def lambda_handler(event, context):
    # Initialize the OpenAI client from an environment variable
    client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

    # Parse the input data from the event
    try:
        body = json.loads(event['body'])
        user_text = body['text']
        user_action = body['action']
    except KeyError as e:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': f'Missing key: {str(e)}'})
        }
    except json.JSONDecodeError:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': 'Invalid JSON format'})
        }

    # Process the text
    response_text = process_text_with_openai(client, user_text, user_action)

    # Return the response
    return {
        'statusCode': 200,
        'body': json.dumps({'response': response_text})
    }

def process_text_with_openai(client, user_text, user_action):
    if user_action == 'spell_check':
        prompt = f"Correct the spelling and grammar: {user_text}"
    elif user_action == 'simplify_for_child':
        prompt = f"Explain to a 5-year-old: {user_text}"
    elif user_action == 'translate_to_spanish':
        prompt = f"Translate this to Spanish: {user_text}"
    else:
        return "Invalid action selected."

    response = client.chat.completions.create(
        messages=[{"role": "user", "content": prompt}],
        model="gpt-3.5-turbo"  # Change model as needed
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Guarded so this local test does not run at import time on Lambda
    user_input = "The quick brown fox jumps over the lazy dog."
    selected_action = "translate_to_spanish"
    response_text = process_text_with_openai(client, user_input, selected_action)
    print(response_text)
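For what it's worth, when testing from the Lambda console the event needs an API Gateway-style shape, with `body` as a JSON string rather than a nested object; a minimal sketch of such a test event (the values are just examples):

```python
import json

# Hypothetical test event: note that "body" is a JSON *string*,
# not a nested object, which is what json.loads in the handler expects.
event = {
    "body": json.dumps({
        "text": "The quick brown fox jumps over the lazy dog.",
        "action": "translate_to_spanish",
    })
}

# The handler's parsing step recovers the fields like this:
body = json.loads(event["body"])
print(body["action"])  # translate_to_spanish
```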

Any help would be greatly appreciated.


Having the same issue; I'm using the layer for 3.9 from GitHub.


Are you on the latest OpenAI library version?

Try this to update the packages.

pip install --upgrade openai
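If you are not sure which copy of the library your function is actually importing, a throwaway diagnostic handler (a hypothetical sketch, not from the original post) can report the resolved module path and version; a stale copy baked into a layer under /opt/python shows up immediately:

```python
import importlib
import json

def lambda_handler(event, context):
    # Report which openai module is resolved and its version, so a
    # stale copy baked into a layer (/opt/python/...) can be spotted.
    try:
        openai = importlib.import_module("openai")
        info = {
            "path": getattr(openai, "__file__", "unknown"),
            "version": getattr(openai, "__version__", "unknown"),
        }
    except ImportError as exc:
        info = {"import_error": str(exc)}
    return {"statusCode": 200, "body": json.dumps(info)}
```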

I have built my own OpenAI layer which works in AWS Lambda with:

from openai import OpenAI

you can download it from here.


This worked for me! Could you please share how you created your Lambda layer zip file?

I kept getting "No module named 'pydantic_core._pydantic_core'" and tried resolving it by installing in a Linux environment and copying the files over, which did not work.

I had the same error as well; it was due to the fact that I used Windows to build the layer. Once I switched to Linux it was flawless. The full pip command I use is:

pip install openai -t . --only-binary=:all: --upgrade --platform manylinux2014_x86_64
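Putting it together, the whole layer build can be scripted; below is a sketch (directory and zip names are arbitrary, and the pip invocation is passed in as a parameter so the packaging logic can be exercised without network access) that installs the manylinux wheels into the `python/` layout Lambda layers require and zips the result:

```python
import shutil
import subprocess
import sys
from pathlib import Path

def build_layer(out_dir="layer_build", zip_name="openai_layer",
                run=subprocess.check_call):
    # Python Lambda layers unpack to /opt, and the runtime adds
    # /opt/python to sys.path, so packages must live under a
    # top-level "python/" directory inside the zip.
    target = Path(out_dir) / "python"
    target.mkdir(parents=True, exist_ok=True)
    run([
        sys.executable, "-m", "pip", "install", "openai",
        "-t", str(target),
        "--only-binary=:all:", "--upgrade",
        "--platform", "manylinux2014_x86_64",
    ])
    # Archive out_dir so "python/" sits at the zip root
    return shutil.make_archive(zip_name, "zip", out_dir)
```

Upload the resulting zip as a layer and attach it to the function.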

Thank you m8, that helped a lot.

I just used an EC2 instance connected to a TeamSpeak server to get the zipped layer from the TeamSpeak file explorer, downloaded it on my Windows machine, and included it in Terraform.

layers over layers, going mad…

Hi all, I was having this same issue and was unable to resolve it by creating an EC2 instance and zipping the openai package into a Lambda layer.

In case anyone is in a similar spot: I was able to resolve it by keeping the openai layer and adding the AWS-provided "AWSLambdaPowertoolsPythonV2" layer. This got it to work perfectly.

Python 3.12