After updating the openai Python module, I found that my embedding code is no longer working. So I went to the documentation here to verify proper syntax with this update and found the following example:
res = client.embeddings.create(input = [text], model=model)
When I try this, however, I get the following error:
TypeError: 'CreateEmbeddingResponse' object is not subscriptable
which I can get around by accessing the embedding using dot notation:
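For anyone comparing, here is the failing vs. working access pattern, sketched against a minimal stand-in object (the real res is the CreateEmbeddingResponse returned above; the stand-in classes here are hypothetical and just mimic its shape):

```python
from dataclasses import dataclass

# Hypothetical stand-ins mimicking the shape of CreateEmbeddingResponse;
# the real object is a Pydantic model, not a dict.
@dataclass
class FakeEmbedding:
    embedding: list

@dataclass
class FakeResponse:
    data: list

res = FakeResponse(data=[FakeEmbedding(embedding=[0.1, 0.2, 0.3])])

# Old dict-style access raises TypeError on the new response objects:
try:
    vec = res["data"][0]["embedding"]
except TypeError:
    pass  # 'FakeResponse' object is not subscriptable

# Dot notation works:
vec = res.data[0].embedding
```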
Is this a glitch with the documentation and/or the new openai module, or is something funky with my environment?
Been foolin’ with this crap all day. Not just you!
You might start at a well-disguised migration guide:
Then read about the changes from prior dictionary-like openai objects:
Nested request parameters are TypedDicts. Responses are Pydantic models, which provide helper methods for things like serializing back into JSON (Pydantic v1, v2). To get a dictionary, call model.model_dump().
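A quick sketch of what that looks like, using a hypothetical Pydantic model standing in for one of the library's response objects (assumes Pydantic v2, where model_dump() / model_dump_json() replaced the v1 .dict() / .json() helpers):

```python
from pydantic import BaseModel

# Hypothetical model standing in for an openai response object;
# the library's response classes are Pydantic models with the same helpers.
class Usage(BaseModel):
    prompt_tokens: int
    total_tokens: int

u = Usage(prompt_tokens=8, total_tokens=8)

d = u.model_dump()       # plain dict, subscriptable again
j = u.model_dump_json()  # JSON string
```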
Typed requests and responses provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set python.analysis.typeCheckingMode to "basic".
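For example, in .vscode/settings.json (this is the Pylance/Pyright setting; adjust if your setup differs):

```json
{
  "python.analysis.typeCheckingMode": "basic"
}
```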
List methods in the OpenAI API are paginated.
This library provides auto-paginating iterators with each list response, so you do not have to request successive pages manually:
from openai import OpenAI

client = OpenAI()

all_jobs = []
# Automatically fetches more pages as needed.
for job in client.fine_tuning.jobs.list():
    # Do something with job here
    all_jobs.append(job)
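As a rough illustration of what the auto-paginating iterator does under the hood, here is a stand-in sketch (the types and fetch_page function below are invented for illustration; the library's real cursor pages handle this for you):

```python
from typing import Iterator, List, Optional

# Hypothetical page object for a cursor-paginated list endpoint.
class FakePage:
    def __init__(self, data: List[str], next_cursor: Optional[str]):
        self.data = data
        self.next_cursor = next_cursor

def fetch_page(cursor: Optional[str]) -> FakePage:
    # Pretend the server holds 5 jobs, served 2 per page.
    items = [f"job-{i}" for i in range(5)]
    start = int(cursor) if cursor else 0
    chunk = items[start:start + 2]
    nxt = str(start + 2) if start + 2 < len(items) else None
    return FakePage(chunk, nxt)

def iterate_all() -> Iterator[str]:
    # Follow the cursor until the server reports no further pages.
    cursor: Optional[str] = None
    while True:
        page = fetch_page(cursor)
        yield from page.data
        if page.next_cursor is None:
            break
        cursor = page.next_cursor

collected = list(iterate_all())
```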
The heading links above point to the README on the "next" branch of the Python library.