You can continue to use openai<=0.28.1 along with Python 3.8-3.10 for the feature set that was available before.
However, the documentation has been scorched from the Earth, in typical OpenAI fashion.
Here, a 0.28.1 chatbot from my “forum examples” folder takes an API key (better to read it from an OS environment variable), submits and creates a streaming chat response object, and gets the chunks out of the dictionary-like generator.
import openai
openai.api_key = "sk-12345"

system = [{"role": "system",
           "content": "You are chatbot who enjoys python programming."}]
user = [{"role": "user", "content": "brief introduction?"}]
chat = []

while not user[0]['content'] == "exit":
    response = openai.ChatCompletion.create(
        messages=system + chat[-20:] + user,
        model="gpt-3.5-turbo", top_p=0.5, stream=True)
    reply = ""
    for delta in response:
        if not delta['choices'][0]['finish_reason']:
            # first chunk's delta may lack 'content', so use .get()
            word = delta['choices'][0]['delta'].get('content', '')
            reply += word
            print(word, end="")
    chat += user + [{"role": "assistant", "content": reply}]
    user = [{"role": "user", "content": input("\nPrompt: ")}]
These days it could also use an openai.api_requestor.TIMEOUT_SECS = 30 setting because of unresponsive models, along with try/except retry error handling.
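A retry loop like the one just mentioned can be kept separate from the API code itself. Here is a minimal sketch of such a wrapper (the attempt count, backoff values, and exception choice are my own picks, not anything the library prescribes):

```python
import time

def with_retries(call, attempts=3, backoff=2.0, retry_on=(Exception,)):
    """Run call(); on failure, retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(backoff * (2 ** attempt))

# With the 0.28 client it would be used like:
# response = with_retries(lambda: openai.ChatCompletion.create(...))
```

In practice you would narrow retry_on to the transient API errors (timeouts, rate limits) rather than retrying every exception.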
However, it doesn’t take much to update to a v1.2.3+ client instance, use the correct method, and get the response out of the returned model object (which now reads the API key from an environment variable by default):
from openai import OpenAI                         #####
client = OpenAI()                                 #####

system = [{"role": "system",
           "content": """You are chatbot who enjoys computer programming."""}]
user = [{"role": "user", "content": "brief introduction?"}]
chat = []

while not user[0]['content'] == "exit":
    response = client.chat.completions.create(    #####
        messages=system + chat[-20:] + user,
        model="gpt-3.5-turbo", top_p=0.5, stream=True)
    reply = ""
    for delta in response:
        if not delta.choices[0].finish_reason:           #####
            word = delta.choices[0].delta.content or ""  #####
            reply += word
            print(word, end="")
    chat += user + [{"role": "assistant", "content": reply}]
    user = [{"role": "user", "content": input("\nPrompt: ")}]
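One difference worth noting: the v1 client streams model objects rather than dicts, hence the attribute access, and delta.content can be None (for example on the final chunk), hence the or "". A self-contained sketch of that accumulation logic, using SimpleNamespace stand-ins for the real chunk objects (my own simplification, so it runs without an API call):

```python
from types import SimpleNamespace

def make_chunk(content, finish_reason=None):
    """Stand-in for a v1 streaming chunk: attribute access, like the real objects."""
    choice = SimpleNamespace(delta=SimpleNamespace(content=content),
                             finish_reason=finish_reason)
    return SimpleNamespace(choices=[choice])

def accumulate(stream):
    """Collect streamed delta text the same way the loop above does."""
    reply = ""
    for chunk in stream:
        if not chunk.choices[0].finish_reason:
            reply += chunk.choices[0].delta.content or ""
    return reply

fake_stream = [make_chunk("Hello"), make_chunk(", world"), make_chunk(None, "stop")]
print(accumulate(fake_stream))  # Hello, world
```

The final chunk carries a finish_reason and no content, which is exactly the case the if test and the or "" guard against.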