How to pass prompt to the chat.completions.create

Hi,
just updated the OpenAI Python library to 1.8.0 and tried to run the following code:

client = OpenAI(api_key="xxx")
response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        prompt='Be short and precise',
        messages=messages,
        temperature=0,
        max_tokens=1000
    )

I get this exception: "create() got an unexpected keyword argument 'prompt'". After looking at the code, I see there is no "prompt" argument anymore. The documentation on GitHub doesn't mention it.
What am I missing?
Thanks


Check out the model endpoint compatibility page…

Some models use Completion (Legacy) and some use the newer Chat Completion (ChatML)…

You can drop in the gpt-3.5-turbo-instruct model and it will work like the old Completions (Legacy) models (i.e. a prompt instead of a messages object…), OR you can update your code to use the Chat Completions method.
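If it helps, here's a rough sketch of the two calling conventions side by side. The function names and the `client` parameter are just for illustration (normally `client` would be `OpenAI()`); the point is which argument each endpoint accepts:

```python
# Sketch of the two calling conventions, assuming openai-python >= 1.0.
# `client` is passed in as a parameter here purely for illustration.

def legacy_completion(client, prompt_text):
    # Legacy Completions endpoint: takes a plain `prompt` string.
    # gpt-3.5-turbo-instruct is the drop-in model for this style.
    return client.completions.create(
        model="gpt-3.5-turbo-instruct",
        prompt=prompt_text,
        temperature=0,
        max_tokens=1000,
    )

def chat_completion(client, messages):
    # Chat Completions endpoint: there is no `prompt` argument;
    # everything (including the old "prompt") goes into `messages`.
    return client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
        temperature=0,
        max_tokens=1000,
    )
```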

Hope this helps clear it up a bit.

Thanks. The problem is that I need to use both, messages and a prompt, not only one of them.
After searching, I gather the prompt is passed as a message too, not from the assistant but from the system.
The documentation doesn't say so explicitly, but it looks like this:

from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ]
)

print(completion.choices[0].message)

From here: https://platform.openai.com/docs/api-reference/chat/create
Not sure why they didn’t provide at least a quick note on it.

You would send the "prompt" here as a user message. The system message is more for general rules, etc.
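In other words, the old "prompt" just becomes one more entry in the messages list. Here's a minimal sketch of assembling it; `build_messages` and its arguments are illustrative names, not part of the library:

```python
def build_messages(client_prompt, history, user_text):
    """Fold a client-supplied prompt into a Chat Completions messages list.

    The prompt goes in as the system message; prior turns and the
    new user message follow, so the model sees the full context.
    """
    messages = [{"role": "system", "content": client_prompt}]
    messages.extend(history)  # earlier user/assistant turns, oldest first
    messages.append({"role": "user", "content": user_text})
    return messages

msgs = build_messages(
    "Be short and precise.",
    [{"role": "user", "content": "Hi"},
     {"role": "assistant", "content": "Hello! How can I help?"}],
    "What are your opening hours?",
)
```

The resulting `msgs` list is what you pass as `messages=` to `client.chat.completions.create(...)`.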

Is there a reason you need both?

The gpt-3.5-turbo-instruct model uses the old Completions endpoint, so if you dropped that in, your old code would work…

Yes, the reason is that I built a chatbot that can be integrated into Crisp chat. My clients write and pass their own prompts, and they all want the chatbot to know the whole message history, know the context, and keep the conversation going.

Yeah, if it’s a chatbot, you’ll definitely want to change your code to use ChatML (Chat Completion endpoint)…

Can you handle the code from this point knowing the difference? If you get stuck, post up some code, and we can try to get you sorted.
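For the "keep the conversation" part: a common pattern is to hold a running messages list and append each user/assistant exchange to it, so every new request carries the full history. A minimal sketch (the helper name `add_turn` is just illustrative):

```python
# Running history for one conversation: the client's prompt is the
# system message; each turn appends a user and an assistant entry.
history = [{"role": "system", "content": "Be short and precise."}]

def add_turn(history, user_text, assistant_text):
    # Record one user/assistant exchange in the running history.
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})
    return history

# In a real bot, assistant_text would come from the API response
# (completion.choices[0].message.content); hard-coded here.
add_turn(history, "Hello!", "Hi! How can I help?")
add_turn(history, "Do you ship abroad?", "Yes, to most countries.")
```

On the next request you'd pass `messages=history + [new_user_message]`, so the model sees the whole thread. (For long conversations you'd eventually need to trim old turns to stay within the context window.)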

I’m sorry I don’t follow you. Isn’t my code already using the chat completion endpoint?


Ack, sorry. I was thinking of another thread…

Yes, you had the right Chat Completions code… except for trying to send "prompt" instead of adding the user's prompt as a user message…