ChatGPT API Megathread - ASK HERE

I figured it’d make sense to make a megathread as the API has been released and many people (including myself) will have questions/comments.

My current question: I'm trying to incorporate Q&A using back-end logic, but I can't get it to work.

messages=[
    {"role": "system", "content": "Today is March 1, 2023. When the user asks a question that you don't know, condense & format the question into a query, prefixed with: QUERY|"},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hi! How can I help?"},
    {"role": "user", "content": "Hi. My question is: When was the ChatGPT API released?"},
    {"role": "assistant", "content": "QUERY|Can you provide information about the release date of ChatGPT API?"},
    {"role": "system", "content": "ChatGPT API was released March 1, 2023"}
]

RESPONSE

('assistant', "I'm sorry, but I'm an AI language model and I cannot access information beyond the present time (2021). As ChatGPT API does not currently exist, I am unable to provide information about its release date. Is there anything else I can help you with?")

Has anybody found success, or does anyone know how this is supposed to be done?

I was able to solve this using the following:

messages=[
  {"role": "system", "content": "Today is March 1, 2023. When the user asks a question that you don't know, condense & format the question into a query, prefixed with: QUERY|"},
  {"role": "user", "content": "Hi"},
  {"role": "assistant", "content": "Hi! How can I help?"},
  {"role": "user", "content": "Hi. My question is: When was the ChatGPT API released?"},
  {"role": "assistant", "content": "QUERY|Can you provide information about the release date of ChatGPT API?"},
  {"role": "system", "content": "ChatGPT API was released March 1, 2023. You may answer the question using this information"}
]

More specifically, I added "You may answer the question using this information" to the end of the final system message.
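In case it helps anyone else, here's a rough sketch of the full back-end loop I mean (assuming the `openai` Python package's `openai.ChatCompletion` interface with `openai.api_key` already set; `lookup_answer` is just a hypothetical stand-in for whatever search or database call you run):

```python
import openai

def lookup_answer(query: str) -> str:
    # Hypothetical placeholder: swap in your own search / database lookup.
    return "ChatGPT API was released March 1, 2023"

def chat(messages):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    return response["choices"][0]["message"]["content"]

messages = [
    {"role": "system", "content": "Today is March 1, 2023. When the user asks a question that "
                                  "you don't know, condense & format the question into a query, "
                                  "prefixed with: QUERY|"},
    {"role": "user", "content": "Hi. My question is: When was the ChatGPT API released?"},
]

reply = chat(messages)
if reply.startswith("QUERY|"):
    # The model signalled it doesn't know the answer, so resolve the condensed
    # query with back-end logic, feed the result back in as a system message
    # (with the "You may answer..." hint), and ask again.
    answer = lookup_answer(reply.split("|", 1)[1])
    messages.append({"role": "assistant", "content": reply})
    messages.append({"role": "system",
                     "content": answer + ". You may answer the question using this information"})
    reply = chat(messages)

print(reply)
```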

{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
}

I am using this body (the example from the documentation) to call the ChatGPT API, but it's returning:

{
    "error": {
        "message": "you must provide a model parameter",
        "type": "invalid_request_error",
        "param": null,
        "code": null
    }
}

How is this possible?

Are you using the OpenAI Python module? The endpoint is also slightly different.

openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0301",
    messages=[{"role": "user", "content": "Hello!"}],
)

That's what I'm using, but your model should also work.
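For completeness, a fully self-contained version would look roughly like this (assuming the key is read from an `OPENAI_API_KEY` environment variable; adjust to however you store yours):

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0301",
    messages=[{"role": "user", "content": "Hello!"}],
)

# The assistant's reply sits under choices -> message -> content.
print(response["choices"][0]["message"]["content"])
```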

I am trying to call it directly from my web application (Bubble.io, specifically from their API Connector plugin), so perhaps there's something wrong with that.

I'd imagine their endpoints haven't been updated yet; it's a different URL.


Probably. The new endpoint is `https://api.openai.com/v1/chat/completions`.
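If you want to sanity-check it outside of Bubble, that "you must provide a model parameter" error usually means the JSON body isn't reaching the endpoint intact (for example, a missing `Content-Type: application/json` header, or the body not being sent as raw JSON). Here's a rough sketch of the raw call with Python's `requests` library, assuming the key lives in an `OPENAI_API_KEY` environment variable:

```python
import os
import requests

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
    },
    # Passing the payload via json= serializes it as JSON in the request body.
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json())
```

If this works but the API Connector call doesn't, compare the headers and raw body that the connector actually sends.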


Hey Ronald, closing this thread so that we can answer each question in its own topic. It will be more sustainable that way!
