Unrecognized request argument supplied: messages

I'm playing around with “gpt-3.5-turbo”, but I'm getting the message “Unrecognized request argument supplied: messages” when trying to fetch a response. I can't really see what I'm doing wrong. Has anyone experienced this issue?

This is my fetch:

const response = await fetch('https://api.openai.com/v1/completions', {
        headers: {
            'Content-Type': 'application/json',
            Authorization: 'Bearer ' + API_KEY,
        },
        method: 'POST',
        body: JSON.stringify({
            model: 'gpt-3.5-turbo',
            messages: [
                {
                    role: 'system',
                    content: 'You are a very funny comedian',
                },
                {
                    role: 'user',
                    content: 'Tell me a joke',
                },
            ],
        }),
    })

Thanks in advance!


The above URI is the wrong endpoint.

Please refer to the OpenAI API chat method in the API docs for the correct method.

HTH

:slight_smile:
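For reference, here is roughly what the corrected fetch looks like: it is your original snippet with only the endpoint swapped to the chat one (API_KEY is assumed to hold your secret key, as in your code):

// Same request as above, but pointed at the chat completions endpoint.
// API_KEY is assumed to hold your secret key, as in the original snippet.
const response = await fetch('https://api.openai.com/v1/chat/completions', {
    headers: {
        'Content-Type': 'application/json',
        Authorization: 'Bearer ' + API_KEY,
    },
    method: 'POST',
    body: JSON.stringify({
        model: 'gpt-3.5-turbo',
        messages: [
            { role: 'system', content: 'You are a very funny comedian' },
            { role: 'user', content: 'Tell me a joke' },
        ],
    }),
})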


Ohh, that was a really silly oversight haha. Thanks a bunch!


haha

There is a lot of that going on here, so don’t worry, you are not the only one making these kinds of copy-and-paste (completion → chat) errors.

Most people make errors when they copy-and-paste, or copy-and-modify. It’s normal!

Glad to help.

:slight_smile:


Can someone supply the proper URL? I am using the example in the API guide and getting the exact same issue when trying this in Postman. I looked for the API docs but had no luck, since they are what I was using to try this out.


I tried again and used this URL and it worked…

https://api.openai.com/v1/chat/completions

Not sure what was different from before but a few of us are hitting this issue so that is pretty weird…

Here is the proper JSON if you want to try this in the body of a Postman POST - got a funny joke back :slight_smile:

{
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "system",
            "content": "You are a very funny comedian"
        },
        {
            "role": "user",
            "content": "Tell me a joke"
        }
    ]
}
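If you are calling it from JavaScript rather than Postman, a minimal sketch for reading the joke out of the response (assuming the standard chat completions response shape, where the generated text lives in choices[0].message.content) would be:

// Parse the JSON body returned by /v1/chat/completions and pull out
// the assistant's reply from the first choice.
const data = await response.json()
const joke = data.choices[0].message.content
console.log(joke)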


I think the problem is that people, me included, read through the documentation a bit too fast and didn't notice that the URL is different from the one for the other models.

https://api.openai.com/v1/completions vs https://api.openai.com/v1/chat/completions


Right - I see what you are saying - Thx!


Laughing out loud right now, because I had this same issue. The funny part is that I was updating some existing code from “text-davinci-003” to “gpt-3.5-turbo” and visually checked the endpoint twice to make sure it was the same (which seemed surprising to me)… it's so easy to miss that /chat/ in the URL!

Thanks for posting this and thanks for all the help!

I had the same issue, but I was able to fix it by using the correct endpoint.

I realized that I was using ‘completions’ instead of ‘chat/completions’, which caused the error. :rofl::rofl::rofl: