Gpt-4-32k => The model: `gpt-4-32k` does not exist

I am using the gpt-4 API, but gpt-4-32k does not work even though it is mentioned in the documentation.
What am I doing wrong?

here is the code:

    import openai

    # messages_without_id: a list of chat messages, e.g. [{"role": "user", "content": "..."}]
    response = openai.ChatCompletion.create(
        model="gpt-4-32k",
        messages=messages_without_id,
        max_tokens=150,
        n=1,
        temperature=0.7,
        top_p=1,
        frequency_penalty=0.0,
        presence_penalty=0.0,
        stream=True
    )

When I use model="gpt-4" instead of model="gpt-4-32k", it works fine.

The larger-context 32k-token model `gpt-4-32k` isn't currently available. You can only use models that are returned by the /models endpoint.
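
A quick way to check is to list the models your key can access. Here is a minimal sketch using the same pre-1.0 `openai` Python library as the snippet above (assumes your API key is already configured, e.g. via the `OPENAI_API_KEY` environment variable):

    import openai

    # Prints the model IDs your API key can currently use.
    # gpt-4-32k will only work if it shows up in this list.
    models = openai.Model.list()
    for model in models["data"]:
        print(model["id"])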
