How do I use the new 16K window model?

Over the last few days, ChatGPT has become totally unusable for me. Its memory is nearly completely gone and it hallucinates very badly. The code it generates is wildly flawed, far more than usual, and it often completely fails to answer questions. For example, I asked it to refactor a specific prompt that generates content, and the response I got was:

> Final Fantasy XVI was released on June 22, 2023 on PS5 [1](https://www.techradar.com/news/final-fantasy-16-release-date-trailer)

It gets worse. When I asked it what this answer had to do with my query, it said:

> I apologize for the confusion. I misunderstood your previous question. Let's try again.
>
> You asked about the commonalities between Final Fantasy 7 and Final Fantasy 16.
>
> As of my knowledge cut-off in September 2021, Final Fantasy 16 had not yet been released, and details about the game were limited.

Except that my query was about refactoring a prompt to help me produce some output; it had nothing to do with Final Fantasy.

Then, when I said this:

> I didn’t ask about that.

It replied:

> I apologize for the misunderstanding. I see you asked about the commonalities between Fantasy Premier League and UEFA Euro 2020 Fantasy.
>
> As both are fantasy football games, they share the following common features:

(I stopped the reply generation at this point)
This is not what I paid for. Does anybody know what’s going on here?

In my test app I changed the model name to this:

```python
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",
    messages=[{"role": "user", "content": prompt}],
    max_tokens=1000,
    stream=True,
)
```
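Note that with `stream=True` the call returns an iterator of chunks rather than a single completed message, so you have to accumulate the content deltas yourself. A minimal sketch, assuming the pre-1.0 `openai` Python SDK where each chunk carries `choices[0].delta` (the helper name `collect_stream` is my own):

```python
def collect_stream(chunks):
    """Join the content deltas from a streamed ChatCompletion response.

    Each chunk is expected to look like the dict-style chunks yielded by
    the legacy (pre-1.0) openai SDK: {"choices": [{"delta": {...}}]}.
    The first chunk usually carries only a "role" and no "content".
    """
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # "content" is absent on role-only and final chunks, so default to ""
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

With the snippet above you would then call `collect_stream(response)` (or print each delta as it arrives instead of joining at the end).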

Ahh, my apologies: ChatGPT is not using the 16K model; that one is for API use only. You can try it out over at the OpenAI Platform playground (make sure the mode is set to chat) and choose the 16k model. You will need a payment method set up on your API account if your initial grant has been used up or expired.
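One way to confirm your API key can actually see the 16k model is to list the models available to your account. A small sketch: the helper below (a name I made up, `has_16k_model`) only filters a listing shaped like what `openai.Model.list()` returns in the legacy pre-1.0 SDK, i.e. `{"data": [{"id": ...}, ...]}`:

```python
def has_16k_model(model_listing):
    """Return True if gpt-3.5-turbo-16k appears in a Model.list()-style dict.

    `model_listing` is assumed to have the shape {"data": [{"id": "..."}, ...]},
    as returned by openai.Model.list() in the pre-1.0 openai SDK.
    """
    return any(m["id"] == "gpt-3.5-turbo-16k" for m in model_listing["data"])
```

With the legacy SDK you would call it as `has_16k_model(openai.Model.list())`; if it returns `False`, the model is not enabled for your key yet.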