It looks like GPT-4-32k is rolling out

Ooh. Not seeing it yet, but I’ll keep my eyes open. Thanks.

Weird, I’m not seeing the 8k version either?

Looks like I do have 8k

This model’s maximum context length is 8192 tokens. However, your messages resulted in 35061 tokens. Please reduce the length of the messages.
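That error means the conversation exceeded the model’s context window before the request was sent. A minimal sketch of one way to avoid it: drop the oldest messages until the total fits the budget. The ~4 characters/token figure below is a rough heuristic I’m assuming for English text, not the model’s actual tokenizer (a library like tiktoken gives exact counts):

```python
# Rough sketch: trim oldest messages until an estimated token count fits
# the context window. The chars/4 estimate is a heuristic assumption,
# not the real tokenizer.

def estimated_tokens(text: str) -> int:
    # Rule of thumb for English text: roughly 4 characters per token.
    return max(1, len(text) // 4)

def trim_to_context(messages: list[dict], max_tokens: int = 8192) -> list[dict]:
    """Drop the oldest messages until the estimated total fits max_tokens."""
    trimmed = list(messages)
    while trimmed and sum(estimated_tokens(m["content"]) for m in trimmed) > max_tokens:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed

msgs = [
    {"role": "user", "content": "x" * 100_000},  # far too large for 8k
    {"role": "user", "content": "short question"},
]
print(len(trim_to_context(msgs, 8192)))  # the oversized first message is dropped
```

In practice you’d want to keep any system message pinned rather than trimming it, and count the few extra tokens of per-message overhead the chat format adds.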
