"This model's maximum context length is 8193 tokens" Does not make sense

The errors seem to have abated. I understand that maintenance happens; I just wanted to make sure that's what it was, and not something I did.