"This model's maximum context length is 8193 tokens" Does not make sense

That is unusual. It may be a server-side issue while changes are being rolled out following the recent announcements.
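As a workaround while the limit is in flux, one option is to trim the prompt client-side before sending it. A minimal sketch, assuming a hypothetical `count_tokens` helper (a real client would use the model's actual tokenizer) and an assumed budget of 8192 tokens:

```python
# Sketch: keep the most recent messages within an assumed context budget.
# `count_tokens` is a crude stand-in, not the model's real tokenizer.

def count_tokens(text: str) -> int:
    # Rough approximation: one token per whitespace-separated word.
    return len(text.split())

def trim_to_budget(messages: list[str], max_tokens: int = 8192) -> list[str]:
    """Drop the oldest messages until the total fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):       # walk newest-first
        cost = count_tokens(msg)
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))          # restore chronological order
```

This keeps the newest turns of a conversation and silently drops the oldest ones once the budget is exceeded; a production client would count tokens with the provider's tokenizer instead of word counts.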