Request for Update on GPT-4-32k (or 16k?) Release

What surprises me is that GPT-4's context length is 8k while the more affordable GPT-3.5 is still capped at 4k. It's not as if GPT-4 will write more than 1,000 tokens in one reply without a jailbreak anyway, so why not at least bring that 8k context down to GPT-3.5 instead of leaving it at 4k?
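
To make the complaint concrete, here is a rough sketch of how those limits bite in practice, counting prompt tokens with tiktoken. The 4,096/8,192 limits and the 1,000-token reply reservation are my own illustrative assumptions, not anything official:

```python
# Rough illustration: does a prompt leave room for a reply within an
# assumed per-model context limit? Limits below are assumptions.
import tiktoken

CONTEXT_LIMITS = {"gpt-3.5-turbo": 4096, "gpt-4": 8192}  # assumed limits

def fits_in_context(model: str, prompt: str, reserved_for_reply: int = 1000) -> bool:
    """Return True if the prompt plus a reserved reply budget fits the model's context."""
    enc = tiktoken.encoding_for_model(model)
    prompt_tokens = len(enc.encode(prompt))
    return prompt_tokens + reserved_for_reply <= CONTEXT_LIMITS[model]

long_prompt = "some long document text " * 900  # roughly a few thousand tokens
for model in CONTEXT_LIMITS:
    print(model, fits_in_context(model, long_prompt))
```

With a prompt of a few thousand tokens, the 4k model already has no room left for a reply while the 8k one still does, which is exactly why the gap between the two tiers feels arbitrary.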