Longer context limits: can we expect them at some point in the future?

Hey there. According to the blog post, it is already possible to access dedicated Azure instances with longer context windows for the models, so the algorithms are ready for it (amazing work, to be honest). However, accessing these instances seems prohibitively expensive for independent developers (we're talking about spending 450M tokens per day or more).

In my view, there are some use cases that cannot really be addressed without these larger context windows. In my case it's not only working with huge sequences of text documents, but mainly working with codebases: formatting scripts using ChatML syntax is expensive in terms of token consumption, and a single script basically consumes my entire context window, making it impossible to have a conversation about it (or to include other relevant scripts in the conversation).
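To make the problem concrete, here is a rough back-of-envelope sketch of how much of a context window a single script can eat once it is wrapped in a ChatML-style chat message. The numbers are illustrative assumptions, not official figures: an exact count needs the model's tokenizer (e.g. the tiktoken library), so this uses a crude ~4-characters-per-token heuristic and an assumed per-message framing overhead.

```python
# Illustrative sketch only: real token counts require the model's tokenizer.
CHARS_PER_TOKEN = 4        # crude heuristic for English text and code
PER_MESSAGE_OVERHEAD = 4   # assumed ChatML framing tokens per message

def estimate_tokens(messages):
    """Very rough token estimate for a list of {"role", "content"} messages."""
    total = 3  # assumed priming tokens for the assistant's reply
    for msg in messages:
        total += PER_MESSAGE_OVERHEAD
        total += len(msg["role"] + msg["content"]) // CHARS_PER_TOKEN
    return total

# A modest ~13 KB script, repeated boilerplate standing in for real code.
script = "def f(x):\n    return x * 2\n" * 500

messages = [
    {"role": "system", "content": "You are a code reviewer."},
    {"role": "user", "content": "Please review this script:\n" + script},
]

context_window = 4096  # e.g. the original gpt-3.5-turbo window
used = estimate_tokens(messages)
print(f"~{used} tokens, about {used / context_window:.0%} of a "
      f"{context_window}-token window")
```

Even under these generous assumptions, one mid-sized script fills most of a 4K window before the conversation has started, leaving almost no room for the model's answer or for additional files.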

Therefore, the question is about future expectations: can we (independent developers) expect to get access to these larger context windows at some point in the near future? @logankilpatrick I know the answer may very likely be "no", but is there any chance you can tell us anything about this? Thanks a lot, guys. Keep up the amazing work that you're doing for all of us :slight_smile: