Going quadratic is by design. Everybody has a free research grant to run an AI that can call itself with few limitations, and to have the $20 "chat with my PDF" sessions you were all asking for. Or so one would think with Assistants. "Beta" should go back to "alpha" for some re-engineering.
At the same time, you want knowledge pulled from within a 5MB or 50MB file? All that text needs some kind of search indexing before any of it can be used to fill the model's context for answering anyway, and OpenAI is going to make that AI-powered for maximum effect.
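For a sense of what that "search indexing" step actually is, here's a rough sketch in Python. The chunk sizes, the hypothetical embedding step, and the cosine-similarity ranking are my assumptions about a generic retrieval pipeline, not OpenAI's documented internals:

```python
# Rough sketch of chunk-then-rank retrieval, not OpenAI's actual pipeline.
# Embedding the chunks (via whatever model you use) is assumed to happen elsewhere.
import numpy as np

def chunk(text: str, size: int = 1000, overlap: int = 200) -> list[str]:
    """Split a large document into overlapping chunks small enough to embed."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def top_k(query_vec: np.ndarray, chunk_vecs: np.ndarray, chunks: list[str], k: int = 5) -> list[str]:
    """Rank chunks by cosine similarity to the query vector and keep the best k."""
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
    )
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

# Only the top-k chunks get pasted into the prompt; the rest of the 50MB stays out.
```

The point being: whether you do it yourself or let Assistants do it for you, something has to decide which slivers of that file end up in the context window on every turn.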