Hi Lior, and welcome to the OpenAI community!
Yeah, these and many more suggestions across this community are really good. However… we need to be cognizant of OpenAI’s intentions - these are early experience apps designed to educate, stimulate, and help us envision new and advanced AI solutions. They are not intended to solve for (x).
I think it’s safe to say (without speaking for OpenAI) that these user-facing applications exist on a continuum with a tipping point - to the left is prototypical experimentation, and to the right are production solutions. The tipping point rests directly atop OpenAI’s APIs.
All four of your ideas seem perfect for custom solutions built using the APIs. I saw somewhere that #4 on your list already exists.
One approach is discussed here. Another, more technical, approach is to build your own chat interface and add this functionality yourself.
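To make that second approach concrete, here is a minimal sketch of what one such custom feature (a transcript export) might look like. It assumes the role/content message shape used by the Chat Completions API; the function name and the Markdown layout are hypothetical choices, not an official pattern.

```python
def transcript_to_markdown(messages):
    """Render a list of {role, content} chat messages as a Markdown transcript.

    Hypothetical helper - you would call this from your own chat
    interface after collecting the conversation history.
    """
    lines = []
    for msg in messages:
        # Label each turn with its speaker, e.g. "**User:**"
        lines.append(f"**{msg['role'].capitalize()}:**")
        lines.append("")
        lines.append(msg["content"])
        lines.append("")
    return "\n".join(lines)

# Example: a short conversation exported to Markdown
chat = [
    {"role": "user", "content": "Summarize our meeting notes."},
    {"role": "assistant", "content": "Here is a summary of the key points..."},
]
print(transcript_to_markdown(chat))
```

The same pattern extends to any export format (PDF, DOCX, CSV) by swapping out the rendering step - which is exactly the kind of customization the APIs leave in your hands.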
If you think this through, it’s like saying:
“That multi-billion-dollar investment you just made creating all these LLMs is not enough; you need to export everything imaginable in every imaginable format.”
I have the opposite sentiment - OpenAI should be wary of doing things like this.