It has taken me quite a few days to find the right place to raise suggestions.
One issue I run into a lot: when I get a code reply, it's formatted perfectly, but when the answer is cut off and I ask it to continue, it doesn't restart the code formatting. The result is that the second half of the code is unformatted and the regular answer text gets formatted as code (so we lose twice here).
I wish there were enhancements to the ChatGPT environment:
1 - Text search in my old chats
2 - Folders
3 - Ability to export as PDF / text
4 - It might be useful to allow side-by-side run comparison. Yes, GPT-4 seems to be more on point, but what if that's a bias, and 3.5 Turbo or other models actually give the same results? What if the main task could be done by the previous versions, with GPT-4 stepping in only for a final polish?
I hope we get the system prompt soon. There is a huge gain in asking the AI to stop telling us it's an AI, to stop apologizing, and to skip the pleasantries. It's more efficient.
(and then, there's great joy in telling it:
continue until the end.
If you have recommendations, list them at the end.
You can also add compliments now.)
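For reference, the Chat Completions API already offers something along these lines through a system message. A minimal sketch in Python, assuming the openai package (v1.x) and an `OPENAI_API_KEY` in the environment; the model name and the instruction wording are just example placeholders, not anything official:

```python
# Minimal sketch: a system message that trims the boilerplate.
# Assumes the openai Python package (v1.x) and OPENAI_API_KEY set in the environment.
# The model name and the instruction wording are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Answer directly. No apologies, no reminders that you are an AI, no pleasantries.",
        },
        {"role": "user", "content": "Explain what a race condition is."},
    ],
)
print(response.choices[0].message.content)
```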
Yeah, these and many more suggestions across this community are really good. However… we need to be cognizant of OpenAI’s intentions - these are early experience apps designed to educate, stimulate, and help us envision new and advanced AI solutions. They are not intended to solve for (x).
I think it's safe to say (without speaking for OpenAI) that these user-facing applications exist on a continuum with a tipping point - to the left is prototypical experimentation, and to the right are production solutions. The tipping point rests directly atop OpenAI's APIs.
All four of your ideas seem perfect for custom solutions built using these APIs. I saw somewhere that #4 on your list already exists.
One approach is discussed here. Another [more technical] approach is to build your own chat interface and add this functionality.
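If you go the build-your-own route, idea #4 reduces to a small loop over models. A minimal sketch, assuming the Python openai package (v1.x) and an `OPENAI_API_KEY` in the environment; the model names and the example prompt are placeholders:

```python
# Minimal sketch of idea #4: run the same prompt against two models and compare.
# Assumes the openai Python package (v1.x) and OPENAI_API_KEY set in the environment.
# Model names and the example prompt are placeholders.
from openai import OpenAI

client = OpenAI()

def ask(model: str, prompt: str) -> str:
    """Send a single user prompt to the given model and return its reply text."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

prompt = "Summarize the trade-offs between a monolith and microservices."
for model in ("gpt-3.5-turbo", "gpt-4"):
    print(f"=== {model} ===")
    print(ask(model, prompt))
    print()
```

From there, putting the two replies next to each other in a simple chat interface is mostly a UI exercise.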
If you think this through, it’s like saying:
“That multi-billion-dollar investment you just made creating all these LLMs is not enough; you need to export everything imaginable in every imaginable format.”
I have the opposite sentiment - OpenAI should fear doing stuff like this.
There isn't a lot to affect in ChatGPT while keeping the token count low.
It's not about lowering the individual cost but the cost across the entire platform, resulting in higher availability.