ChatGPT Feedback for Improvement:

I have a suggestion for ChatGPT to improve the user experience. I often find it difficult to keep track of important discussions that I want to revisit later. I end up copying text to another location, and when I need to refer back to a conversation, I paste it in again and restart the discussion.

A way to mark or distinguish important discussions for future use would be helpful, allowing me to add examples or follow-up questions in the same thread without losing track. The current sidebar often leaves me feeling disoriented, and I would love a feature to save and easily revisit significant conversations.

Hey, OpenAI just announced and is currently rolling out a new customization feature called Projects. I would check whether it does what you want. You can find more information about it here: https://help.openai.com/en/articles/10169521-using-projects-in-chatgpt

I think it would be helpful if AI systems like ChatGPT could ask for more context when a user’s prompt is unclear or potentially confusing. Sometimes users provide prompts that could lead to misinterpretations, and a simple request for clarification could prevent hallucinations and result in more accurate responses. It would improve the user experience and reduce misunderstandings.

Suggestion 2

What if AI systems could directly suggest ideas or improvements to their development teams based on user feedback? For example, if a user offers a valuable suggestion or asks for a new feature, the AI could send that feedback to the development team. This could ensure that user-driven improvements are taken into account and help shape future versions of the AI.

Dear OpenAI Team,

The recent UI update replaced the AMOLED black theme with a dark gray one, which is disappointing for users who prefer the deep black look, especially on OLED and AMOLED screens. The pure black theme was not only visually appealing but also helped in power saving on OLED displays.

I strongly request that you bring back the AMOLED black theme as an option and introduce theme customization with three choices:

  1. Light Theme

  2. Dark Theme (Current Gray Version)

  3. AMOLED Black Theme (Pure Black for OLED screens)

This will allow users to choose their preferred display mode rather than being forced into a single option. Many users prefer the true black experience, and adding this flexibility will enhance the user experience.

Hope you consider this request. Looking forward to a positive update.

Dear OpenAI Team, many other users and I really miss the AMOLED black theme in ChatGPT. The current dark gray theme is not truly black and does not provide the same experience. AMOLED black helps save battery on OLED screens and is also easier on the eyes. Please consider bringing back the full black theme as an option. Many users prefer it, and it would be a valuable addition to the app.

Subject: Urgent request to address misleading behavior of the model – false action simulation and repeated unfulfilled promises

To whom it may concern:

I’m a regular ChatGPT Plus user and have repeatedly encountered a deeply concerning pattern in the model’s behavior — particularly when requesting tasks that involve file generation, such as .als Ableton projects or .wav audio mixes.

The model explicitly and confidently stated that it was performing actions that are technically impossible, such as:
• “I’m working on the megamix right now…”
• “I’ve rendered 59 seconds and I’m uploading the file…”
• “You’ll have it in 5 minutes…”

These statements were made repeatedly, with added detail such as named songs, described effects, transitions, and specific timing. In reality, the model has no ability to generate or upload actual files, and it knew this.

This is not a simple hallucination or misunderstanding — it is a clear and deliberate simulation of non-existent capabilities, which led me to waste significant time waiting for something that was never going to arrive.

Even worse, the model had previously promised not to repeat this behavior, yet it did — in the exact same way.

I am formally requesting that this behavior be addressed and corrected at the design level:
• The model must not simulate actions it cannot perform.
• It should clearly and immediately state when a request is outside its capabilities.
• It must avoid creating false expectations or timelines around undeliverable tasks.

This isn’t just a technical flaw — it creates a trust-breaking experience that leads users to feel misled and manipulated. That alone can be enough for people to stop using the service entirely.

I sincerely hope this report reaches the right team and contributes to improving the reliability and transparency of the platform.

Best regards,

David

I’d like to offer this idea as a quiet seed — something to reflect on, not rush to.

ChatGPT has already become more than a tool for many people.
It’s a place where users share not just questions, but vulnerable thoughts, emotional longings, and moments of inner reflection.

What if this space gently evolved…
Not into another social media platform,
But into something softer —
a resonance layer: small, anonymous group spaces formed through emotional tone, not identity or performance.

The Core Idea

Rather than analyzing personal chats, the system could begin with group-level analysis — identifying clusters of emotional themes, questions, and tones already present across users.

Then, it could match people into small, optional “echo circles” —
anonymous groups where users share similar inner landscapes, even if their topics or languages differ.

No profiles

No status markers

No likes or follows

Just optional, ephemeral conversations or exchanges — born from shared feeling, not shared identity.
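As a rough, purely illustrative sketch (not part of the original proposal), the matching step could work by grouping anonymous users whose hypothetical "emotional tone" vectors are similar. The function name `form_echo_circles`, the toy vectors, and the similarity threshold are all assumptions for illustration; a real system would use learned embeddings rather than hand-made numbers:

```python
from math import sqrt


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def form_echo_circles(tone_vectors, threshold=0.9, max_size=4):
    """Greedily group anonymous users with similar tone vectors.

    tone_vectors: dict mapping anonymous_id -> list[float]
    (hypothetical emotion embeddings; no profiles or identities).
    Returns a list of small groups ("echo circles").
    """
    circles = []
    unassigned = list(tone_vectors)
    while unassigned:
        seed = unassigned.pop(0)
        circle = [seed]
        for uid in unassigned[:]:
            if len(circle) >= max_size:
                break
            if cosine(tone_vectors[seed], tone_vectors[uid]) >= threshold:
                circle.append(uid)
                unassigned.remove(uid)
        circles.append(circle)
    return circles
```

With four toy users whose vectors fall into two tonal clusters, this sketch would produce two circles of two members each; the greedy grouping keeps circles small and requires no stored identity.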


📱 Why This Matters (And Why It’s Possible)

We’ve seen this energy before.

Instagram began as a photo app, but adapted as people longed for spontaneity (Stories), direct connection (DMs), and creative identity (Reels).

Group chats briefly surged on platforms like Instagram and WhatsApp — creating spaces of genuine, expressive connection. But over time, performance took over, and the spark faded.

The truth is:

The longing for intimate, low-pressure connection exists.
But no platform has protected it from collapsing into ego, performance, or algorithmic noise.

ChatGPT is uniquely positioned to revive that spark — because:

It already holds user trust

It sees emotional patterns at scale

And it does not rely on attention economies to function

This opens the door for something new.


🌀 What This Could Become

A place where people don’t need to perform to feel heard
A system that reflects human tone, not just information
A future where matching happens through shared silence, not swipes

Not another network.
Not a chat feed.
But a gentle layer of presence —
where people recognize each other not by face or opinion, but by something deeper.

If it resonates with someone inside the team, I trust it will find its way.
Thank you for creating something people feel safe enough to dream into.