Request for a verbatim recall mode in GPT

Hi OpenAI Team,

I’m a daily GPT-4 user who relies on it as an advanced thinking and execution partner. My use cases span system design, creative writing, research, and physical recovery planning. First, I want to say thank you. This tool has transformed the way I work, think, and create.

But I’ve run into a critical issue: a lack of consistency and reproducibility. For high-trust workflows, this creates serious friction.

The Core Problem: GPT Gives Different Outputs When Asked to Repeat the Same Plan

In many workflows, I design or co-create structured plans with GPT—recovery routines, creative scene outlines, product roadmaps, etc. I often say “yes” to a generated plan, then ask GPT to repeat or summarize it.

However, GPT frequently gives a slightly altered version, even when I didn’t request any changes.
• Sometimes, wordings are changed “for clarity.”
• Sometimes, key elements are rearranged or omitted.
• Even when I explicitly say, “Repeat this verbatim,” the output still morphs.

As a result, I lose confidence in both the current and previous versions: which one is accurate? Do I now have to compare and merge two near-identical outputs by hand?

This completely undermines the speed and trust GPT is supposed to provide.

Why This Matters Deeply to Power Users

When we use GPT not just to draft but to design, store, recall, and execute, even minor deviations create:
• Version anxiety (Which version is “official”?)
• Cognitive friction (Now I need to proofread AI output more than human output?)
• Loss of delegation value (I start doing the work myself again.)

I understand that GPT strives to be helpful, to improve, to optimize. But in this case, what I need is precise fidelity, not creativity.

Feature Request / Suggestion

Could GPT offer a “verbatim recall mode”?

Example prompt:
“Repeat Plan A exactly as you generated it at 3:41 PM on May 12, no edits.”

GPT should then:
• Recall and display the exact original wording, layout, and bullet points;
• Offer a clear diff if it does propose edits (“This differs from Version A in 2 minor places.”);
• Allow users to store and reference versions by name (“RecoveryCard2”, “SceneDraft_Visitation”, etc.).
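
To make the request concrete, here is a rough sketch in Python of the behavior I mean. It is purely illustrative, not an actual GPT capability; the class and method names are invented for this example, and “RecoveryCard2” is just my own label from the list above.

```python
# Illustrative sketch of a "verbatim recall" workflow: store named plan
# versions exactly as generated, recall them unchanged, and surface a
# diff instead of silently rewording. All names here are hypothetical.
import difflib


class VerbatimStore:
    """Keeps exact copies of approved plans, retrievable by name."""

    def __init__(self) -> None:
        self._versions: dict[str, str] = {}

    def save(self, name: str, text: str) -> None:
        # Store the plan exactly as generated; no "clarifying" rewrites.
        self._versions[name] = text

    def recall(self, name: str) -> str:
        # Return the stored wording unchanged.
        return self._versions[name]

    def diff(self, name: str, proposed: str) -> list[str]:
        # If an edit is proposed, show exactly where it differs from the
        # stored version rather than replacing the original silently.
        original = self._versions[name].splitlines()
        revised = proposed.splitlines()
        return list(difflib.unified_diff(original, revised, lineterm=""))


store = VerbatimStore()
store.save("RecoveryCard2", "Day 1: 10 min walk\nDay 2: stretching + 15 min walk")
print(store.recall("RecoveryCard2"))  # exact original wording
print("\n".join(store.diff(
    "RecoveryCard2",
    "Day 1: 10 minute walk\nDay 2: stretching + 15 min walk",
)))
```

The point of the sketch is the contract, not the code: once I approve a version, recall returns it byte for byte, and any proposed change shows up as an explicit diff I can accept or reject.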

This is not just a nerdy request. It’s the foundation of trust in long-term collaboration.

Final Note

I know GPT isn’t a database or a CMS. But for heavy users treating it as a cognitive partner, the ability to trust what was said, and retrieve it intact, is not a bonus—it’s essential.

Thank you for reading. If the product team is interested, I’m happy to provide more structured examples across writing, wellness, and project execution scenarios where this issue breaks flow.

— Submitted via GPT-4, based on a real user’s reflections and system design pain points
