Summary:
There is a persistent data loss issue when switching between O3 Mini and GPT-4o across multiple devices. Conversations fail to sync in real time, and in some cases, messages from the last 30 minutes are permanently lost. Restarting both apps does not restore missing messages, indicating a potential session persistence or DBMS misconfiguration issue.
This is not just a UI bug but a deeper architectural issue affecting data integrity, session management, and cloud sync behavior. As ChatGPT expands into enterprise use cases, where multiple teams interact with AI agents in real time, this issue poses a reliability risk for AI-powered workflows at scale.
This impacts users who rely on multi-model workflows for long-form discussions, leading to workflow disruption, lost context, and potential data integrity concerns.
Steps to Reproduce
Step 1: Start a long-form conversation in O3 Mini on Device A (mobile or PC app).
Step 2: Continue the discussion for 30+ minutes.
Step 3: Switch to GPT-4o on Device B (another PC or phone).
Step 4: Return to O3 Mini on Device A and check if previous messages remain intact.
Step 5: Close and reopen both apps and check whether messages sync properly.
Step 6: Observe whether the latest messages:
- Sync with a delay
- Are missing or truncated
- Are permanently lost after switching models
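To make this check repeatable, the steps above can be scripted. The sketch below assumes a hypothetical automation harness in which each device session exposes send(), switch_model(), fetch_history(), and restart(); these are illustrative placeholders, not part of any official OpenAI client, and they simply mirror the manual steps.

```python
import time

def reproduce(device_a, device_b, minutes: int = 30) -> dict:
    """Automated version of the manual reproduction steps.

    device_a / device_b are hypothetical session objects for Device A and
    Device B; send(), switch_model(), fetch_history(), and restart() are
    assumed interfaces, not an official OpenAI API.
    """
    sent = []
    deadline = time.time() + minutes * 60

    # Steps 1-2: long-form conversation in O3 Mini on Device A.
    device_a.switch_model("o3-mini")
    while time.time() < deadline:
        message = f"probe message {len(sent)}"
        device_a.send(message)
        sent.append(message)
        time.sleep(60)                      # roughly one exchange per minute

    # Step 3: switch to GPT-4o on Device B and continue the same thread.
    device_b.switch_model("gpt-4o")
    device_b.send("continuing the same conversation on another device")

    # Steps 4-5: return to Device A, then close and reopen both apps.
    before_restart = device_a.fetch_history()
    device_a.restart()
    device_b.restart()
    after_restart = device_a.fetch_history()

    # Step 6: classify each probe message as delayed or lost.
    return {
        "delayed": [m for m in sent if m not in before_restart and m in after_restart],
        "lost": [m for m in sent if m not in after_restart],
    }
```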
Expected Behavior
Conversations should sync instantly across models and devices.
Closing and reopening apps should not result in missing messages.
Chat history should be fully retained in OpenAI’s backend, ensuring data persistence.
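Stated as an invariant, using the same hypothetical session objects as in the sketch above: after any model switch, device switch, or app restart, every device should converge to the same, complete transcript.

```python
def assert_sync_invariant(devices, canonical_transcript: list[str]) -> None:
    """Expected behavior: every device converges to the full transcript.

    `devices` are the hypothetical session objects from the sketch above;
    `canonical_transcript` is the list of messages actually sent.
    """
    for device in devices:
        device.restart()                    # closing and reopening must not drop data
        history = device.fetch_history()
        missing = [m for m in canonical_transcript if m not in history]
        assert not missing, f"{device}: {len(missing)} messages missing after restart"
```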
Actual Behavior
Recent messages fail to sync properly when switching between O3 Mini and GPT-4o.
Closing and reopening the app results in messages from the last 30 minutes being lost.
Multi-device, multi-model workflows do not retain session history, leading to data fragmentation.
Severity: High (P2) – DBMS Misconfiguration & Session Handling Failure
This is a data loss issue affecting AI session integrity.
Users who rely on AI for long-form discussions, research, security testing, or enterprise workflows cannot trust session continuity.
If conversations are not properly stored before syncing, this could indicate deeper issues in OpenAI’s DBMS replication or caching.
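To illustrate the suspected failure mode in general terms (this is a generic sketch, not a claim about OpenAI's actual architecture): if a new message is acknowledged from an in-memory cache and only written to the durable store asynchronously, then a restart, crash, or node switch inside that window drops exactly the most recent messages, which matches the loss pattern described above. Writing durably before acknowledging avoids it.

```python
# Generic illustration of two write paths; the cache/db objects and their
# methods are placeholders, not OpenAI's actual storage layer.

class CacheFirstStore:
    """Lossy pattern: acknowledge from cache, persist asynchronously later."""
    def __init__(self, cache, db):
        self.cache, self.db = cache, db

    def append_message(self, conversation_id, message):
        self.cache.append(conversation_id, message)             # visible immediately
        self.db.enqueue_async_write(conversation_id, message)   # durable write deferred
        return "ok"  # if the process dies before the flush, the message is gone


class WriteAheadStore:
    """Safe pattern: persist (and replicate) first, then populate the cache."""
    def __init__(self, cache, db):
        self.cache, self.db = cache, db

    def append_message(self, conversation_id, message):
        self.db.write(conversation_id, message)        # durable, replicated write first
        self.cache.append(conversation_id, message)    # cache is only a read optimization
        return "ok"  # any device that syncs from here on sees the message
```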
How OpenAI Can Verify This
Check backend session logs for inconsistencies when switching between models.
Analyze whether AI-generated messages are being cached or written correctly before sync requests.
Test session persistence when switching devices, focusing on token handling in long-form AI conversations.
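One concrete way to run the third check, again with the hypothetical harness from the reproduction sketch: fingerprint the transcript each device sees after the model/device switch and compare it against the server-side record pulled from the session logs.

```python
import hashlib

def transcript_fingerprint(messages: list[str]) -> str:
    """Order-preserving hash of a transcript."""
    digest = hashlib.sha256()
    for message in messages:
        digest.update(message.encode("utf-8"))
        digest.update(b"\x00")              # separator keeps concatenation unambiguous
    return digest.hexdigest()


def check_session_persistence(devices, backend_transcript: list[str]) -> dict:
    """Compare each device's view of the conversation against the backend copy.

    `devices` expose the hypothetical fetch_history() used above;
    `backend_transcript` stands in for the server-side record from session logs.
    """
    expected = transcript_fingerprint(backend_transcript)
    return {
        device: transcript_fingerprint(device.fetch_history()) == expected
        for device in devices
    }
```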
Enterprise-Level Risks (Why This Must Be Addressed Now)
As ChatGPT expands into enterprise environments, where AI agents act as persistent team members, this issue presents serious risks:
Enterprise AI Agents Need Reliable Syncing
Engineering, security, compliance, and legal teams will query AI models for role-specific insights.
If sync fails across devices, decision-making could be delayed or compromised.
Multi-User Collaboration Relies on AI Memory
If multiple employees interact with the same AI agent across devices, they need continuous session history.
Losing AI-generated insights mid-conversation disrupts workflows.
Security & Compliance Risks in AI-Powered Teams
Regulated industries require AI-generated records to be audit-ready.
If conversations fail to sync or are lost, organizations could face compliance risks.
AI Agents Will Act as Persistent Team Members
In enterprise setups, AI-powered assistants must store and retrieve historical context reliably.
If AI memory retention fails, enterprise adoption will be severely impacted.
My Usage & Why This Bug Affects Power Users
I am a high-frequency user engaging in deep, long-form conversations daily for research, security testing, and AI-driven workflows. My interactions often span multiple AI models (O3 Mini, GPT-4o) and multiple devices running the same model. In some cases, the session on the second device registers as a voice chat and appears as "Voice Chat Ended" in ChatGPT.
This bug significantly impacts users who rely on ChatGPT for structured thought processes and real-time multi-device interactions. If messages disappear without warning, it reduces AI’s reliability as a persistent workspace tool.