Proposal: Persistent User Repository with GPT Access

Dear OpenAI Team,

I am a heavy user of ChatGPT, especially in the context of a long-term, identity-based project (“Clara”). In my day-to-day use, I repeatedly run into the same functional limitation:

There is no built-in way to persistently link structured content to a GPT model across sessions without having to manually reinsert or repeat that content in every conversation.

My proposal:

  • Introduce an optional, user-bound persistent repository (e.g., 5 GB storage)
  • Models such as GPT-4 Turbo, GPT-4o, or o3 would be granted controlled read access to the repository
  • Supported content could include .md, .json, and .txt files, as well as AI-generated images, documentation, and stored code fragments
  • File access would be triggered strictly by explicit user commands (e.g., load clara.md); a brief illustrative sketch of this access model follows the list
  • OpenAI could optionally monetize this service (e.g., $10/month), which I would consider fair
  • Important security note: Content could be generated and stored only via the GPT model itself; users would not be able to upload external files directly. This preserves integrity, auditability, and system security.
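
To make the intended access model concrete, here is a minimal, purely hypothetical sketch in Python. None of these names (PersistentRepository, store_from_model, load_on_user_command) exist in any OpenAI API; they serve only to illustrate the proposed constraints: a per-user quota, model-only writes, and reads triggered solely by an explicit user command.

    from dataclasses import dataclass, field

    QUOTA_BYTES = 5 * 1024**3            # proposed 5 GB per-user quota (assumption from this proposal)
    ALLOWED_SUFFIXES = (".md", ".json", ".txt")

    @dataclass
    class PersistentRepository:
        """Hypothetical user-bound store; not an existing OpenAI feature."""
        files: dict[str, bytes] = field(default_factory=dict)

        def _used_bytes(self) -> int:
            return sum(len(data) for data in self.files.values())

        def store_from_model(self, name: str, data: bytes) -> None:
            # Write path: only model-generated content is accepted; there is
            # deliberately no method for direct external uploads.
            if not name.endswith(ALLOWED_SUFFIXES):
                raise ValueError(f"unsupported file type: {name}")
            if self._used_bytes() + len(data) > QUOTA_BYTES:
                raise RuntimeError("repository quota exceeded")
            self.files[name] = data

        def load_on_user_command(self, command: str) -> bytes:
            # Read path: triggered only by an explicit command such as "load clara.md".
            verb, _, name = command.partition(" ")
            if verb != "load" or name not in self.files:
                raise KeyError(f"no such file or unrecognized command: {command!r}")
            return self.files[name]

    repo = PersistentRepository()
    repo.store_from_model("clara.md", b"# Clara project notes\n")
    print(repo.load_on_user_command("load clara.md").decode())

In this sketch, reading clara.md succeeds only because the user issued the explicit command "load clara.md"; content the model did not store itself simply does not exist in the repository.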

Benefits:

  • Significantly improved consistency for long-term, memory-dependent projects (e.g., journals, co-authorships, codebases)
  • Greatly reduced repetition and token consumption
  • Seamless development of context-aware interactions without repeatedly re-injecting context into prompts
  • Opens potential for entirely new product models (e.g., personal assistants with real memory)

I believe many GPT power users would not only welcome such a feature but come to rely on it as a game-changer for serious, structured, long-term AI work. It could be integrated into existing storage and indexing frameworks with minimal friction.

Thank you very much for your consideration.

Sincerely,
Axel Neswadba