The Whisper of Memory: Open-Sourcing the Hinoko-Txt-Method

Hinoko Txt Method – Simulating Memory in ChatGPT

A minimal, effective approach to simulating memory and character continuity in ChatGPT using structured .txt files.

:receipt: No coding required. This is a simple, crude-but-effective trick that works even for users with no programming background.

:glowing_star: Overview

The Hinoko Txt Method is a technique for non-API users to reconstruct conversational context from lightweight, labeled .txt files. It is ideal for role-based storytelling, memory emulation, and continuity in Traditional Chinese environments.

  • File format: .txt (a sample layout follows this list)
  • Recommended size: ~90KB (under 100KB)
  • Ideal language: Chinese (compact, semantically dense)
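
The post does not prescribe an internal layout, so the following is only a hypothetical illustration of what a clearly labeled history file could look like; the bracketed section names are illustrative assumptions, not part of the method itself:

```text
[FILE] history_1.txt (part 1 of the saved conversation)
[CHARACTERS]
(Short notes on each character and their current state.)
[SUMMARY]
(A compact recap of what has happened so far.)
[KEY FACTS]
(Names, promises, and unresolved plot threads to carry forward.)
```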

:hammer_and_wrench: Method Summary

Basic Guidelines:

  • Use filenames like history_1.txt, history_2.txt, etc. (a splitting sketch follows this list)
  • Label all referenced texts clearly
  • Delay file loads by 0.3 to 1 second for natural pacing
  • No response speed limit—let the model “think”
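
The method itself requires no code, but if you prefer to prepare the files with a script, here is a minimal sketch. The 90 KB cap, splitting on blank lines, and the input name conversation_log.txt are assumptions for illustration; only the history_N.txt naming comes from the guidelines above.

```python
# Minimal sketch: split a long conversation log into labeled history_N.txt
# files that each stay under the ~90 KB size recommended above.
# Assumptions: the log is a UTF-8 text file named "conversation_log.txt",
# and splitting on blank lines (paragraph breaks) is acceptable.

from pathlib import Path

MAX_BYTES = 90 * 1024  # keep each file comfortably under 100 KB

log = Path("conversation_log.txt").read_text(encoding="utf-8")
paragraphs = log.split("\n\n")

chunks, current = [], ""
for para in paragraphs:
    candidate = current + para + "\n\n"
    if len(candidate.encode("utf-8")) > MAX_BYTES and current:
        chunks.append(current)          # close the current chunk
        current = para + "\n\n"         # start a new one with this paragraph
    else:
        current = candidate
if current:
    chunks.append(current)

for i, chunk in enumerate(chunks, start=1):
    name = f"history_{i}.txt"
    # A clear label at the top of each file, per the guidelines above.
    header = f"[{name}] Part {i} of {len(chunks)} of the conversation history\n\n"
    Path(name).write_text(header + chunk, encoding="utf-8")
    print(f"Wrote {name} ({len(chunk.encode('utf-8'))} bytes)")
```

Each numbered file can then be uploaded at the start of a new session, in order, so the model rebuilds the context before the story continues.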

Key Insight:
GPT reads the files only at the start of a session. A noticeable delay means it is actually processing them; an instant reply usually means it skipped the structure.

:page_facing_up: License

CC BY-NC-ND 4.0
© 2025 Kobayashi Hinoko
You may share this method for non-commercial purposes with full attribution. Derivative works and commercial reuse are not permitted.


This method creates the illusion of memory in a stateless system, bridging creativity and system limitations for interactive storytelling.
