System message utilizing compression

Does anyone know if there is a character limit on the system message? People are currently finding ways to compress text by asking GPT-4 to do it; I'll post the prompt below. If we could compress contextual data into the system message, giving it the maximum amount of context, I feel it would only increase the system message's potency when designing a character to interact with (Pathfinder, in my use case).

Prompt:
Compressor: compress the following text in a way that fits in a tweet (ideally) and such that you (GPT-4) can reconstruct the intention of the human who wrote the text as close as possible to the original intention. This is for yourself. It does not need to be human readable or understandable. Abuse of language mixing, abbreviations, symbols (unicode and emoji), or any other encodings or internal representations is all permissible, as long as it, if pasted in a new inference cycle, will yield near-identical results as the original text:
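
In case anyone wants to try it programmatically, here is a rough sketch of how I'd wire it up with the openai Python package (pre-1.0 interface). The model name, roles, and the Pathfinder placeholder text are just my assumptions, not anything official:

```python
# Rough sketch: compress long character notes with GPT-4, then reuse the
# compressed blob as the system message of a fresh conversation.
# Assumes the openai Python package (pre-1.0 interface) and an API key in
# the OPENAI_API_KEY environment variable.
import openai

COMPRESSOR = (
    "Compressor: compress the following text in a way that fits in a tweet "
    "(ideally) and such that you (GPT-4) can reconstruct the intention of "
    "the human who wrote the text as close as possible to the original "
    "intention. This is for yourself. It does not need to be human readable "
    "or understandable. Abuse of language mixing, abbreviations, symbols "
    "(unicode and emoji), or any other encodings or internal representations "
    "is all permissible, as long as it, if pasted in a new inference cycle, "
    "will yield near-identical results as the original text:"
)

def compress(text: str) -> str:
    """Ask GPT-4 to produce a compressed representation of `text`."""
    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": f"{COMPRESSOR}\n\n{text}"}],
    )
    return response["choices"][0]["message"]["content"]

# The compressed blob then becomes the system message of a new conversation:
character_notes = "...long Pathfinder character backstory and rules notes..."
compressed = compress(character_notes)
reply = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": compressed},
        {"role": "user", "content": "Stay in character and greet the party."},
    ],
)
print(reply["choices"][0]["message"]["content"])
```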

Video of someone using it successfully, but not in a system message yet: https://twitter.com/mckaywrigley/status/1643593517493800960

GPT-4 currently has an 8k-token limit covering input and output combined, with a 32k-token context planned for the future. If there is a compression scheme it genuinely understands, sure, use it, but it is probably hallucinating, so beware. If it turns out it isn't, publish your results here! :smiley:


Is that 8k limit for the system message alone? That is, could I fit 8k tokens' worth into the system message by itself? I appreciate your time.

The 8k covers everything: system, user, and assistant messages, plus the output.
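
If you want to check how much of that budget your compressed system message eats up before you send anything, you can count tokens locally with tiktoken. A rough sketch; the variable name is a placeholder, and the count ignores the few tokens of formatting overhead the chat format adds per message:

```python
# Rough sketch: count tokens before sending, so the compressed system message
# plus the rest of the conversation (and room for the reply) stays under 8k.
# Assumes the tiktoken package; GPT-4 uses the cl100k_base encoding.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")

def count_tokens(messages: list[dict]) -> int:
    """Approximate token count for a list of chat messages.

    Counts only the message contents; the chat format adds a few extra
    tokens of overhead per message on top of this.
    """
    return sum(len(enc.encode(m["content"])) for m in messages)

messages = [
    {"role": "system", "content": compressed_character_sheet},  # hypothetical variable
    {"role": "user", "content": "Roll initiative."},
]
budget = 8192
used = count_tokens(messages)
print(f"{used} tokens in, roughly {budget - used} left for the reply")
```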


Got it, I’ll post here if I am successful!
