"Move context to new conversation" capability

Conversations can get pretty long, with a lot of prompts and answers iterating on different parts of a complex topic.

But once a conversation gets too complex, it becomes very hard to navigate through the past answers, as if only the last one mattered (which is not true at all: we users usually iterate within the same conversation precisely to NOT LOSE CONTEXT).

To keep conversations smaller without losing context, we need a “move context to a new conversation” capability.
Unsure what would happen behind the scenes (summarize the current conversation and feed it as a primary input to the new conversation’s prompts?), but the feature would be extremely useful.
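
Purely as a thought experiment, a summarization-based version might look something like the sketch below. The function names, the model choice, and the idea of carrying the summary over in a system message are my own assumptions, not anything that exists in ChatGPT today.

```python
# Hypothetical sketch of "move context to a new conversation":
# summarize the old conversation, then seed a fresh one with that summary.
# Function names and the model are placeholders, not a real ChatGPT feature.
from openai import OpenAI

client = OpenAI()

def summarize_conversation(messages: list[dict]) -> str:
    """Ask the model to compress the old conversation into a short brief."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat model would do here
        messages=messages + [{
            "role": "user",
            "content": "Summarize this conversation so it can seed a new one: "
                       "key facts, decisions, and open questions only.",
        }],
    )
    return response.choices[0].message.content

def start_conversation_with_context(old_messages: list[dict]) -> list[dict]:
    """Return the starting message list for the new, smaller conversation."""
    summary = summarize_conversation(old_messages)
    return [{
        "role": "system",
        "content": f"Context carried over from a previous conversation:\n{summary}",
    }]
```

The interesting design question is what the summary should preserve: key facts and decisions compress well, but exact wording or code would probably need to be carried over verbatim instead.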


I was going to post this as well, but halfway through posting I found this one. I second this :+1:

Here are my ramblings, based on the post I almost posted:
As a neurodivergent who loves to sort things arbitrarily, I always really like software that lets me categorize absolutely everything.

In Discord you have the features ‘forwarding’ and ‘threads’. These would translate pretty well to ChatGPT’s UI, I think.

  • A forward-like feature would move your most recent message(s) and ChatGPT’s replies to a different conversation, for when your conversation goes off-topic.

  • A threads feature could be achieved by letting the user create sub-conversations, optionally with auto-archiving of conversations over time to keep the conversation list clean.

    • This could also act more like Discord’s threads, where you create threads within conversations and only those expire into the archive after a while. Functionally the two are almost the same; it just depends on how you want to show it in the UI, and whether you want to auto-archive conversations or not.

Most of this could be front-end UI work, but some back-end optimizations could be paired with it as well: if a super specific conversation tangent is held in the context of the general, larger conversation, you are essentially wasting context tokens and thus server resources. Introducing threads would let the user go down very specific tangents while unloading irrelevant parts of the parent conversation from the model’s context.

ChatGPT, when queried about a theoretical implementation of threads, suggested this (a rough code sketch of the split follows the list):

Reserve a fixed token budget for:

  • Parent context summary (e.g., 10-15% of the total token limit).
  • Active child thread messages (majority of the tokens).
  • Dynamic reference space (tokens temporarily allocated for injecting parent or sibling context).
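
To make that split concrete, here is an illustrative calculation. The 12% and 15% figures are placeholders I picked within the suggested ranges, and the total window size is arbitrary:

```python
# Illustrative only: splitting a context window into the three buckets above.
TOTAL_CONTEXT_TOKENS = 128_000

PARENT_SUMMARY_SHARE = 0.12      # compressed summary of the parent conversation
DYNAMIC_REFERENCE_SHARE = 0.15   # space for injecting parent/sibling snippets on demand

def thread_token_budget(total: int = TOTAL_CONTEXT_TOKENS) -> dict[str, int]:
    parent_summary = int(total * PARENT_SUMMARY_SHARE)
    dynamic_reference = int(total * DYNAMIC_REFERENCE_SHARE)
    active_thread = total - parent_summary - dynamic_reference  # the majority
    return {
        "parent_summary": parent_summary,
        "active_thread": active_thread,
        "dynamic_reference": dynamic_reference,
    }

print(thread_token_budget())
# {'parent_summary': 15360, 'active_thread': 93440, 'dynamic_reference': 19200}
```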

A feature to move context, similar to Discord’s forwarding feature, could also save tokens by removing or invalidating the messages that are forwarded. That strips the now-irrelevant information from the original conversation, making sure only relevant information remains in that conversation’s LLM context.
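
Again purely as a thought experiment, the backend side of forwarding might look something like this (a made-up data model, obviously not how ChatGPT actually stores conversations):

```python
# Made-up data model: forwarding messages out of a conversation and
# dropping them from the source conversation's future LLM context.
from dataclasses import dataclass, field

@dataclass
class Message:
    id: str
    role: str                 # "user" or "assistant"
    content: str
    forwarded: bool = False   # True once moved to another conversation

@dataclass
class Conversation:
    id: str
    messages: list[Message] = field(default_factory=list)

    def context_for_llm(self) -> list[dict]:
        """Only non-forwarded messages count toward this conversation's context."""
        return [
            {"role": m.role, "content": m.content}
            for m in self.messages
            if not m.forwarded
        ]

def forward_messages(source: Conversation, target: Conversation, ids: set[str]) -> None:
    """Copy the selected messages to the target and invalidate them in the source."""
    for m in source.messages:
        if m.id in ids:
            target.messages.append(Message(m.id, m.role, m.content))
            m.forwarded = True
```

Whether forwarded messages get hard-deleted or just excluded from the context (so they stay visible in the UI history) is probably the main design choice here.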

I hope devs consider this :+1:


Might be worth republishing! The first issue didn’t get too far haha

Yes, I would love this too!