It would be much better if ChatGPT could remember all previous chats.

I just asked a related question: Strategy for chat history, context window, and summaries

@ruv, do you summarize on every interaction?

@PJK can you provide additional details on how it improves the results?

@jochenschultz As a native Hebrew speaker I support vowel removal :slight_smile:, but I'm surprised you're seeing a reduction in the number of tokens. In fact, I just checked:

```
gish tell me a joke about a bike --no-stream
Why did the bike fall over? Because it was two-tired!
Tokens: 29 Cost: $0.00006 Elapsed: 0.83 Seconds

gish tll me a jke abt a bke --no-stream
Why did the bicycle fall over? Because it was two-tired.
Tokens: 33 Cost: $0.00007 Elapsed: 0.823 Seconds
```

Maybe it works on larger texts, but I doubt it.
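If anyone wants to check this directly rather than through API billing, here is a minimal sketch using OpenAI's tiktoken library (assuming `pip install tiktoken`; the prompts are the same ones from my test above). It prints the token count and the actual subword split for each prompt:

```python
# Minimal sketch: compare token counts with and without vowels,
# using OpenAI's tiktoken library (pip install tiktoken).
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

for prompt in ("tell me a joke about a bike",
               "tll me a jke abt a bke"):
    tokens = enc.encode(prompt)
    # Decode each token id individually to see how the BPE
    # tokenizer split the prompt into subword pieces.
    print(len(tokens), [enc.decode([t]) for t in tokens])
```

My understanding is that this is expected BPE behavior: common whole words like "tell" are single tokens, while vowel-stripped forms like "tll" fall outside the vocabulary and get split into multiple subword pieces, so the "compressed" prompt can actually cost more.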
