Is there a plan to reduce the price charged for repeated input tokens?

Where did you hear that?

My current speculation (and it could very well be wrong) is that the input token price is a pseudo stand-in for VRAM leasing (Insights on ChatGPT Enterprise Using GPT-4-1106-Preview Based on Context Length Specifications - #2 by Diet).

If that is the case, then it wouldn’t make sense for them to remove that cost.
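For a sense of scale, here’s a minimal back-of-the-envelope sketch of why long prompts tie up VRAM per request, assuming a purely hypothetical transformer (OpenAI doesn’t publish GPT-4’s architecture, so every dimension below is made up): each prompt token pins key/value vectors in memory for the lifetime of the call, whether or not you’ve sent that token before.

```python
# Rough sketch only. All model dimensions are hypothetical/illustrative;
# OpenAI does not publish GPT-4's architecture.

def kv_cache_bytes(seq_len: int,
                   n_layers: int = 80,       # assumed transformer depth
                   n_kv_heads: int = 8,      # assumed grouped-query KV heads
                   head_dim: int = 128,      # assumed per-head dimension
                   bytes_per_value: int = 2  # fp16
                   ) -> int:
    """Approximate KV-cache footprint for one request at a given context length.

    Every prompt token stores one key and one value vector per layer, so even
    "repeated" input tokens occupy fresh VRAM each time you make a call.
    """
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_value * seq_len


if __name__ == "__main__":
    for tokens in (8_000, 32_000, 128_000):
        gib = kv_cache_bytes(tokens) / 2**30
        print(f"{tokens:>7} prompt tokens -> ~{gib:.1f} GiB of KV cache")
```

Under those assumed numbers, a 128k-token prompt works out to tens of GiB of KV cache for a single request, which is roughly why resending the same context keeps costing input tokens.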

The only reason I could see them offering that is if they really, really want to push Assistants. But even if they hypothetically did that, it would be a trap, and I don’t think you should fall for it :thinking: