I would like to request a flexible pricing model for the paid version of ChatGPT, based on the resources users consume. Currently there is a single subscription plan, which may not suit users who need more processing power or who want to retain extensive conversation logs in a single thread. This feature would let users pay for additional resources according to their specific requirements: increased processing power, more generated code, and comprehensive conversation logs without limitations. By offering pricing tiers based on resource consumption, OpenAI could give users greater flexibility and value while meeting the diverse needs of its user base.
I’m unclear on what you’re hoping for here. ChatGPT is already unlimited use for paid subscribers, maintains chat histories, and there is no limit I’ve found on the length of history it keeps in each thread.
Have you looked at the API? You can access it via the Playground or by connecting your API key to third-party tools like Chatblade (but be careful with API keys). It charges based on the number of tokens used, in both the prompt and the response.
Back in the day, using the API was substantially cheaper than subscribing for access to GPT-3.5. But the API costs for GPT-4 can often exceed the subscription price.
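To make that comparison concrete, here is a minimal back-of-the-envelope sketch. The per-1K-token prices below are assumptions based on 2023-era published rates (they change over time, so check the current pricing page), and the monthly usage figures are made up for illustration.

```python
# Rough cost comparison: per-token API billing vs. the flat ChatGPT Plus fee.
# The prices below are assumptions (2023-era published rates); check the
# current pricing page before relying on them.

PRICES_PER_1K = {
    # model: (prompt price, completion price) in USD per 1,000 tokens
    "gpt-3.5-turbo": (0.0015, 0.002),
    "gpt-4": (0.03, 0.06),
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the USD cost of a single API call from its token counts."""
    prompt_price, completion_price = PRICES_PER_1K[model]
    return (prompt_tokens / 1000) * prompt_price + (completion_tokens / 1000) * completion_price

# Hypothetical month of heavy use: 500 calls, ~1,000 prompt + 500 completion tokens each.
monthly_gpt35 = 500 * estimate_cost("gpt-3.5-turbo", 1000, 500)
monthly_gpt4 = 500 * estimate_cost("gpt-4", 1000, 500)
print(f"GPT-3.5: ${monthly_gpt35:.2f}/mo, GPT-4: ${monthly_gpt4:.2f}/mo vs. $20 flat")
# → GPT-3.5: $1.25/mo, GPT-4: $30.00/mo vs. $20 flat
```

Under these assumed rates, GPT-3.5 via the API comes out far cheaper than the subscription, while the same usage on GPT-4 already exceeds it, which matches the pattern described above.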
GPT-4 usage is limited. Conversations revert to GPT-3.5 once the limit is exceeded.
I don’t have the knowledge to use an API. I just use the website for everything.
I’m aware of that.
3.5 is the product. GPT-4 is not part of ChatGPT Plus; it’s just something they give Plus subscribers as a bonus.
ChatGPT ≠ GPT-4
It never did. It’s something they are allowing us to try.
They may eventually decide to update ChatGPT to be GPT-4, but to the best of my knowledge they have not yet done so.