It eats up your money very effectively, because after each call ChatGPT “forgets” all the previous questions, and to bring it back to its latest state, each newly created openai.ChatCompletion call has to send the entire chain of questions again and again! That is, by the 5th question, question 1 will have been sent (and paid for) 5 times, question 2 — 4 times, and so on. What if the dialogue is 20 questions long?! You’ll quickly be left without pants!)))
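A quick back-of-the-envelope check of that claim (a sketch only, assuming every question is roughly the same number of tokens and ignoring the answers, which get re-sent too):

```python
# With n questions, question k is re-sent on calls k..n, i.e. (n - k + 1) times,
# so the total number of question-sized payloads billed is 1 + 2 + ... + n = n*(n+1)/2.
n = 20
total_question_sends = sum(n - k + 1 for k in range(1, n + 1))
print(total_question_sends)  # 210 question-sized payloads for a 20-question dialogue
```

So the input-token bill grows roughly quadratically with the length of the conversation, not linearly.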
Why can’t you open a single openai.ChatCompletion object at the start and then ask it questions sequentially, with the state preserved? I don’t understand?!
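For anyone wondering what this looks like in practice, here is a minimal sketch of the resend pattern being complained about, using the pre-1.0 openai Python library’s openai.ChatCompletion interface mentioned above (the API key, model name, and questions are placeholders). The client, not the API, holds the conversation, so the whole messages list is sent — and billed — on every call:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

messages = []  # the client keeps the conversation state, not the server

def ask(question: str) -> str:
    # Append the new user turn, then send the ENTIRE history again.
    messages.append({"role": "user", "content": question})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=messages,      # every previous turn is re-sent (and re-billed) here
    )
    answer = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": answer})
    return answer

# By the 5th call, the first question (and its answer) has been sent 5 times.
for q in ["question 1", "question 2", "question 3", "question 4", "question 5"]:
    print(ask(q))
```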
Owners’ greed?
Do you think that support for saving state simply wasn’t designed in from the very beginning? That is, in the GUI on the official OpenAI site, is the current model state not saved in threads either, but only the history of questions — which is likewise re-sent from the very beginning whenever a person asks a new question in a thread?
It would be interesting to hear the developers’ response…
The OP marked this post as the solution before changing it to my post. Most moderators will not mark a post as the solution, and I will not for this topic.
As it seems the OP is OK with this topic being closed, I will close it.