I inquired about Enterprise. I know how much it costs, and it's expensive, but I also know they don't mess with it the way they mess with Plus. I expected them to ignore me, but instead they suggested I use Teams. So I took the opportunity to tell them exactly what I think of Plus and why I would not pay for higher rate limits on the garbage I'm getting right now.
I see people complaining about broken rate limits, but I can't even get that far. I would pay the Teams price to get back the ChatGPT-4 I had last week, but this garbage isn't even useful.
Recently, model-breaking problems have included undeclared variables, no concept of creating a function to do what it kept making up variables for, failing to pass a variable on to a module, and executing a module with the same key press that launched the module. These are basic things, yet even when I manually typed the correct code or provided a source reference for context, it could not overcome them.
I wasted over three hours trying, unsuccessfully, to get it to help me resolve a problem it caused. That forced me to go over the entire system line by line before I found it had subtly changed the order of arguments passed into a module, and even after I reviewed the module and the script that launched it, it kept doing so repeatedly. The arguments were similar, so I didn't notice visually when it swapped them, but an AI model should have known better. That's such basic stuff: you can't declare a module with one argument order and then change that order when you call the module. It crosses the data streams, so you end up with switched variables.
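To show why this kind of swap is so hard to spot, here is a minimal sketch with hypothetical names (not my actual code): a function declared with one argument order, then called with the order silently swapped. Both arguments are numbers, so nothing crashes; the values just cross streams and you get a wrong answer instead of an error.

```python
# Hypothetical example of the swapped-argument failure mode.
def apply_move(position, velocity):
    """Advance position by two steps of velocity."""
    return position + 2 * velocity

# Declared order respected: position=100, velocity=5.
right = apply_move(100, 5)    # 110

# Order swapped at the call site: position=5, velocity=100.
# No exception, no warning -- just a silently wrong result.
wrong = apply_move(5, 100)    # 205

print(right, wrong)
```

Because the call site looks perfectly plausible on its own, the only way to catch it is to compare it against the declaration, which is exactly the line-by-line audit I ended up doing.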
I tried simplifying what I asked of it, requesting help with only smaller code blocks or one function at a time. It can't even do that. It can't get syntax right, and now all of a sudden it keeps trying to use deprecated code. Last night I told it, "you can't load an animation that way, that's deprecated and doesn't work anymore; this is how you properly load an animation… [provided one of the lines that was wrong]" then [gave example of how it should load]…
ChatGPT-4: "You're right, I apologize, here's the corrected version without using any deprecated code… [proceeds to provide the exact same deprecated code I just told it was wrong]."
I understand that for most people using ChatGPT-4 for writing, these kinds of changes aren't as impactful, though I imagine many are still affected. When you use it for code, however, having it fluctuate between very useful one day and totally useless the next is infuriating.
One day it's helping me handle a bigger workload; the next, using it slows me down so much I can't keep up, because it has become a burden. So I'm stuck with a workload I wouldn't have taken on had I known it would suddenly turn to trash.
I know this particular platform has a deal with OpenAI, and it's not a small deal. That company has provided OpenAI with nearly all the reference data that exists to train with, including its knowledge base on things like which code is deprecated. There's no good reason for ChatGPT-4 to be completely broken by these basics, especially since deprecation happens slowly: the stuff that's actually broken and won't work is usually years old, so it's not a matter of the training data being too old. What it was doing last night was deprecated before ChatGPT even existed.
What makes it worse is that everything I'm doing now, I've done with ChatGPT since 3.5 Turbo. Once 3.5 Turbo 16k came out, the API didn't have these issues. It feels like the only solution is to update my chatbot and see if I can make the API more useful. I just need to add a feature to save conversations and to update them with context. Right now all I have is the option to copy the chat (so I can paste it as context) or copy a selection so I can pull the code out without the whole chat. I haven't even looked recently at changes in the API to see what options there might be.