Yes they are discontinuing support in about 3 days
They recommend you switch to GPT 3.5 Turbo. You will have to swap to the new chat format of sending prompts when you use the new endpoint
Yes they are discontinuing support in about 3 days
They recommend you switch to GPT 3.5 Turbo. You will have to swap to the new chat format of sending prompts when you use the new endpoint
Ohh, the max_token for code-davinci is 8012 but for gpt 3.5 its 4096. Plus fine-tuning is also not an option.
WIll have to think because my prompt size is > 5000
oh wowā¦ so, was I right or was I right?
Iām really pissed off at OpenAI right nowā¦
Yeah, you have an amazing crystal ball, @nunodonato You nailed it!
Do you think gpt-4
will be the model for GitHub Copilot now?
Iām honestly totally lost in understanding this move. I donāt see how forcing us to use āchatā will work well for code completions.
AFAIK the āinsertā mode was used by Copilot, it was great for code completions. Chat doesnāt have thatā¦
Answering the question from the topicā¦
Yes, I still use Codex because:
GPT-4 is the only hope now, but itās expensive.
remember, there is no āgpt-4ā, only āchatgpt-4ā. that sweet sweet conversation when all you want is a few lines of code
In our early testing, it wasnāt too bad to get ChatGPT to just return code: Use ChatGPT instead of Codex for Code Generation | The Inner Join
I think the chat format can actually be really good for the kind of one-shot prompt you showed above, too, because you can send a user message/assistant message pair with the response and format youād like before sending further user messages.
I do wish we had more notice, though, as weāre still using Codex in production and wouldnāt mind a longer timeline to get switched over!
After rereading the full email on this from OpenAI, they may only be deprecating codex models via the public dev API.
However, it may still be possible a different codex model will continue for GitHub Copilot? @logankilpatrick will GitHub Copilot still use a codex version and not migrate to gpt-4
?
Strange, I havenāt received this email yet given how this isnāt even a waitlist - which would make sense.
Here ya go @sps
On March 23rd, we will discontinue support for the Codex API. All customers will have to transition to a different model. Codex was initially introduced as a free limited beta in 2021, and has maintained that status to date. Given the advancements of our newest GPT-3.5 models for coding tasks, we will no longer be supporting Codex and encourage all customers to transition to GPT-3.5-Turbo.
About GPT-3.5-Turbo
GPT-3.5-Turbo is the most cost effective and performant model in the GPT-3.5 family. It can both do coding tasks while also being complemented with flexible natural language capabilities.
You can learn more through:
Models affected
The following models will be discontinued:
We understand this transition may be temporarily inconvenient, but we are confident it will allow us to increase our investment in our latest and most capable models.
āThe OpenAI team
Ask yourself why their latest models are all locked into the chat api when the only difference between that and the normal text completion endpoint is that we have less control over how it behaves.
I suspect their ultimate goal is to phase out text completion entirely so they can put more guardrails around what we can do with it in their never-ending pursuit of providing a safe and unusable product.
Iām slightly frustrated in not having the freedom to prompt without following a strict structure as well.
However, ChatML so far has done everything that Iām looking for, and it also prevents prompt injections - which is huge. It also helps with formatting which is nice. I havenāt needed a stop sequence since using it.
I imagine theyāre dropping it simply because they donāt want to be supporting it alongside ChatML when ChatML ideally does it all better, and safer. Of course, being a part of the āin-betweenā of something thatās actively being developed will have some issues. All part of the ride Iād say.
@semlar I fear the sameā¦ and the fact that they are not even clear about their roadmaps regarding these models makes me very uncomfortable with developing with them (same with codex). Right now I have been fine-tuning curie models for a project, if they pull the plug on this ability in a few months, what happens?
Today, for the first time, I started to seriously look into alternatives and signed up for the Claude waiting list.
Email notice today: "On March 23rd, we will discontinue support for the Codex API. All customers will have to transition to a different model. Codex was initially introduced as a free limited beta in 2021, and has maintained that status to date. Given the advancements of our newest GPT-3.5 models for coding tasks, we will no longer be supporting Codex and encourage all customers to transition to GPT-3.5-Turbo.
About GPT-3.5-Turbo
GPT-3.5-Turbo is the most cost effective and performant model in the GPT-3.5 family. It can both do coding tasks while also being complemented with flexible natural language capabilities."
I think youāre looking at this the wrong way.
This is similar to when text-embedding-ada-002 replaced 5 separate models because it performed as well, or better. I also use lesser models such as Ada for my own work. I would be blown away if they decided to completely remove them without some sort of way to transfer our training.
Itās best to follow what openai.com is compelled to do with whatever its investment partner, Microsoft does regarding your post.
Microsoft ā>github newsā>co-pilotā>Codex.
Personally Iām appalled at the entire monopoly and am looking at NVidea.
Iāve been using GPT-4 for coding since Friday, and I think I may revisit Codex.
GPT-4 is like a āCoder Buddy,ā but you DO need to know what youāre doing.
It tried to get me to drop EVAL statements in my code last night until I pointed out security heh
That said, GPT-4 has been a super useful tool for me. You just have to know what information to give it and how to ask questions. Itās really sped up my dev time, and Iām not a real coder.
All that said, Iām thinking of firing up Codex again because Iāve been hearing good things. I donāt code a lot, but when I do, I like to get in the zone and get it done. GPT-4 has helped immensely with that.
ETAā¦ and I just read more of the thread and realized Codex is going away? hahaā¦
funny, I just hit the exact same roadblock last night. Tried to get it to write some code and it just refused. Even after I said it was just a local tool and there were no security issues.
Some HAL9000 sh*tā¦
Sam Altman tweeted an hour or so ago, that based on feedback, they have decided to keep supporting the Codex endpoint for āresearchersā.
There is no indication what āresearchersā means at this time.