Hi, I've tried many ways to add the o3 model to Codex, both in the CLI and in the VS Code plugin, and none of them worked.
The plugin just ignores the model settings in my config.toml, and in the CLI I get this error when I try to talk to the agent:
stream error: unexpected status 400 Bad Request: {"detail":"Unsupported model"}; retrying 4/5 in 1.72s…
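
For context, here's roughly what I put in my config.toml (a minimal sketch; I'm assuming the top-level `model` key is the right way to switch models):

```toml
# ~/.codex/config.toml
# Minimal sketch of what I tried; I'm assuming this key is what
# should set the model for both the CLI and the VS Code plugin.
model = "o3"
```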
I tried searching online and even consulting ChatGPT for this, but came up empty.
Can somebody help?
Thanks in advance.
Edit: While searching through the Codex GitHub issues, I came across one that says you can only change models when using an API key, not through your subscription. Very weird. I hope this becomes available to subscribers soon as well.