I was using codex-mini-latest without issue a few days back. When I try to use it today in the Codex CLI, I get:
```
system
⚠️ OpenAI rejected the request (request ID: req_74bdbb20c6fb85df5e7ffb67xxxxxxx). Error details: Status: 404, Code: unknown, Type: invalid_request_error, Message: 404 This model is only supported in v1/responses and not in v1/chat/completions.. Please verify your settings and try again.
```
Yes, I answered my own question. As I stated, I do not know how to fix this and cannot find documentation explaining how. I would not intentionally ask a foolish question merely to be ridiculed; I am trying to follow a classroom tutorial.
What the message is referring to is the difference between the Chat Completions API endpoint, which has been around for two years, and the newer Responses API endpoint.
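Roughly, the split looks like this; a minimal curl sketch with a placeholder prompt, assuming your key is in $OPENAI_API_KEY:

```bash
# Responses API endpoint -- where codex-mini-latest is served
curl https://api.openai.com/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "codex-mini-latest", "input": "Write a bubble sort in Python"}'

# Chat Completions API endpoint -- the same model here returns the 404 above
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "codex-mini-latest", "messages": [{"role": "user", "content": "Write a bubble sort in Python"}]}'
```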
It would seem an unlikely oversight for OpenAI to make a coding API model exclusive to Responses and then have the primary way of using it not call the correct endpoint. I would update to the latest Codex CLI code and see if that doesn't solve the woes seen here.
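Assuming the CLI was installed globally through npm, updating is just a reinstall and a version check:

```bash
# reinstall the global package at its latest published version
npm install -g @openai/codex@latest

# confirm which build is now on the PATH
codex --version
```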
Thank you for your assistance. Perhaps I have not made my issue clear:
“I was using codex-mini-latest without issue a few days back. When I try to use it today in the Codex CLI, I get the error.”
I can no longer use “codex-mini-latest”. I can use the default o4-mini without issues. I have full access to models as we are a verified organization (school education account).
Running codex --version gives: OpenAI Codex (research preview) v0.1.2505172129. This appears to be the latest version; it is what is shown after running npm install -g @openai/codex.