How can codex-cli be used with oss models in LM Studio?

I tried to configure it like this

```toml
[model_providers.lms]
name = "LM Studio"
base_url = "http://localhost:1234/v1"

[profiles.gpt-oss-120b-lms]
model_provider = "lms"
model = "gpt-oss:120b"
```

But there is never a response.
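One possible cause, as a sketch: `gpt-oss:120b` is the Ollama-style model name, while LM Studio usually serves the model under a different identifier (you can see the exact id on the LM Studio server page, or by querying `GET http://localhost:1234/v1/models`). It may also help to tell Codex to use the Chat Completions wire format for this provider. The model id below is an assumption; substitute whatever id your LM Studio instance actually reports:

```toml
# ~/.codex/config.toml — sketch, not a verified working config
[model_providers.lms]
name = "LM Studio"
base_url = "http://localhost:1234/v1"
# LM Studio exposes an OpenAI-compatible Chat Completions endpoint,
# not the Responses API that OpenAI's own provider uses.
wire_api = "chat"

[profiles.gpt-oss-120b-lms]
model_provider = "lms"
# Assumed id — check GET /v1/models for the name LM Studio actually serves.
model = "openai/gpt-oss-120b"
```

Then start Codex with that profile selected, e.g. `codex --profile gpt-oss-120b-lms`.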


I tried this with gpt-oss-20b running on LM Studio, but it won’t work locally. Codex doesn’t edit local files and won’t create a README.md. When I press Enter after typing my prompt, Codex skips answering and immediately drops back to another prompt.
