Can't set up Codex CLI with a custom base URL and API key via terminal env variables or command options?

The docs are not clear…

(And assuming I'm in a Docker container, where would Codex CLI look for ~/.codex/config.toml given that there is only /workspace? Anyway, the easy solution would be terminal env variables or Codex CLI command options.)


Welcome to the dev forum @editoReview

As far as I understand, Codex reads configuration from $CODEX_HOME/config.toml. If CODEX_HOME is not set, it defaults to ~/.codex. In a container the HOME directory is whatever the $HOME environment variable points to (often /root or /workspace), so Codex will look for $HOME/.codex/config.toml. If you want the config in /workspace, set CODEX_HOME=/workspace/.codex before running the CLI, or create /root/.codex/config.toml inside the container.
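For example, to redirect the config lookup into the mounted workspace (the /workspace path here is illustrative; adjust it to your container layout):

```shell
# Point Codex at a config directory inside the mounted workspace.
export CODEX_HOME=/workspace/.codex

# Codex will now look for its config file at:
echo "$CODEX_HOME/config.toml"
```

You'd still need to create that directory and put a config.toml in it, of course.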

The CLI also reads variables from a .env file in the current directory; the README notes that you can place your API key in a .env file and the CLI will automatically load it.
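So a minimal alternative to config.toml would be a file like this at the project root (the key value is a placeholder):

```shell
# .env at the project root -- loaded automatically by the CLI via dotenv/config
OPENAI_API_KEY=your-api-key-here
```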


Hi, the docs say what's quoted below… it's still confusing if you've never tried it, and it would really cost the docs writers nothing to include some examples…

So, for example, if I don't want to deal with config.toml, is it correct that I should do this?

export MYPROXY_API_KEY="your-api-key-here"
export MYPROXY_BASE_URL="https://your-provider-api-base-url"
codex --provider MYPROXY "my prompt"

Or should I instead do something similar but with the MYPROXY string replaced by OPENAI?

Are you guys deliberately keeping this undocumented…


FROM DOCS:

Quickstart

Install globally:

npm install -g @openai/codex

Next, set your OpenAI API key as an environment variable:

export OPENAI_API_KEY="your-api-key-here"

Note: This command sets the key only for your current terminal session. You can add the export line to your shell’s configuration file (e.g., ~/.zshrc), but we recommend setting it just for the session.

Tip: You can also place your API key into a .env file at the root of your project:

OPENAI_API_KEY=your-api-key-here

The CLI will automatically load variables from .env (via dotenv/config).

Use --provider to use other models

Codex also allows you to use other providers that support the OpenAI Chat Completions API. You can set the provider in the config file or use the --provider flag. The possible options for --provider are:

  • openai (default)

  • openrouter

  • azure

  • gemini

  • ollama

  • mistral

  • deepseek

  • xai

  • groq

  • arceeai

  • any other provider that is compatible with the OpenAI API

If you use a provider other than OpenAI, you will need to set the provider's API key either in the config file or as an environment variable:

export <provider>_API_KEY="your-api-key-here"

If you use a provider not listed above, you must also set the base URL for the provider:

export <provider>_BASE_URL="https://your-provider-api-base-url"

Run interactively:

codex

Or, run with a prompt as input (and optionally in Full Auto mode):

codex "explain this codebase to me"
codex --approval-mode full-auto "create the fanciest todo-list app"

That’s it - Codex will scaffold a file, run it inside a sandbox, install any missing dependencies, and show you the live result. Approve the changes and they’ll be committed to your working directory.
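Putting the quoted pieces together, a sketch for a custom OpenAI-compatible endpoint would look like the following. MYPROXY and the URL are placeholders, and I'm assuming the env var prefix must match the name passed to --provider, since that's what the `<provider>_API_KEY` / `<provider>_BASE_URL` pattern in the docs suggests:

```shell
# Placeholders: replace MYPROXY and the URL with your provider's details.
export MYPROXY_API_KEY="your-api-key-here"
export MYPROXY_BASE_URL="https://your-provider-api-base-url"

# Then run Codex, naming the same provider:
codex --provider MYPROXY "explain this codebase to me"
```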