Feature Request: Let Codex Build and Embed Small Task-Specific Local Models

Hello OpenAI team,

I would like to request a Codex feature for local-first software development.

Feature idea:
Allow Codex to optionally prepare, adapt, quantize, and integrate a small task-specific local model into the applications it builds.

Example:
If I ask Codex to build a debt-management desktop app, it should optionally be able to prepare a lightweight local model for that app’s specific use case and integrate it into the generated project, so that the app’s core AI features work offline.

Why this would be valuable:

  • offline, privacy-friendly app experiences
  • less end-user dependency on external APIs (no keys, no per-call costs)
  • generated apps whose AI features work out of the box
  • stronger local-first developer workflows
  • better support for specialized or domain-specific software

Important note:
This could be gated by plan or compute limits: higher tiers could support larger local-model workflows, while standard paid tiers support smaller task-specific models.

What I’m asking for:

  • optional local model preparation inside Codex
  • optional task adaptation / lightweight fine-tuning flow
  • optional quantization/export flow
  • ability to integrate the resulting lightweight model into generated desktop or web apps
  • clear controls for model size, runtime limits, and deployment target
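Of these steps, the quantization/export flow is the easiest to illustrate concretely. Below is a minimal sketch of what I mean, assuming nothing about Codex's internals: symmetric per-tensor int8 post-training quantization in plain Python. The function names (`quantize_int8`, `dequantize_int8`) are illustrative only, not an existing Codex or OpenAI API.

```python
# Hypothetical sketch of one piece of a "quantize/export" flow:
# symmetric per-tensor int8 quantization, standard library only.

def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0  # symmetric int8 range is [-127, 127]
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float weights for inference."""
    return [v * scale for v in q]

if __name__ == "__main__":
    weights = [0.12, -0.98, 0.40, 0.003]
    q, scale = quantize_int8(weights)
    restored = dequantize_int8(q, scale)
    # Per-weight error is bounded by half a quantization step.
    assert all(abs(a - b) <= scale / 2 + 1e-9
               for a, b in zip(weights, restored))
```

A real flow would of course operate on tensors (e.g. GGUF or ONNX export) rather than Python lists; the point is only that storing int8 values plus a scale shrinks the model roughly 4x versus float32, which is what makes bundling a model into a desktop app practical.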

This would make Codex much more powerful for developers building real local-first software.

Thanks.