Is streaming available via the Python SDK for the o1 model? I see that streaming isn’t available in the REST API, but does that restriction apply to the Python SDK as well?
Keywords: o1 API streaming; for configuration management staff supporting new releases.
There’s no reason SSE streaming shouldn’t be available as an option, continuing the capabilities already offered by o1-preview and o1-mini.
However, the API has been denying such requests for another large OpenAI customer, with validation failing: “this model does not support…”.
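For reference, here is a minimal sketch of what such an attempt looks like with the current Python SDK. The prompt is arbitrary and the broad exception handling is only there to surface whatever validation message comes back; whether the call is accepted appears to depend on model and account access:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

try:
    # Request a streamed chat completion against o1.
    stream = client.chat.completions.create(
        model="o1",
        messages=[{"role": "user", "content": "Say hello."}],
        stream=True,
    )
    # Print content deltas as the chunks arrive over SSE.
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
except Exception as err:
    # If stream=True is rejected for this model or organization,
    # the SDK raises an API error carrying the validation message.
    print(f"Streaming request rejected: {err}")
```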
Is a fix from OpenAI required, and would it be at the account level or in the API itself?
Hopefully it’s just an overlooked API change. Your post here should get someone to take note.
If streaming were universally unavailable, you’d expect more reports of the issue on the forum, so it could also be organization-based.
With the deployment dice and rollout roulette, I haven’t received o1 yet to corroborate. The rollout could probably go faster without infrastructure risk, since usage will largely be a migration away from o1-preview, which always runs at high reasoning effort.
Answered, whether by a decision or a limitation:
Hopefully the delay is from working towards implementing “thinking” event messages: reporting the progress of each reasoning generation, or even the token cost of those steps.
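Purely as speculation about what consuming such events might look like from the Python SDK side (none of the event types or fields below exist today; the entire shape is hypothetical):

```python
# Hypothetical sketch only: the "reasoning_progress" event type and its
# fields are invented here to illustrate the idea, not part of any real API.
def consume_stream_with_thinking(stream):
    reasoning_tokens = 0
    for event in stream:
        if getattr(event, "type", None) == "reasoning_progress":
            # A progress event could carry a step summary and its token cost.
            reasoning_tokens += getattr(event, "tokens", 0)
            print(f"[thinking] {event.summary} "
                  f"({reasoning_tokens} reasoning tokens so far)")
        elif event.choices and event.choices[0].delta.content:
            # Ordinary content deltas, as with today's chat completion chunks.
            print(event.choices[0].delta.content, end="", flush=True)
```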