o1 models and streaming result in error 400

Hi all :wave:
When testing the new o1-preview and o1-mini models, passing the stream: true parameter results in a 400 error.
Has anyone run into this problem? If so, how did you solve it?


The o1 models are currently in beta and do not yet support streaming.

You can read up on this in the docs for the new models:

https://platform.openai.com/docs/guides/reasoning/beta-limitations
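Until streaming lands, one workaround is to drop the stream flag before sending requests to o1-family models. A minimal sketch (the helper name, model list, and request shape are assumptions, not part of the official SDK):

```python
# Sketch: strip the unsupported `stream` flag for o1-family models
# before building the API request. Model names are illustrative.
O1_MODELS = {"o1-preview", "o1-mini"}

def prepare_request(params: dict) -> dict:
    """Return request params with `stream` removed for o1 models."""
    params = dict(params)  # copy so the caller's dict is untouched
    if params.get("model") in O1_MODELS and params.get("stream"):
        # The o1 beta rejects stream=true with a 400, so drop it here.
        params.pop("stream")
    return params

print(prepare_request({"model": "o1-mini", "stream": True}))
```

Other models keep their stream flag unchanged, so the helper can sit in front of all chat completion calls.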


Thanks so much for the help; I didn’t notice that the first time I read it.
