Embedchain with o4-mini: error around max_tokens

Hi guys, has anyone tried to call the o4-mini API with embedchain? I have been trying to do that but keep getting a strange error message.
This is the error I am getting:

```
BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}
```

And this is the config.yaml (part of it) I am using:

```yaml
llm:
  provider: openai
  config:
    model: 'o4-mini'
```

As you can see, there is no max_tokens in my config.
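Since the config never sets it, my guess is that embedchain injects a default `max_tokens` into the request internally, which the o-series reasoning models reject. As a sketch of a possible workaround (this is a hypothetical helper, not part of the embedchain API), one could rename the parameter before the request reaches the OpenAI client:

```python
# Hypothetical shim (not embedchain's own code): o-series reasoning
# models reject `max_tokens` and expect `max_completion_tokens`, so
# rename the key for those models before sending the request.
def adapt_params(model: str, params: dict) -> dict:
    params = dict(params)  # copy so the caller's dict is not mutated
    if model.startswith(("o1", "o3", "o4")) and "max_tokens" in params:
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params

print(adapt_params("o4-mini", {"max_tokens": 256}))
```

Whether this can be hooked into embedchain cleanly depends on its internals; it may be simpler to check whether a newer embedchain release already handles the rename.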
Has anyone faced a similar issue, or know where I have gone wrong?
Thanks!
