Error: The requested model 'xxx' cannot be used with the Assistants API

I wanted to play around a bit with the new Assistants API in my Python code. I have a fine-tuned GPT-3.5-turbo model and wanted to use it with the Assistants API, but I get:

The requested model 'ft:gpt-3.5-turbo-0613:personal::xxxxxx' cannot be used with the Assistants API

I tried fine-tuning a new model, but I get the same error. The documentation says the Assistants API can be used with all models, even fine-tuned ones.

Is there anything I'm doing wrong, or anything else I can do?

The complete error is:

openai.BadRequestError: Error code: 400 - {'error': {'message': "The requested model 'ft:gpt-3.5-turbo-0613:personal::xxxx' cannot be used with the Assistants API.", 'type': 'invalid_request_error', 'param': 'model', 'code': 'unsupported_model'}}
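For reference, here is a minimal sketch of the kind of call that triggers this, assuming the openai Python SDK v1.x; the assistant name and instructions are illustrative, and the prefix check / fallback helper is just a workaround idea (falling back to a base model), not an official fix:

```python
FINE_TUNE_PREFIX = "ft:"  # fine-tuned model ids start with this prefix


def is_fine_tuned(model_id: str) -> bool:
    """Return True if the model id looks like a fine-tuned model."""
    return model_id.startswith(FINE_TUNE_PREFIX)


def create_assistant(model_id: str):
    """Attempt to create an assistant with the given model.

    With a fine-tuned model id this currently raises
    openai.BadRequestError (400, code 'unsupported_model').
    """
    from openai import OpenAI  # openai Python SDK v1.x

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    return client.beta.assistants.create(
        name="Test assistant",                   # illustrative
        instructions="You are a helpful assistant.",  # illustrative
        model=model_id,
    )


def pick_model(model_id: str, fallback: str = "gpt-3.5-turbo") -> str:
    """Workaround sketch: substitute a base model until fine-tuned
    models are supported by the Assistants API."""
    return fallback if is_fine_tuned(model_id) else model_id
```

Guarding on the `ft:` prefix at least avoids the 400 at runtime, at the cost of losing the fine-tuned behaviour.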


Same problem here; it seems the API is not up to date with the documentation.


Same problem for me. Is there a plan to fix this, or can you simply not use a fine-tuned model?


I need to be able to combine those two as well. Please implement this.


+1, same need here. This would be super helpful.