I've been encountering a few 'You requested a model that is not compatible with this engine.' errors since last night when calling the gpt-4 API via Python.
I haven't seen them since the days of davinci-002. Anyone else having the same problem, and any idea why they might be cropping up again?
I would change the generic gpt-4 alias to point directly at the versioned model name gpt-4-0613.
It's possible there are some shenanigans afoot with the older model and which unpublished revision it actually points to. That could affect the alias differently than a model name that includes a version number.
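For what it's worth, pinning the snapshot is a one-line change. A rough sketch, assuming the openai Python package; `build_chat_request` is just a hypothetical helper to keep the call site tidy, and the prompt is a placeholder:

```python
# Pin the versioned snapshot instead of the floating "gpt-4" alias,
# so a backend re-pointing of the alias can't change which revision you hit.
PINNED_MODEL = "gpt-4-0613"  # versioned snapshot, not the "gpt-4" alias

def build_chat_request(prompt: str) -> dict:
    """Assemble the kwargs for a chat completion call (hypothetical helper)."""
    return {
        "model": PINNED_MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

# The actual call would then look something like this
# (requires the openai package and OPENAI_API_KEY to be set):
#
#   import openai
#   response = openai.ChatCompletion.create(**build_chat_request("Hello"))
#   print(response["choices"][0]["message"]["content"])
```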
No, I have not. I'm still working with the older version by choice, but to your point, v1 has been out since the dev conference and I hadn't encountered this error until yesterday. I'll follow through on this.
Yeah, that feels like the likely explanation to me as well, though I would have expected any model running on the gpt-4 base to share a similar inference engine, aside from maybe turbo, which shouldn't be referenceable via a gpt-4 call. But who knows what happens during backend testing, to be fair.
I've just started getting this error for the first time today. It's not happening often, perhaps one out of every few hundred calls, but it's running in an iterative environment, so most calls succeed, and the calls are identical except for the prompt.
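Since the failures are intermittent and the calls are otherwise identical, a retry with exponential backoff may paper over it until the backend settles. A minimal sketch, not the client library's built-in retry; `call_model` stands in for whatever function wraps your API call, and you'd narrow the exception type to the API's error class in practice:

```python
import random
import time

def call_with_retry(call_model, max_retries=5, base_delay=1.0):
    """Retry an intermittently failing call with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return call_model()
        except Exception:  # narrow to the API's error class in real code
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # exponential backoff with a little jitter
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)

# Demo with a stand-in call that fails twice, then succeeds:
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError(
            "You requested a model that is not compatible with this engine."
        )
    return "ok"

result = call_with_retry(flaky, base_delay=0.01)
print(result)
```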