Since yesterday I have started receiving “InvalidRequestError: You requested a model that is not compatible with this engine. Please contact us through our help center at help.openai.com for further questions.” when I use gpt-4-1106-preview. I have checked the documentation and it seems that gpt-4-1106-preview is still available. Has anyone run into the same problem?
Yes, it’s ongoing across all gpt-4 models.
You’ll have to update your code and add a retry method if you get this error.
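For example, something along these lines (a minimal sketch using the official openai Python package, version 1.x; the retry count and sleep are arbitrary, and on the pre-1.0 package the exception to catch is openai.error.InvalidRequestError instead of openai.BadRequestError):

import time
import openai

client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment

for attempt in range(5):
    try:
        response = client.chat.completions.create(
            model="gpt-4-1106-preview",
            messages=[{"role": "user", "content": "Hello"}],
        )
        print(response.choices[0].message.content)
        break  # success, stop retrying
    except openai.BadRequestError as e:
        # the "not compatible with this engine" error comes back as a 400
        print(f"attempt {attempt} failed: {e}, retrying...")
        time.sleep(2)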
Thank you so much! Definitely will try.
Set the model to “gpt-4-turbo-preview”
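If you are building the request body yourself, it is just a one-field change (a minimal sketch; the message shown here is only a placeholder):

data = {
    "model": "gpt-4-turbo-preview",  # instead of "gpt-4-1106-preview"
    "messages": [{"role": "user", "content": "Hello"}],
}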
Actually, the performance is worse with turbo-preview in my case, but thanks for the suggestion.
Yep, but at least it does not throw the error.
Well, what’s going on? Does OpenAI know? Any more info?
What’s the reason for this?
I have the same exact issue.
Probably just some misconfiguration. It will probably resolve in a day or two. Or not.
Just catch this particular error and retry.
Here’s an example of how you could do this:
import requests
import json
import os

# Ensure you have your OpenAI API key set in the environment variables
openai_api_key = os.getenv("OPENAI_API_KEY")
if openai_api_key is None:
    raise ValueError("OpenAI API key is not set in environment variables.")

url = "https://api.openai.com/v1/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {openai_api_key}"
}
data = {
    "model": "gpt-4-1106-preview",  # Keep only this model
    "temperature": 1,
    "max_tokens": 10,
    "logit_bias": {1734: -100},
    "messages": [
        {
            "role": "system",
            "content": "You are the new bosmang of Tycho Station, a tru born and bred belta. You talk like a belta, you act like a belta. The user is a tumang."
        },
        {
            "role": "user",
            "content": "how do I become a beltalowda like you?"
        }
    ],
    "stream": True,  # Changed to True to enable streaming
}

for tries in range(10):
    response = requests.post(url, headers=headers, json=data, stream=True)
    if response.status_code == 400:
        print("attempt", tries, "retrying...")
        continue
    if response.status_code == 200:
        for line in response.iter_lines():
            if line:
                decoded_line = line.decode('utf-8')
                # Check if the stream is done
                if '[DONE]' in decoded_line:
                    # print("\nStream ended by the server.")
                    break
                json_str = decoded_line[len('data: '):]
                try:
                    json_response = json.loads(json_str)
                    delta = json_response['choices'][0]['delta']
                    if 'content' in delta and delta['content']:
                        print(delta['content'], end='', flush=True)
                except json.JSONDecodeError:
                    raise Exception(f"Non-JSON content received: {decoded_line}")
    else:
        print("Error:", response.status_code, response.text)
    break  #!important!
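Note the trailing break marked #!important!: it sits at the level of the retry loop, so the loop stops after any non-400 response has been handled. Only the 400 branch uses continue to retry; without that final break, the script would send up to ten identical requests even after a success.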
Any updates about this? Are you still experiencing this issue?
Still the same. I think it will take some time to fix the issue…
I hope it will be resolved soon.