Can you please incorporate the new API SDKs into the models? I am tired of Chat Completions and temperature being added incorrectly to easy scripts; I have to manually fix even the simplest script where GPT writes code to talk to itself.
If I ask GPT to write an API call, it should know how to write one to OpenAI; this is basic stuff.
I have to keep uploading the documentation as a Google Drive file and trying 5 different times, or doing it myself, for simple scripts.
You passed a Chat Completions–only parameter to the Responses API.

- The run errors show `TypeError: Responses.create() got an unexpected keyword argument 'response_format'`. This is why most files failed.
- In the SDK doc you attached, Chat Completions is documented and requires `messages` with `model`, not `input`, and is where `response_format` belongs.
- You used the Responses endpoint with `input` and also sent `response_format`. That mix is invalid for your installed SDK. Use either:
  - Responses API: `client.responses.create(model=..., input=[...])` and drop `response_format`. Read text via `resp.output_text` or the `output` array.
  - Chat Completions: `client.chat.completions.create(model=..., messages=[...], response_format=...)` and read via `resp.choices[0].message.content`. The doc you shared shows the Chat Completions contract.
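For reference, a minimal side-by-side sketch of the two correct call shapes, assuming the current `openai` Python SDK; the model name and prompt are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Responses API: pass `input`, no `response_format`
resp = client.responses.create(
    model="gpt-4o",  # placeholder model name
    input=[{"role": "user", "content": "Say hello."}],
)
print(resp.output_text)  # convenience accessor for the generated text

# Chat Completions API: pass `messages`; this is where `response_format` belongs
chat = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(chat.choices[0].message.content)
```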
Other mismatches to the SDK contract:

- If you switch to Chat Completions, use `max_tokens`, not `max_output_tokens` (that's Responses).
- If you stay on Responses, keep `input`, not `messages`, and omit `response_format`. The error proves your SDK build rejects it.
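A sketch of those parameter differences, again assuming the current `openai` Python SDK; model name, prompt, and token limit are placeholders:

```python
from openai import OpenAI

client = OpenAI()

# Chat Completions: token limit is `max_tokens`; JSON mode via `response_format`
chat = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Reply with a JSON object containing a greeting."}],
    response_format={"type": "json_object"},  # the prompt must mention JSON for this mode
    max_tokens=200,
)

# Responses: token limit is `max_output_tokens`; input goes in `input`, no `response_format`
resp = client.responses.create(
    model="gpt-4o",
    input=[{"role": "user", "content": "Say hello."}],
    max_output_tokens=200,
)
```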
Net: pick one API and match its parameters. Your current mix is the root cause.