Top_p problem when running gpt-5.2 API

Hey,

I’m having a problem running a prompt that is saved under “Chats” while my script runs in PowerShell. After switching from gpt-5.1 to gpt-5.2, I also changed reasoning from high to xhigh. Now I keep getting this message, both in PowerShell and in the dashboard chat where my prompts are saved:

Unsupported parameter: ‘top_p’ is not supported with this model.

But there is nothing referencing top_p anywhere in my prompt or script. When I change reasoning to none, it works, but without reasoning my prompt is too weak to be usable.
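For reference, here is a minimal sketch of the kind of call I’m making (the prompt ID is a placeholder, not my real one, and I’m not sure the exact parameter shapes match every SDK version):

```python
from openai import OpenAI

client = OpenAI()

# Minimal sketch of the failing call: a stored prompt preset run against
# gpt-5.2 with reasoning effort xhigh. The prompt ID is a placeholder.
resp = client.responses.create(
    model="gpt-5.2",
    prompt={"id": "pmpt_XXXXXXXX", "version": "1"},  # stored prompt preset
    reasoning={"effort": "xhigh"},  # the call works again if set to "none"
)
print(resp.output_text)
```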

So please help me.


Hi there!

This is a known issue and has been reported!


Okay, thanks. I thought there was something wrong with my prompt; I even troubleshot it with gpt-5.2 extended thinking, uploading all my documents, and it found no problem. I was going crazy. haha

It also occurs when using agent.prompt; if agent.instructions is used instead, it works normally.
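Roughly what I mean, as a simplified sketch (placeholder prompt ID; exact parameter names may differ by Agents SDK version):

```python
from agents import Agent, Runner

# Fails: referencing the stored prompt preset triggers the top_p error.
broken = Agent(
    name="assistant",
    model="gpt-5.2",
    prompt={"id": "pmpt_XXXXXXXX", "version": "1"},  # placeholder ID
)

# Works: passing the same text inline as instructions avoids the preset.
working = Agent(
    name="assistant",
    model="gpt-5.2",
    instructions="...same prompt text pasted inline...",
)

result = Runner.run_sync(working, "hello")
print(result.final_output)
```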


For a workaround, try adding:

temperature=1
top_p=1

There is no workaround needed anymore, and what you suggest is bad advice: reasoning models do not accept sampling parameters such as temperature or top_p.

The issue was that the “Prompts” system (a confusingly named feature that stores a preset for the Responses API to use) was sending a hard-coded value of 1 for top_p, which was rejected because the internal default for gpt-5.2 is actually 0.98. This was a backend issue.

I can confirm the issue has not been resolved: the value is still sent even if you remove it from the config file. I hope we won’t need a workaround for much longer, but for now it’s still an issue, and the only way I was able to use it is by sending top_p=1.
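In other words, something like this until the backend fix actually lands (sketch only, with a placeholder prompt ID; explicit sampling values on a reasoning model may be rejected again once the fix ships):

```python
from openai import OpenAI

client = OpenAI()

# Workaround sketch: explicitly send top_p=1 (and temperature=1) so the
# request matches the value the backend forces anyway.
resp = client.responses.create(
    model="gpt-5.2",
    prompt={"id": "pmpt_XXXXXXXX", "version": "1"},  # placeholder ID
    reasoning={"effort": "xhigh"},
    top_p=1,
    temperature=1,
)
print(resp.output_text)
```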