DeepResearch API won't work in Python but works fine in the playground

I'm using exactly the same code as in the playground, but it won't work in Python. I'm pretty sure my openai package is up to date. Does anyone know why?

from openai import OpenAI
import time
from dotenv import load_dotenv
import os

load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

response = client.responses.create(
    model="o4-mini-deep-research",
    input=[
        {
            "role": "user",
            "content": [
                {
                    "type": "input_text",
                    "text": "Please help me make a travel guide for Tokyo, which will last for 8 days and 7 nights."
                }
            ]
        }
    ],
    service_tier="flex",
    text={},
    reasoning={
        "summary": "auto"
    },
    tools=[
        {
            "type": "web_search_preview",
            "user_location": {
                "type": "approximate"
            },
            "search_context_size": "medium"
        }
    ],
    store=True,
    background=True,
)

while response.status in {"queued", "in_progress"}:
    print(f"Current status: {response.status}")
    time.sleep(2)
    response = client.responses.retrieve(response.id)

print(f"Final status: {response.status}\nOutput:\n{response.output_text}")
print(response)
————————————————

Here's the output:

Current status: queued
Final status: failed
Output:

Response(id='resp_685debc3712081988c11b28131eb1c4f02e5d62ebb6cb07e', created_at=1750985667.0, error=ResponseError(code='unknown', message='There was an issue with your request. Please check your inputs and try again'), incomplete_details=None, instructions=None, metadata={}, model='o4-mini-deep-research-2025-06-26', object='response', output=, parallel_tool_calls=True, temperature=1.0, tool_choice='auto', tools=[WebSearchTool(type='web_search_preview', search_context_size='medium', user_location=None)], top_p=1.0, background=True, max_output_tokens=None, max_tool_calls=225, previous_response_id=None, prompt=None, reasoning=Reasoning(effort='medium', generate_summary=None, summary='detailed'), service_tier='flex', status='failed', text=ResponseTextConfig(format=ResponseFormatText(type='text')), top_logprobs=0, truncation='disabled', usage=None, user=None, store=True)

OK, I see what happened: service_tier="flex" is still not supported for this model. Removing that parameter (so the request falls back to the default tier) avoids the error.
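For anyone hitting the same error, here's a minimal sketch of the fix, assuming the rest of the request is valid: build the same request with service_tier left out. The kwargs dict mirrors the failing call above; the actual client.responses.create call is commented out so the snippet can be read (and run) without an API key.

```python
# Same request as above, with service_tier removed so the API uses
# the default tier. Pass these kwargs to client.responses.create(**request_kwargs).
request_kwargs = {
    "model": "o4-mini-deep-research",
    "input": [
        {
            "role": "user",
            "content": [
                {
                    "type": "input_text",
                    "text": "Please help me make a travel guide for Tokyo, "
                            "which will last for 8 days and 7 nights.",
                }
            ],
        }
    ],
    # "service_tier": "flex",  # removed: not supported for this model
    "reasoning": {"summary": "auto"},
    "tools": [
        {
            "type": "web_search_preview",
            "user_location": {"type": "approximate"},
            "search_context_size": "medium",
        }
    ],
    "store": True,
    "background": True,
}

# response = client.responses.create(**request_kwargs)
print("service_tier" in request_kwargs)  # False
```

With background=True you still poll with client.responses.retrieve as in the original loop; the request should now leave the "queued"/"in_progress" states without status="failed".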