Change baseURL on responses API?

Hey community,

I'm using the Responses API to build an agent, and I'm wondering whether it supports other models like Gemini.

I got a 404 with the code below:

import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GEMINI_API_KEY"],
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/"
)

response = client.responses.create(
    model="gemini-2.5-flash",
    input=[
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": "Explain to me how AI works"
        }
    ]
)

print(response.output)

Error message:

Traceback (most recent call last):
  File "/Users/zhangligao/jujube-backend/agent/openai_agent.py", line 411, in <module>
    response = client.responses.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangligao/miniconda3/envs/jujube-backend/lib/python3.12/site-packages/openai/resources/responses/responses.py", line 828, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/zhangligao/miniconda3/envs/jujube-backend/lib/python3.12/site-packages/openai/_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangligao/miniconda3/envs/jujube-backend/lib/python3.12/site-packages/openai/_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404

Hello.

Gemini has limited compatibility with the OpenAI SDK; you can find more details in their documentation here.

As a general rule, other providers only support Chat Completions compatibility, not the Responses API.

Another way to use other models would be through LiteLLM with the Agents SDK.
