Introducing the Responses API

My question is: will we be able to use other models, like OpenRouter LLMs, just by changing the base URL? The Chat Completions SDK is flexible with this. It's probably not up to OpenAI, though, but to the other LLM services, if they want to support a different SDK. Either way, it's the main reason for sticking with chat.completions, for me anyway.
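To make the point concrete, here is a minimal sketch of why the base-URL swap works today: the Chat Completions wire format is just HTTP + JSON, so any OpenAI-compatible provider can serve it at a different host. This uses only the standard library; the OpenRouter base URL and the model id are assumptions taken from their public docs and may change.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible base URL (OpenRouter's, per their docs).
BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(base_url, api_key, model, messages):
    """Build a Chat Completions request against any OpenAI-compatible base URL.

    Only the host changes between providers; the path and payload
    shape stay the same, which is what makes the swap possible.
    """
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request(
    BASE_URL,
    os.environ.get("OPENROUTER_API_KEY", "sk-placeholder"),
    "meta-llama/llama-3-8b-instruct",  # example model id; check the provider's list
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)  # .../chat/completions on the swapped-in host
```

The official SDK does the same thing under the hood: `OpenAI(base_url=..., api_key=...)` just changes where that `/chat/completions` request is sent. Whether the Responses API ever gets the same treatment depends on third-party providers implementing its endpoint, not on the SDK.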