Handoff Agents Using OpenRouter

I am using OpenRouter with the OpenAI Agents SDK. I am using handoff agents, but I want the handoff agent that writes code to use a different model. I am passing an OpenAIChatCompletionsModel as the model when defining the agent. I just want to confirm that this will work, since the main agent uses a different model. How can I check on the terminal which model my handoff agent is actually using?

Reference definition of my handoff agent

from openai import AsyncOpenAI
from agents import Agent, OpenAIChatCompletionsModel

# OpenRouter-compatible client; self.model is the model this handoff agent should use
self.client = AsyncOpenAI(base_url=self.base_url, api_key=self.api_key, default_headers=extra_headers)
self.client_model = OpenAIChatCompletionsModel(model=self.model, openai_client=self.client)

handoff_agent = Agent(name="Handoff Agent", instructions=system_prompt, model=self.client_model, tools=[tool1, tool2, tool3])


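To make the question concrete, here is a rough sketch of the overall setup I have in mind (the model names, environment variable, and main agent below are placeholders, not my actual code):

import os
from openai import AsyncOpenAI
from agents import Agent, OpenAIChatCompletionsModel

# One OpenRouter-backed client, shared by both agents (placeholder env var).
client = AsyncOpenAI(base_url="https://openrouter.ai/api/v1", api_key=os.environ["OPENROUTER_API_KEY"])

# Each agent gets its own OpenAIChatCompletionsModel, so they can point at different models.
coding_model = OpenAIChatCompletionsModel(model="qwen/qwen-2.5-coder-32b-instruct", openai_client=client)
main_model = OpenAIChatCompletionsModel(model="openai/gpt-4o-mini", openai_client=client)

handoff_agent = Agent(name="Handoff Agent", instructions="Write the code.", model=coding_model)
main_agent = Agent(name="Main Agent", instructions="Triage requests and hand coding tasks off.", model=main_model, handoffs=[handoff_agent])

# Quick terminal check; assumes the wrapper keeps the model name in a .model attribute.
print(getattr(handoff_agent.model, "model", handoff_agent.model))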

The best way is usually to look at the logs in the traces dashboard.

You can check the docs for configuration details, including a snippet on how to set an OpenAI key just for traces while still using another provider, or how to choose a different trace provider like LangSmith.
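For a quick terminal-level check, the SDK also has a verbose logging switch and a tracing-key override; a minimal sketch based on those configuration options (the env var name is a placeholder):

import os
from agents import enable_verbose_stdout_logging, set_tracing_export_api_key

# Print the SDK's debug logs to the terminal; the per-request output should show
# which model each agent's LLM call is going to.
enable_verbose_stdout_logging()

# Optionally send traces to the OpenAI traces dashboard with a separate OpenAI key,
# even though the agents themselves run against OpenRouter (placeholder env var).
set_tracing_export_api_key(os.environ["OPENAI_API_KEY_FOR_TRACING"])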
