[Feature-Request] Lower GPT-5 (full) API web-search pricing by integrating simpler models (probably ChatGPT-like)

Current GPT-5 web search is expensive

Using the web-search tool costs “$10.00 / 1k calls + search content tokens billed at model rates” (source).

Using GPT-5 (full) via the API means that all content read by the tool is billed at GPT-5 token rates. A single request can cost 70 cents.

Solution: Use a cheaper model to read search results

Rough idea: If the search results were first fed to a cheaper model, which then passes only the relevant (possibly abstractive) parts on to the more expensive model, the cost would drop massively (the interaction could be implemented in a more complex way than outlined here). This would make the web-search tool much more attractive for GPT-5.
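To make the potential saving concrete, here is a back-of-the-envelope sketch. All prices and token counts below are illustrative assumptions, not official OpenAI pricing:

```python
# Rough cost comparison for the proposed two-stage pipeline.
# Prices and token counts are ASSUMED for illustration only.

SEARCH_TOKENS = 60_000         # tokens of fetched page content per request (assumed)
SUMMARY_TOKENS = 2_000         # tokens the cheap reader passes on (assumed)

PRICE_MAIN = 10.00 / 1_000_000   # assumed $/input token, expensive main model
PRICE_CHEAP = 0.40 / 1_000_000   # assumed $/input token, cheap reader model

def cost_current() -> float:
    """Today: all search content is read directly by the expensive model."""
    return SEARCH_TOKENS * PRICE_MAIN

def cost_proposed() -> float:
    """Proposed: cheap model reads everything, main model only sees the summary."""
    return SEARCH_TOKENS * PRICE_CHEAP + SUMMARY_TOKENS * PRICE_MAIN

print(f"current:  ${cost_current():.3f}")   # reading cost with one model
print(f"proposed: ${cost_proposed():.3f}")  # reading cost with the split
```

Under these assumed numbers the reading cost drops by roughly an order of magnitude; the exact ratio obviously depends on real prices and on how much the reader compresses.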

This is probably similar to how the ChatGPT application handles web searches.

The simpler model could be a specific internal model, or chosen manually: e.g., one would pick the main writing model (GPT-5) and a cheaper reader model (e.g. GPT-5 mini, GPT-5 nano, 4.1 nano, 4o-mini). This could be offered as an alternative to the current default behavior (using the same model for everything).
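The writer/reader pairing above could be orchestrated roughly like this. `call_model` is a hypothetical stand-in for a real chat-completions call, and the model names are only examples from the list above:

```python
# Sketch of the proposed two-model interaction (not an existing API).

def call_model(model: str, prompt: str) -> str:
    # Hypothetical stub: a real implementation would call the
    # chat-completions API here; we just echo for illustration.
    return f"[{model}] {prompt[:40]}..."

def answer_with_search(query: str, pages: list[str],
                       writer: str = "gpt-5",
                       reader: str = "gpt-5-mini") -> str:
    # Stage 1: the cheap reader condenses every fetched page,
    # keeping only what is relevant to the query.
    notes = [call_model(reader, f"Extract parts relevant to {query!r}:\n{p}")
             for p in pages]
    # Stage 2: only the condensed notes reach the expensive writer,
    # so the bulk of the search content is never billed at writer rates.
    return call_model(writer, f"Answer {query!r} using:\n" + "\n".join(notes))

print(answer_with_search("gpt-5 pricing", ["page one ...", "page two ..."]))
```

Exposing `writer` and `reader` as separate parameters is what the manual-choice variant would look like; the "default mode" would simply set both to the same model.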