Wouldn’t it also depend on the length of the page being read? I was working on a web scraping tool (before OpenAI’s web search existed), and I would sometimes come across pages that were 40,000 tokens long. That is 40,000 input tokens that would also need to be charged, right?
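For what it’s worth, here is a rough sketch of how I was sizing pages up, assuming you have requests, beautifulsoup4, and tiktoken installed and use the cl100k_base encoding; the URL is just a placeholder:

```python
import requests
import tiktoken
from bs4 import BeautifulSoup

def estimate_page_tokens(url: str, encoding_name: str = "cl100k_base") -> int:
    # Fetch the page and strip the HTML down to visible text.
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
    # Count how many tokens that text would consume as model input.
    enc = tiktoken.get_encoding(encoding_name)
    return len(enc.encode(text))

# Placeholder URL for illustration only.
tokens = estimate_page_tokens("https://example.com/some-long-article")
print(f"~{tokens} input tokens if the whole page went into the prompt")
```

Long articles and documentation pages routinely came out in the tens of thousands of tokens this way, which is why I’m wondering how that part gets billed.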