If I upload huge Excel files and many websites, can I get a detailed and comprehensive data analysis?

For example, I upload a huge Excel sheet and many websites. I want to compare brands, sales rates and current prices on a product basis. Can GPT-4 browse and scan the data for me and compare the data I want? Can GPT-4 give me the most suitable strategy suggestions? How accurate will the data I get be?

With the context window limited to 8K tokens, you won't be able to share very large files.
You would be better off narrowing it down to certain brands or a specific product you want to compare.

So, can I have data analysis done if I upload small Excel files and a website?

Yes, if you provide clear instructions and relevant data, it should be able to do something for you. That said, it is still not that good with numbers and calculations, so it would be prudent to double-check every output it gives you.

Yes, it is the same scenario as when working with large amounts of text data. You can chunk the data by splitting it into sizes the model can process, and then you need to make sure you correctly carry the intermediate results forward.
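A minimal sketch of that chunking idea, assuming a rough ~4 characters-per-token estimate (a heuristic, not an exact count; a tokenizer library would be more precise) and with `ask_model` standing in for whatever model call you actually make:

```python
# Split a large text into chunks small enough for the model's context window,
# then process them sequentially, carrying an intermediate summary forward.

MAX_TOKENS = 6000          # leave headroom inside an 8K context window
CHARS_PER_TOKEN = 4        # rough heuristic for English text (an assumption)

def chunk_text(text, max_tokens=MAX_TOKENS):
    """Split text into pieces of roughly max_tokens tokens each."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def analyze_in_chunks(text, ask_model):
    """Process chunks in order, feeding each prior result forward.

    ask_model is a placeholder for your actual model request: it receives
    the running summary plus the next chunk and returns an updated summary.
    """
    summary = ""
    for chunk in chunk_text(text):
        summary = ask_model(summary, chunk)
    return summary
```

The key point is the loop: each request sees only one chunk plus the carried-forward summary, so no single request exceeds the context limit.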
With spreadsheet data I would suggest following a more classic approach and pre-processing the data before asking the model anything about it.
For example: “this is the spreadsheet data for product A and here is the content from another website about the same product”.
The advantage of this approach is a potential cost and time reduction when running model requests, because you can control the input (and maybe even output) tokens more effectively.
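To illustrate the pre-processing step, here is a small sketch assuming you have exported the sheet to CSV (a library like pandas or openpyxl could read the Excel file directly). The sample data, column names, and website snippet are all made up for illustration:

```python
import csv
import io

# Hypothetical sample export of a spreadsheet; in practice this would come
# from your own Excel file.
SHEET = """product,brand,sales_rate,current_price
Widget A,BrandX,0.42,19.99
Widget A,BrandY,0.35,17.49
Widget B,BrandX,0.18,24.00
"""

def rows_for_product(csv_text, product):
    """Pre-process the spreadsheet: keep only the rows for one product."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["product"] == product]

def build_prompt(product, rows, website_text):
    """Assemble a compact prompt containing only the relevant data."""
    lines = [
        f"{r['brand']}: sales rate {r['sales_rate']}, price {r['current_price']}"
        for r in rows
    ]
    return (
        f"This is the spreadsheet data for {product}:\n"
        + "\n".join(lines)
        + "\n\nHere is content from another website about the same product:\n"
        + website_text
    )
```

Because only the filtered rows and the relevant website excerpt go into the prompt, you stay well under the context limit and keep the per-request token count predictable.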
