How can I summarize tons of articles and then run sentiment analysis on them?

• (b) Use ChatGPT to create a verbal summary of the five most important news stories. Hint: You will have to tweak the prompt in order to convey the meaning of “most important”

I was assigned this project where I need to download tons of financial news/articles from a financial site; I chose Aylien.
Then I have to ask ChatGPT to create a verbal summary of the 5 most important stories. The challenge here is telling ChatGPT what counts as "important".

Then I have to do the sentiment analysis, but my concern is the part I’ve described here.

So suppose I have 20k articles on financial topics, and ChatGPT has to summarize them through Python code I have to write. The important topics could be interest rates, inflation, AI, and changes of financial positions (investment banking…)

Can you give me some input? I’ll remember you when I become a programming boss (possible only if you help me)

Jokes aside, what are your ideas about it?

Without doing your homework for you:

  • You’ll likely not want to write summaries for 20,000 articles that will never be used. Instead, the focus should be on scoring individual articles against criteria that you write.

  • You’ll likely not want to trust a single score given by an AI model, as scores can be quite random even with non-random sampling parameters. Logprob returns, engineered correctly, can give multiple weighted probabilities over potential scores (see the first sketch after this list).

  • Important to whom? An embeddings database is an inexpensive way to find similarities, potentially letting you discard whole categories, and it allows many repeated types of searches without incurring more AI costs (see the second sketch after this list).

  • Then 20k down to only five? Better to have an intermediate step, perhaps where the AI can pick the best of the best from the articles’ summaries.
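
To make the logprobs point concrete, here’s a minimal sketch, assuming the current openai Python package and a 1–5 importance scale; the model name, prompt, and scale are placeholders, not a prescribed setup:

```python
import math
from openai import OpenAI  # official openai package, v1+

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def weighted_score(article_text: str, model: str = "gpt-3.5-turbo") -> float:
    """Ask for a 1-5 rating, then compute a probability-weighted average
    over the top candidate tokens instead of trusting one sampled digit."""
    resp = client.chat.completions.create(
        model=model,
        max_tokens=1,
        logprobs=True,
        top_logprobs=5,  # log-probabilities of the 5 likeliest first tokens
        messages=[
            {"role": "system",
             "content": "Rate the importance of this financial article for "
                        "a markets analyst. Answer with a single digit 1-5."},
            {"role": "user", "content": article_text[:4000]},
        ],
    )
    candidates = resp.choices[0].logprobs.content[0].top_logprobs
    total, weight = 0.0, 0.0
    for c in candidates:
        if c.token.strip() in {"1", "2", "3", "4", "5"}:
            p = math.exp(c.logprob)  # convert log-probability to probability
            total += int(c.token) * p
            weight += p
    return total / weight if weight else 0.0
```

A weighted 3.7 from probabilities of "3" and "4" is far more stable across runs than whichever single digit happens to be sampled.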
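
And a minimal sketch of the embeddings idea, again assuming the openai package plus numpy; the corpus, query, and model name are placeholders:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

# Placeholder corpus; in practice this is your 20k downloaded articles.
articles = [
    "Fed holds rates steady as inflation cools.",
    "Investment bank shuffles equity positions ahead of earnings.",
    "New AI chip startup raises a record funding round.",
]

def embed(texts):
    """Embed a batch of texts; for 20k articles, call in chunks and persist
    the vectors so later searches never re-incur AI costs."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

article_vecs = embed(articles)
query_vec = embed(["central bank interest rate decisions and inflation"])[0]

# Cosine similarity: dot product divided by the product of vector norms.
sims = article_vecs @ query_vec / (
    np.linalg.norm(article_vecs, axis=1) * np.linalg.norm(query_vec)
)
top = np.argsort(sims)[::-1][:2]  # indices of the closest articles
print([articles[i] for i in top])
```

Once the vectors are stored, every new "important to whom?" query is just a dot product against the saved matrix.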

Good luck - fortunately nobody is going to read 20k articles to tell you that your AI got it wrong!


Thanks for the reply.
Do you have any YouTube videos or documentation I can check? I’m not very experienced, so these are precious tips, but I’m not sure I understood them. Sorry

I don’t think it’s worth using ChatGPT until you have the 5 articles figured out. You only have so much in tokens…

You might be able to use Selenium in Python with a webdriver to search and get your article URLs (I use it to fetch articles to summarize in ChatGPT: I grab the JavaScript-rendered pages with Selenium and then clean them up with BeautifulSoup), and then go from there. Your prompt could drive the search.
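
Not my exact code, but a minimal sketch of that Selenium + BeautifulSoup flow; the URL and the paragraph selector are placeholders you’d adapt per site:

```python
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Headless Chrome so no browser window pops up (Selenium 4 manages the driver).
opts = Options()
opts.add_argument("--headless=new")
driver = webdriver.Chrome(options=opts)

driver.get("https://example.com/markets-news")  # placeholder URL
html = driver.page_source  # the fully rendered page, scripts included
driver.quit()

# BeautifulSoup strips the markup; the selector is site-specific.
soup = BeautifulSoup(html, "html.parser")
article_text = " ".join(p.get_text(strip=True) for p in soup.find_all("p"))
print(article_text[:500])
```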

Just use RagAPI to summarize the PDF and then ask GPT to get the sentiment from the summary.

I don’t have a PDF. I have to download all the articles (15-20k) through their API and then tell ChatGPT what are, for me, the "most important" criteria, so it can summarize the 5 resulting articles. After that I have to do a sentiment analysis of those 5.

So you want ChatGPT to wade through 20,000 articles, find the best 5 according to your search values, and then summarize them?

Maybe make a recursive type of search where you open Assistants within Assistants using the API. I do something like this when I scrape a webpage: I call the Assistant to summarize with my original prompt inside the function, so the GUI window I use looks clean. It then dumps the result back out of the function.

So your function would track your tokens, start searching your articles, and keep track of the ones that fit your search criteria, then call itself when the token limit came close so it can start again. Store the results in a global variable or something.
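
A rough sketch of that batching idea; score_batch is a hypothetical stand-in for whatever API call does the scoring, and the 4-characters-per-token estimate is only a rule of thumb:

```python
TOKEN_LIMIT = 3000  # leave headroom under your model's context window

def est_tokens(text: str) -> int:
    return len(text) // 4  # rough estimate: ~4 characters per token

def score_batch(batch):
    """Stand-in for the API call that scores a batch against your criteria."""
    return [(art, 0.0) for art in batch]  # placeholder scores

results = []  # the "global variable" collecting matches across batches

def process_in_batches(articles):
    batch, used = [], 0
    for art in articles:
        cost = est_tokens(art)
        # Flush the current batch when the next article would break the limit.
        if batch and used + cost > TOKEN_LIMIT:
            results.extend(score_batch(batch))
            batch, used = [], 0
        batch.append(art)
        used += cost
    if batch:  # don't forget the final partial batch
        results.extend(score_batch(batch))
```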

I think there are better ways to do this, like a search engine for your own documents; just don’t use ChatGPT until it’s time for the summary…

With 20k articles, I would fine-tune either Babbage or Davinci to pull out your key data (including summary and sentiment).

So two fine-tuned models, maybe Davinci for summary, and Babbage for sentiment.

Skip ChatGPT and any workarounds; just use the API and fine-tunes.
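
For reference, those legacy Babbage/Davinci fine-tunes train on JSONL prompt/completion pairs. A minimal sketch of the training-file format; the labels, separator, and file name here are just examples:

```python
import json

# "\n\n###\n\n" is a common separator convention for legacy fine-tunes,
# and a leading space in the completion was the recommended practice.
examples = [
    {"prompt": "Fed raises rates by 25bp; markets wobble.\n\n###\n\n",
     "completion": " negative"},
    {"prompt": "Chipmaker beats earnings estimates on AI demand.\n\n###\n\n",
     "completion": " positive"},
]

with open("sentiment_train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```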

Another alternative is prompting a foundation model (multi-shot). But it’s unclear what direction you should go and what quality you really need.
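
A minimal sketch of the multi-shot idea, assuming the openai package; the example headlines, labels, and model name are placeholders:

```python
from openai import OpenAI  # official openai package, v1+

client = OpenAI()

# Two labeled examples teach the format; the third headline is the query.
prompt = """Classify the sentiment of each financial headline.

Headline: Central bank signals further rate hikes ahead.
Sentiment: negative

Headline: Tech giant posts record profit on AI demand.
Sentiment: positive

Headline: Bank merger talks collapse amid regulator pushback.
Sentiment:"""

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder; any instruct/chat model works
    max_tokens=2,
    messages=[{"role": "user", "content": prompt}],
)
print(resp.choices[0].message.content.strip())
```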

You can convert your articles into any form you want. If you don’t know how to do that, then there is no way you can complete the main task.

I’m totally into this issue too. I’m trying to find an AI that can summarize the content of articles for me. I don’t have to read as much as the original poster; in my case it’s approximately 50-60 different articles/academic papers, but doing it all by hand is just too much.
I need this for my university research. Obviously, I won’t use everything I read in my actual work, but I don’t want to miss out on anything big. I still have enough time, and I’ve tried different AIs, ChatGPT in particular, but summarizing articles one by one takes too long. If I don’t find anything, I’ll have to do it all myself or look for help somewhere; in that case I hope these guys will help: https://ca.edubirdie.com/research-paper-writing-services because at the moment I’m so overwhelmed with all these assignments that I have no time for myself. Of course, I can do it myself with the help of ChatGPT, but to do it right I would have to spend a lot of time writing the instructions correctly (and I don’t have much time, and I’m also quite lazy).

Isn’t this what GPTs were designed for? You compress the 60 articles into fewer than 20 files and upload them with instructions to summarize.