Large scale content updates

Hi all!

I was hoping someone could help with a “best practice” type question and tell me whether what I’m doing is overkill. I have an Excel file of articles from my website - roughly 1,000 of them. I want to use GPT to analyse them and suggest improvements (there’s a long list of instructions for this step). Right now I’m doing this via the API: sending one article at a time, getting the analysis back, storing it, and then moving on to the next one.
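Roughly, the per-article loop looks something like this (a minimal sketch assuming the openai Python SDK plus pandas/openpyxl; the model name, prompt, and file names are placeholders, not my real setup):

```python
# Minimal sketch of the per-article analysis loop.
# Assumes: openai Python SDK >= 1.0, pandas + openpyxl installed,
# OPENAI_API_KEY set in the environment. Names below are placeholders.
import pandas as pd
from openai import OpenAI

client = OpenAI()

ANALYSIS_INSTRUCTIONS = "You are an editor. Suggest concrete improvements..."  # real instructions go here

articles = pd.read_excel("articles.xlsx")  # columns assumed: id, title, body
results = []

for _, row in articles.iterrows():
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": ANALYSIS_INSTRUCTIONS},
            {"role": "user", "content": f"Title: {row['title']}\n\n{row['body']}"},
        ],
    )
    results.append({"id": row["id"], "analysis": response.choices[0].message.content})

pd.DataFrame(results).to_csv("analysis.csv", index=False)
```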

Does this approach make sense? Or should I be trying to do it directly in the user interface? (So far that never seems to work well, but perhaps I’m doing something wrong.)

Thanks in advance for any and all advice!

I think this approach makes a lot of sense and is pretty straightforward, as long as you can fit each article into the context window.

I think the magic of this kind of analysis is getting it to the point where the suggestions are actionable rather than generic. This is where the Assistants API would be really useful: you could upload a document with all of your article IDs, URLs, titles, etc., so that when it looks at a single article it can suggest things like adding internal links, or even splitting or merging content. You might also be able to have it query Google to reference competitor pages.
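To make that concrete, here’s a rough sketch of what I mean, assuming the openai Python SDK’s Assistants beta (the exact namespace and file-attachment mechanics have shifted between SDK versions, so treat this as the general shape rather than copy-paste; file names, model, and instructions are placeholders):

```python
# Rough sketch: give an assistant searchable access to the whole-site inventory,
# then run one article at a time through it.
from openai import OpenAI

client = OpenAI()

# 1. Upload the site inventory (ids, urls, titles, summaries) into a vector store
#    so the assistant can search it with the file_search tool.
vector_store = client.beta.vector_stores.create(name="site-inventory")
client.beta.vector_stores.files.upload_and_poll(
    vector_store_id=vector_store.id,
    file=open("site_inventory.json", "rb"),
)

# 2. Create an assistant that consults that inventory while reviewing an article.
assistant = client.beta.assistants.create(
    name="content-auditor",
    model="gpt-4o",  # placeholder
    instructions=(
        "Review the article you are given and suggest concrete improvements, "
        "including internal links to other pages from the site inventory."
    ),
    tools=[{"type": "file_search"}],
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)

# 3. Per article: start a thread, add the article text, run, read the reply.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="<full text of one article>"
)
run = client.beta.threads.runs.create_and_poll(thread_id=thread.id, assistant_id=assistant.id)
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)  # most recent message = the analysis
```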

The other suggestion would be to consider something other than Excel, which is going to be pretty cumbersome for this kind of work. I personally think the best value out there is fibery.io: it has simple built-in AI capabilities, plus you can call any external API through its automation functionality. I believe the free tier would cover all your needs.


Thanks so much for the response, and for confirming that the approach makes sense! The articles aren’t particularly large, so I don’t believe there’s an issue with the context window. I should probably check that, though - or would the API return an error saying the input is too large?

On the Assistants API part… this sounds incredibly interesting. Let me see if I understand the suggestions…

  • I can upload the entire database of content - as JSON, for example, rather than Excel (see the sketch after this list) - so it has the context and can reference all content.
  • I could provide additional actions, like Google API access, so it can look wider than my internal DB.
  • Then, as a second step, I essentially do what I originally suggested and iterate on a per-article basis to analyse each one in detail.
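For the first bullet, something like this would get the Excel export into JSON (a sketch; it assumes pandas with openpyxl installed, and the file and column names are whatever your sheet actually uses):

```python
# Sketch: turn the Excel inventory into a JSON file the assistant can search.
import pandas as pd

articles = pd.read_excel("articles.xlsx")
articles.to_json("site_inventory.json", orient="records", indent=2)
```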

Am I following?

Thanks so much for taking the time to reply! Great suggestions.


Sorry I didn’t reply directly! Message is above!

Can you please teach me how to use the API with GPT-4 while making GPTs? I want to connect to my Google Drive and need ChatGPT to read the data from there. I have the Google Drive API.