Whoo hoo! Yes!
Is it fair to say the bottom line is: 'no, AGI isn't here yet, but if you are willing to work with it, GPT-4 is capable of far more than the naysayers realize'?
Haven't read the whole thing yet (tl;dr), although it does look like a must-read. The key open problem here, IMHO, is how to get past the context-size limitation.
Sure, LLMs are a great aid for solving short little sub-problems, but how do you support a long-term research project? For example: 'research lithium battery chemistry, find an open problem, and write a credible PhD thesis proposal on it' (note proposal, not thesis).
The key issues are how to structure long-term activity, and how to manage the large amounts of highly structured, relationship-rich external data uncovered along the way. I don't think embeddings and/or RAG get you very far. But maybe, again 'if you are willing to work with it', we can get further than many suspect.
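To make that last point concrete, here's the sort of thing I mean by 'relationship-rich external data' that plain embedding lookup doesn't capture well. This is a toy Python sketch of my own, not a description of any real tool: findings stored with typed links between them, and only a small, relevant slice handed to the model at each step. The `ask_llm` call is a placeholder for whatever model API you'd actually use.

```python
# Toy sketch of a structured "research memory" -- purely illustrative,
# all names and structure are my own invention, not a real library.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Finding:
    """One piece of external knowledge uncovered during the project."""
    text: str
    source: str
    relations: List[Tuple[str, str]] = field(default_factory=list)  # (relation, other finding id)

class ResearchMemory:
    """Tiny relationship-rich store: findings plus typed links between them."""
    def __init__(self) -> None:
        self.findings: Dict[str, Finding] = {}

    def add(self, fid: str, finding: Finding) -> None:
        self.findings[fid] = finding

    def link(self, fid: str, relation: str, other_fid: str) -> None:
        self.findings[fid].relations.append((relation, other_fid))

    def neighborhood(self, fid: str) -> List[str]:
        """Return a finding plus its directly linked findings -- a
        context-window-sized slice of the larger project state."""
        out = [self.findings[fid].text]
        for _, other in self.findings[fid].relations:
            out.append(self.findings[other].text)
        return out

def ask_llm(prompt: str) -> str:
    """Placeholder for the actual model call (hypothetical)."""
    return f"[model answer to: {prompt[:60]}...]"

if __name__ == "__main__":
    mem = ResearchMemory()
    mem.add("f1", Finding("Dendrite growth limits cycle life in Li-metal anodes.", "review paper"))
    mem.add("f2", Finding("Solid electrolytes may suppress dendrites but have interface issues.", "survey"))
    mem.link("f1", "possible_mitigation", "f2")

    # One small step of a long-term loop: feed only the relevant slice, not everything.
    context = "\n".join(mem.neighborhood("f1"))
    print(ask_llm(f"Given these findings:\n{context}\nWhat open problem remains?"))
```

Obviously a real system would need far more than this, but the point is that the structure of the project state, not just similarity search over chunks, is what might let you break a long-term task into context-sized steps.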