Is it time for a GPT-3 Training Data Refresh?

That system is “pretrain a new model on a massive amount of data”.

This is an old thread. Check its start date and you'll see it was opened before text-davinci-001 even existed.

Current models are not just pretrained, a process that takes months of computation; they also carry a massive investment in tuning, all built on top of that base model, to turn it into a safe and capable product. So it is no longer as simple as "train another AI and release it for experimentation to a small niche of developers," as it might have been over two years ago.