This week, a research team at Chinese company Huawei quietly detailed what might be the Chinese-language equivalent of GPT-3. Called PanGu-Alpha (stylized PanGu-α), the 750-gigabyte model contains up to 200 billion parameters — 25 million [sic] more than GPT-3 — and was trained on 1.1 terabytes of Chinese-language ebooks, encyclopedias, news, social media, and web pages. [Source]
up to 200 billion parameters — 25 million more than GPT-3 —
Wait, did GPT-3 write this article, or do humans make errors too?
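For the record, GPT-3 is publicly reported at 175 billion parameters, so the gap to 200 billion is 25 *billion*, not 25 million — the quoted figure is off by three orders of magnitude. A quick sanity check:

```python
# Publicly reported parameter counts.
gpt3_params = 175_000_000_000        # GPT-3: 175 billion
pangu_alpha_params = 200_000_000_000  # PanGu-α: up to 200 billion

gap = pangu_alpha_params - gpt3_params
print(f"{gap:,}")  # 25,000,000,000 — 25 billion, not 25 million
```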
In all seriousness, this is an interesting development. I wonder what the CCP thinks about potentially dangerous output. They'll likely not make it available to their citizens…