The team of researchers from China used the Sunway supercomputer to train an AI model called "BaGuaLu," which means "alchemist's pot." They presented their results at the 2022 Principles and Practice of Parallel Programming (PPoPP) conference, a virtual international meeting hosted by the US-based Association for Computing Machinery (ACM).
They trained BaGuaLu with a total of 174 trillion parameters, a figure that, according to the South China Morning Post (SCMP), rivals the number of synapses in the human brain.
In truth, the exact number of synapses in a brain is incredibly difficult to map, but some estimates suggest the human brain contains up to 1,000 trillion of them. That's not to say 174 trillion parameters isn't an incredibly impressive number in the field of artificial intelligence: just last year, Google Brain was celebrated for developing an artificial intelligence language model with 1.6 trillion parameters.
Yeah, I always thought it was just over a quadrillion synapses in our meat-brains…