A supercomputer in China ran a ‘brain-scale’ AI model with 174 trillion parameters

The team of researchers from China used the Sunway supercomputer to train an AI model called ‘bagualu,’ which means “alchemist’s pot.” They presented their results at a virtual meeting of Principles and Practice of Parallel Programming 2022, an international conference hosted by the US-based Association for Computing Machinery (ACM).

They trained bagualu with a total of 174 trillion parameters, which according to SCMP rivals the number of synapses in the brain.

In truth, though the exact number of synapses in a brain is incredibly difficult to map, some estimates suggest the human brain contains up to 1,000 trillion synapses. That’s not to say 174 trillion parameters isn’t an incredibly impressive number when it comes to the field of artificial intelligence — last year, Google Brain was celebrated for developing an artificial intelligence language model with 1.6 trillion parameters.

[Source]
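
For a rough sense of scale, here is a quick back-of-the-envelope comparison of the figures quoted above (the synapse number is just the upper-end estimate mentioned in the article, not a measured value):

```python
# Rough scale comparison of the figures quoted above.
# The synapse count is an upper-end estimate, not a measurement.
BAGUALU_PARAMS = 174e12            # 174 trillion parameters (reported)
GOOGLE_MODEL_PARAMS = 1.6e12       # Google Brain's 1.6-trillion-parameter model
HUMAN_SYNAPSES_ESTIMATE = 1000e12  # up to ~1,000 trillion synapses (estimate)

print(f"bagualu vs. synapse estimate: {BAGUALU_PARAMS / HUMAN_SYNAPSES_ESTIMATE:.1%}")
print(f"bagualu vs. Google's model:   {BAGUALU_PARAMS / GOOGLE_MODEL_PARAMS:.0f}x")
```

In other words, 174 trillion parameters is still only about a sixth of that synapse estimate, but more than 100x the size of last year's headline-grabbing language model.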

Yeah, I always thought it was just over a quadrillion synapses in our meat-brains…

The number of parameters is certainly an impressive feat. Even if the ratio of training data to parameters isn't great at that scale, we can certainly learn from their techniques (a rough sketch of why that ratio matters is below).
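
A minimal sketch using the rough "~20 training tokens per parameter" rule of thumb from later compute-optimal scaling work; the heuristic and the numbers are assumptions for illustration, not anything reported about bagualu:

```python
# Back-of-the-envelope data requirement for a 174-trillion-parameter model,
# assuming a rough compute-optimal heuristic of ~20 training tokens per
# parameter. Both figures are illustrative assumptions, not reported values.
PARAMS = 174e12
TOKENS_PER_PARAM = 20

tokens_needed = PARAMS * TOKENS_PER_PARAM
print(f"Tokens to train 'compute-optimally': {tokens_needed:.1e}")
# ~3.5e15 tokens, far more text than any public corpus, which is why raw
# parameter count alone says little about how well-trained the model is.
```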


Interesting.

Parameters and synapses are not the same thing.

It would be interesting to dig into that comparison and work out what it really means.
