…[SNIP]… LG Group’s AI research and development unit, LG AI Research, will build computing infrastructure to train an AI model with 600 billion machine learning parameters. The new computing system, capable of 95.7 quadrillion calculations per second, will be introduced during the second half of the year and will be followed by an AI model with trillion-unit parameters in the first half of next year. The new AI system from the South Korean conglomerate will have roughly three times as many parameters as GPT-3, currently the world’s largest autoregressive language model, created by OpenAI. [Source]
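For context on the quoted figures, here is a minimal back-of-envelope sketch. The 175-billion-parameter size for GPT-3 is not in the excerpt but is the widely published figure, and treating the "95.7 quadrillion calculations per second" as floating-point operations (i.e. petaFLOPS) is an assumption, not something the article states.

```python
# Back-of-envelope check of the quoted numbers (not from the article itself).
# Assumptions: GPT-3 at 175 billion parameters (its published size), and the
# "calculations per second" figure interpreted as floating-point operations.

lg_params = 600e9          # planned parameter count for the new LG model (per the article)
gpt3_params = 175e9        # GPT-3 parameter count (widely published figure, not in the excerpt)
ops_per_second = 95.7e15   # quoted compute capacity, in operations per second

print(f"Size ratio vs. GPT-3: {lg_params / gpt3_params:.2f}x")       # ~3.43x, rounded to "three times" in the article
print(f"Compute capacity: {ops_per_second / 1e15:.1f} petaFLOPS")    # 95.7 petaFLOPS under the FLOP assumption
```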
Is the hardware cost coming down that fast, or have they made breakthroughs in other areas?
ETA: More on the money…
This is the second investment plan released by LG Group’s AI think tank, which launched in December last year. Back then, the organization announced that it would invest more than 200 billion won ($176 million) in human resources and R&D activities. Monday’s $100 million plan is not part of the previous announcement, a spokesman said, and will be solely dedicated to LG AI Research’s server infrastructure. [Source]