
Facebook Parent Company Meta Is Building An AI Supercomputer

January 25, 2022

Facebook has long bet that artificial intelligence can help it with the difficult task of moderating posts from its billions of users. Now its parent company is taking a step that could move it closer to that elusive goal: building its first supercomputer.
Meta on Monday announced its new AI Research SuperCluster (RSC), a supercomputer meant to be used for AI research projects. The supercomputer, which has been in the works for two years, may eventually help Meta develop far more powerful AI software — potentially useful for tricky tasks such as spotting hate speech in posts.

"With RSC, we can more quickly train [AI] models that use multi-modal signals — for example, to understand the context of a post, including language, images, and tone," Shubho Sengupta, a software engineer at Meta (FB), told CNN Business.

Supercomputers, which feature many interconnected processors clumped into what are known as nodes, have become increasingly popular and powerful in recent years for AI research. The US Department of Energy's Summit, which is the fastest supercomputer in the United States and the second-fastest in the world, has been used to aid investigations into things like unfamiliar proteins. A few large tech companies, such as Microsoft and Nvidia, also have supercomputers for their own use.

In a blog post Monday, coauthored by Sengupta, Meta said that, as of January, its supercomputer includes 6,080 GPUs divided among 760 nodes, potentially making it one of the most powerful in the world. At its current size, Sengupta said, it's on par with the Perlmutter supercomputer at the National Energy Research Scientific Computing Center, which ranks fifth in the world. When completed later this year, however, Meta said it will have 16,000 GPUs, and the company expects it will be capable of nearly five exaflops of computing performance — that's 5 quintillion operations per second.
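
As a quick back-of-the-envelope check of those figures (a sketch in Python using only the numbers quoted above), the GPU and node counts work out to eight GPUs per node, and the planned expansion to 16,000 GPUs is roughly a 2.6x increase, which lines up with the "almost triple" characterization below:

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the article.
gpus_now = 6_080
nodes_now = 760
gpus_planned = 16_000

gpus_per_node = gpus_now / nodes_now      # 8.0 GPUs per node
growth_factor = gpus_planned / gpus_now   # ~2.63x, i.e. "almost triple"

# "Nearly five exaflops" expressed as raw operations per second:
# 1 exaflop = 10**18 floating-point operations per second.
target_ops_per_second = 5 * 10**18        # 5 quintillion ops/sec

print(f"GPUs per node: {gpus_per_node:.0f}")
print(f"Planned growth factor: {growth_factor:.2f}x")
print(f"Target throughput: {target_ops_per_second:.1e} ops/sec")
```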

"When fully built out, the cluster will almost triple in size, which will, to our knowledge, make it the fastest AI supercomputer in the world," Sengupta said.

Meta said its researchers have begun using the supercomputer to train large models related to natural-language processing and computer vision, and that researchers will be able to use the supercomputer to "seamlessly analyze text, images, and video together" and come up with new augmented reality tools. Over time, Meta hopes it will enable the company to "build entirely new AI systems" that can do computationally difficult tasks such as real-time translations for a large group of people who all speak different languages.
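
To make the "multi-modal" idea concrete, here is a minimal, purely illustrative sketch in PyTorch of how a model might fuse text and image features for a single post. The encoders, layer sizes, and label set are hypothetical stand-ins; Meta has not published the architecture of the models trained on RSC.

```python
import torch
import torch.nn as nn

class MultiModalPostClassifier(nn.Module):
    """Illustrative only: fuse text and image features to score a post.

    The dimensions and label set are hypothetical; this is not Meta's
    actual architecture, just a sketch of what multi-modal fusion means.
    """

    def __init__(self, text_dim=768, image_dim=1024, hidden_dim=512, num_labels=2):
        super().__init__()
        # Project each modality (e.g., output of a text transformer and a
        # vision backbone) into a shared hidden size.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        # Simple fusion head: concatenate both modalities and classify.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),
        )

    def forward(self, text_embedding, image_embedding):
        fused = torch.cat(
            [self.text_proj(text_embedding), self.image_proj(image_embedding)],
            dim=-1,
        )
        return self.classifier(fused)

# Toy usage with random tensors standing in for real encoder outputs.
model = MultiModalPostClassifier()
text_emb = torch.randn(4, 768)    # batch of 4 posts, text features
image_emb = torch.randn(4, 1024)  # matching image features
logits = model(text_emb, image_emb)  # shape: (4, 2)
```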

The company said early tests showed the supercomputer could train large language models three times faster than the system it currently uses. That means an AI model that would take nine weeks to train with the existing system could be trained in three weeks with the supercomputer.
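
That speedup claim is a simple ratio; as a sketch, assuming the threefold speedup applies uniformly across the whole run, the arithmetic reproduces the nine-weeks-to-three-weeks figure:

```python
# Assumes the 3x speedup applies end to end; the article does not say
# which parts of the training pipeline are accelerated.
speedup = 3
old_training_weeks = 9
new_training_weeks = old_training_weeks / speedup  # 3.0 weeks
print(f"{new_training_weeks:.0f} weeks instead of {old_training_weeks}")
```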

Meta hopes its new supercomputer will eventually be able to train AI models with trillions of parameters — only a handful of known AI models exist at that scale. Such a model would also be many times larger than GPT-3, a large language model from OpenAI that can generate human-sounding text and is already used in applications such as language-learning tools and tax software for freelancers.
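
For a rough sense of scale (a sketch; GPT-3's 175-billion-parameter count is a widely reported figure rather than something stated here, and the memory estimate assumes 16-bit weights), a trillion-parameter model is several times larger than GPT-3 and needs roughly two terabytes of memory for its weights alone:

```python
# Rough scale comparison. GPT-3's 175 billion parameters is a widely
# reported figure, not stated in the article; bytes_per_param assumes
# 16-bit (2-byte) weights, a common but not universal choice.
gpt3_params = 175e9
trillion_params = 1e12
bytes_per_param = 2

size_ratio = trillion_params / gpt3_params                   # ~5.7x GPT-3
weight_memory_tb = trillion_params * bytes_per_param / 1e12  # ~2 TB

print(f"~{size_ratio:.1f}x GPT-3's parameter count")
print(f"~{weight_memory_tb:.1f} TB just to store the weights")
```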

Eventually, Meta said, the supercomputer will help develop the technologies needed to build a so-called "metaverse" — a wide-ranging, interconnected virtual realm that people can move through as digital avatars and where they can interact with others who are also there virtually. Facebook has positioned the metaverse as the future of the company, but so far only snippets of that vision have been realized.

SOURCE: CNN
IMAGE SOURCE: Pixabay.com