Meta awaits shipments of new Nvidia chips
Key points:
- Meta Platforms, Facebook’s parent company, has struck a deal with Nvidia to supply a new type of artificial intelligence chip.
- The new Nvidia Blackwell B200 chips are expected to begin shipping in late 2024.
- Meta Platforms plans to use the new B200 chips to train its Llama models.
Meta Platforms, Facebook’s parent company, expects to receive the first shipments of a new type of artificial intelligence chip from Nvidia later this year.
Nvidia, a leader in developing GPUs needed to power much of the cutting-edge AI research, unveiled the new B200 “Blackwell” chip at its annual developer conference on Monday.
Features of the new chips
The chipmaker said the B200 delivers 30 times better performance on tasks such as generating chatbot responses. However, it did not provide details on how efficiently the chip handles the large volumes of data needed to train those chatbots.
Nvidia Chief Financial Officer Colette Kress told financial analysts on Tuesday that the company “plans to bring the product to market later this year,” but noted that a significant increase in shipments of the new GPUs is not expected until 2025.
Why did Meta make a deal with Nvidia?
Meta Platforms, the social media giant, is one of Nvidia's largest customers. In the past, the company has purchased hundreds of thousands of previous-generation chips to power its advanced content recommendation systems and generative artificial intelligence products.
In January, Meta CEO Mark Zuckerberg said the company planned to have about 350,000 of the earlier H100 model chips in its inventory by the end of the year. Combined with other GPUs, Meta will have the equivalent of about 600,000 H100s by then.
On Monday, Zuckerberg said Meta plans to use the new Blackwell B200 chips to train its Llama models. The company is currently training the third generation of the model on two GPU clusters it announced last week, each containing about 24,000 H100 GPUs.