
Intel’s new Gaudi 3 accelerators massively undercut Nvidia GPUs as AI race heats up

Intel promises comparable performance for much less money.

In the fierce competition for dominance in AI hardware, Intel has thrown down the gauntlet to Nvidia. CEO Pat Gelsinger revealed pricing for Intel's upcoming Gaudi 2 and Gaudi 3 AI accelerators this week at Computex, and the figures are startling.

Prices for such products are usually kept under wraps, but Intel has broken with convention and released actual numbers. Purchased individually, the flagship Gaudi 3 accelerator will retail for about $15,000, roughly half the price of Nvidia's rival H100 data center GPU.

The less powerful Gaudi 2 offers an even steeper discount. System vendors will be able to purchase an eight-chip Gaudi 2 accelerator kit for $65,000, which Intel says is only one-third the cost of equivalent configurations from Nvidia and other rivals.

The same eight-accelerator kit configuration costs $125,000 for the Gaudi 3. According to Intel, that works out to roughly two-thirds the cost of comparable competing platforms in that high-end performance category.

Gaudi 3 pricing should also be viewed in light of Nvidia's recently announced Blackwell lineup. The Blackwell B100 GPU is expected to sell for about $30,000, while the GB200, a powerful superchip that combines a CPU with Blackwell GPUs, is expected to retail for about $70,000.
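To put those fractions in concrete terms, here is a quick back-of-the-envelope sketch in Python built only from the figures quoted above. The "implied" competitor prices are derived from Intel's own one-third and two-thirds cost claims and from the 50% H100 comparison, not from any published Nvidia price list.

# Back-of-the-envelope comparison using only the figures quoted in this article.
# The "implied" competitor prices follow from Intel's own cost-ratio claims,
# not from any published Nvidia price list.

gaudi3_single = 15_000   # flagship Gaudi 3, bought individually
gaudi2_kit = 65_000      # eight-chip Gaudi 2 kit
gaudi3_kit = 125_000     # eight-chip Gaudi 3 kit

per_chip_gaudi2 = gaudi2_kit / 8          # $8,125 per accelerator
per_chip_gaudi3 = gaudi3_kit / 8          # $15,625 per accelerator
implied_h100 = gaudi3_single * 2          # "half the price" implies roughly $30,000
implied_rival_kit_g2 = gaudi2_kit * 3     # "one-third the cost" implies ~$195,000
implied_rival_kit_g3 = gaudi3_kit * 3 / 2 # "two-thirds the cost" implies ~$187,500

print(f"Gaudi 2 kit, per accelerator: ${per_chip_gaudi2:,.0f}")
print(f"Gaudi 3 kit, per accelerator: ${per_chip_gaudi3:,.0f}")
print(f"Implied H100 price: ${implied_h100:,.0f}")
print(f"Implied rival 8-chip kit (Gaudi 2 claim): ${implied_rival_kit_g2:,.0f}")
print(f"Implied rival 8-chip kit (Gaudi 3 claim): ${implied_rival_kit_g3:,.0f}")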

Pricing is, of course, only part of the equation; performance and software ecosystem matter just as much. On that front, Intel asserts that the Gaudi 3 matches or beats Nvidia's H100 in a number of key AI training and inference workloads.

In benchmarks cited by Intel, large 8,192-chip Gaudi 3 clusters can achieve up to 40% faster training times than the H100. The company says even a smaller 64-chip Gaudi 3 configuration delivers 15% greater throughput than the H100 on the popular LLaMA 2 language model. For AI inference, Intel claims up to a 2x speed advantage over the H100 on models like LLaMA and Mistral.


Though the Gaudi chips use open standards such as Ethernet to ease deployment, they do not support Nvidia's widely used CUDA platform, which most AI software currently targets. Persuading businesses to rewrite their code for Gaudi could prove difficult.

To drive adoption, Intel says it has partnered with at least ten major server vendors, including new Gaudi 3 partners such as Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron, alongside established names like Supermicro, Dell, HPE, and Lenovo.

Nevertheless, Nvidia remains the force to beat in the data center. The company held a 73 percent share of that market in the final quarter of 2023, and its share has been growing steadily at the expense of AMD and Intel. The picture is similar in the consumer GPU market, where Nvidia commands an 88 percent share.

Intel still has a long way to go, but such stark price differences could help it close the gap.
