Samsung missed out on Nvidia’s most expensive AI card, but beats Micron with 36 GB of HBM3E memory. Could this new technology power the B100, the successor to the H200?

Samsung says it has developed the industry’s first 12-stack HBM3E DRAM, the HBM3E 12H, surpassing Micron’s technology and potentially paving the way for the next generation of Nvidia’s AI cards.

The South Korean tech giant’s HBM3E 12H offers bandwidth of up to 1,280 GB/s and an industry-leading capacity of 36 GB, an improvement of more than 50% over the 8-stack HBM3 8H.
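The headline numbers follow from simple stack arithmetic. Here is a minimal back-of-the-envelope sketch, assuming twelve 24 Gb (3 GB) DRAM dies per stack and the standard 1,024-bit HBM interface running at 10 Gbps per pin; Samsung’s announcement states only the totals, so the per-die and per-pin figures are assumptions.

```python
# Back-of-the-envelope check of Samsung's headline figures.
# Assumptions (not stated in the article): 24 Gb (3 GB) per DRAM die
# and the standard 1024-bit HBM interface at 10 Gbps per pin.

DIE_CAPACITY_GB = 24 / 8   # a 24 Gb die holds 3 GB
BUS_WIDTH_BITS = 1024      # standard HBM interface width
PIN_SPEED_GBPS = 10.0      # assumed per-pin data rate

def stack_capacity_gb(layers: int) -> float:
    """Total capacity of an HBM stack with the given number of dies."""
    return layers * DIE_CAPACITY_GB

def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Peak bandwidth in GB/s across a 1024-bit interface."""
    return BUS_WIDTH_BITS * pin_speed_gbps / 8

print(stack_capacity_gb(12))                 # 36.0 GB -> the 12H figure
print(stack_capacity_gb(8))                  # 24.0 GB for an 8-high stack
print(stack_bandwidth_gbs(PIN_SPEED_GBPS))   # 1280.0 GB/s
```

The 36 GB versus 24 GB comparison is where the capacity gain comes from; the bandwidth claim likewise checks out against HBM3’s 6.4 Gbps per pin, which works out to roughly 819 GB/s on the same 1,024-bit bus.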

The 12-layer HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF), allowing the 12-layer stack to meet current HBM package requirements while staying within the same height specification as 8-layer products. These improvements have resulted in a 20% increase in vertical density compared to Samsung’s HBM3 8H product.
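The height constraint makes the thinner film essential, as a rough sketch shows. The absolute stack-height budget below is a placeholder (the article does not give the package specification); only the ratio matters.

```python
# Rough sketch of the TC NCF height budget: fitting 12 layers into the
# same package height previously used for 8. The absolute height is a
# hypothetical placeholder; the 8:12 ratio is what the text implies.

STACK_HEIGHT_UM = 720.0  # assumed fixed height budget, in microns

def layer_pitch_um(height_um: float, layers: int) -> float:
    """Average thickness available per die plus NCF bonding layer."""
    return height_um / layers

pitch_8h = layer_pitch_um(STACK_HEIGHT_UM, 8)
pitch_12h = layer_pitch_um(STACK_HEIGHT_UM, 12)

print(f"8-high pitch:  {pitch_8h:.1f} um per layer")   # 90.0 um
print(f"12-high pitch: {pitch_12h:.1f} um per layer")  # 60.0 um
print(f"Each layer shrinks to {pitch_12h / pitch_8h:.0%} "
      f"of its former thickness")                      # 67%
```

In other words, each die-plus-film layer has to come in at roughly two thirds of its former thickness, which is why Samsung credits the thinner TC NCF for keeping 12-high stacks within the existing height specification.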

The battle heats up

“The industry’s AI service providers increasingly require higher capacity HBM, and our new HBM3E 12H product is designed to meet that need,” said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. “This new memory solution is part of our commitment to developing core technologies for high-stack HBM and providing technology leadership for the high-capacity HBM market.”

Meanwhile, Micron has started mass production of its 24 GB 8H HBM3E, which will be used in Nvidia’s latest H200 Tensor Core GPUs. Micron claims its HBM3E consumes 30% less power than competing products, which it says makes it well suited for generative AI workloads.

Despite missing out on Nvidia’s most expensive AI card, Samsung’s 36 GB HBM3E 12H beats Micron’s 24 GB 8H HBM3E on both capacity and bandwidth. As AI workloads continue to grow, Samsung’s 12H part is a natural candidate for future systems that need more memory per GPU, such as Nvidia’s B100 Blackwell accelerator, expected to hit the market late this year.

Samsung has already begun sampling its 36 GB HBM3E 12H to customers, with mass production expected to start in the first half of this year. Micron will ship its 24 GB 8H HBM3E in the second quarter of 2024. Competition between the two technology giants in the HBM market is expected to intensify as demand for high-capacity memory solutions continues to rise in the AI era.
