Samsung missed out on Nvidia’s most expensive AI card, but it now beats Micron with 36 GB of HBM3E memory. Could this new technology power the B100, the successor to the H200?

Samsung says it has developed the industry’s first 12-stack HBM3E DRAM, the HBM3E 12H, surpassing Micron’s technology and potentially paving the way for the next generation of Nvidia’s AI cards.

The South Korean tech giant’s HBM3E 12H offers bandwidth of up to 1,280 GB/s and an industry-leading capacity of 36 GB, with both figures improving by more than 50% over the 8-stack HBM3 8H.
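The 50% figure checks out against the commonly cited HBM3 8H baseline of 24 GB per stack and roughly 819 GB/s of bandwidth; note that these baseline numbers are assumptions not stated in the article:

```python
# Sanity-checking Samsung's claimed >50% gains for HBM3E 12H.
# Baseline figures for the 8-stack HBM3 8H (24 GB, ~819 GB/s) are
# widely cited values assumed here, not taken from the article.
hbm3e_12h_capacity_gb = 36
hbm3e_12h_bandwidth_gbps = 1280

hbm3_8h_capacity_gb = 24       # assumed baseline capacity
hbm3_8h_bandwidth_gbps = 819   # assumed baseline bandwidth

capacity_gain = (hbm3e_12h_capacity_gb / hbm3_8h_capacity_gb - 1) * 100
bandwidth_gain = (hbm3e_12h_bandwidth_gbps / hbm3_8h_bandwidth_gbps - 1) * 100

print(f"Capacity gain:  {capacity_gain:.0f}%")   # 50%
print(f"Bandwidth gain: {bandwidth_gain:.0f}%")  # ~56%
```

Both deltas land at or above the 50% mark, consistent with Samsung's claim.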