‘Sold out’: Samsung’s archrival has sold off its precious HBM cargo, but who was its biggest customer? Nvidia and AMD can’t get enough high-bandwidth memory chips, but is anyone else buying?
SK Hynix, a key competitor to Samsung, says it has sold out its entire production of high-bandwidth stacked memory DRAMs for 2024. These chips are crucial for AI processors in data centers, but the company remains tight-lipped about its largest customers.
SK Hynix’s recently appointed vice president Kitae Kim, who also serves as head of HBM sales and marketing, confirmed the news in an interview published on the SK Hynix website.
“Proactively securing customers’ purchasing volumes and negotiating more favorable terms for our high-quality products are the foundation of semiconductor sales,” Kim said. “With excellent products in hand, it is a matter of speed. Our planned production volume of HBM this year is already sold out. Although 2024 has only just begun, we have already started preparing for 2025 to stay ahead of the market.”
‘Highly sought after’
EE News Europe points out that the scarcity of HBM3 and HBM3E chips could hinder the growth of both the memory and logic sectors of the semiconductor industry this year.
“HBM is a revolutionary product that challenges the idea that semiconductor memory is only one part of an overall system. SK Hynix’s HBM in particular has excellent competitive strength,” said Kim.
“Our cutting-edge technology is highly sought after by global technology companies,” he added, making us wonder who his company’s biggest customers could be. Nvidia and AMD are known to be voracious consumers of high-bandwidth memory chips, but there are other players in the competitive AI market that may be eager to buy up HBM stock to avoid being left behind.
Interestingly, while SK Hynix cannot produce enough of its current HBM products to meet high demand, its main rivals in this area, Samsung and Micron, are now focusing on HBM3E. Micron has started production of its 24GB 8H HBM3E, which will be used in Nvidia’s latest H200 Tensor Core GPUs. Meanwhile, Samsung has started sampling its 36GB HBM3E 12H to customers, and this could very well be the memory used in Nvidia’s B100 Blackwell AI powerhouse, which is expected to arrive late this year.
However, SK Hynix will not be left behind for long. It is expected to begin production of its own 36GB HBM3E in the first half of this year, after upgrading its Wuxi factory in China.