Samsung’s new HBM3E memory technology hits 1.2TB/s — you can bet that it will power Nvidia’s next AI monster GPU, the GH200

Samsung’s latest memory technology has reached staggering per-pin speeds of 9.8 Gb/s – equivalent to more than 1.2 TB/s of bandwidth per stack – meaning it’s more than 50% faster than its predecessor.
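For context, the headline 1.2 TB/s figure follows directly from the per-pin speed: HBM stacks expose a 1,024-bit interface, so 9.8 Gb/s on every pin across the full bus works out to roughly 1.2 TB/s per stack. Here is a minimal back-of-the-envelope sketch in Python (the 1,024-bit stack width is the standard HBM interface width and is an assumption here, not a figure quoted in the article):

```python
# Rough check of Samsung's quoted figures.
# Assumption (not from the article): an HBM stack uses a 1,024-bit interface.
PINS_PER_STACK = 1024   # standard HBM interface width, assumed
GBPS_PER_PIN = 9.8      # Shinebolt per-pin speed quoted by Samsung

bandwidth_gb_per_s = GBPS_PER_PIN * PINS_PER_STACK / 8  # gigabytes per second
print(f"Per-stack bandwidth: ~{bandwidth_gb_per_s:.0f} GB/s "
      f"(~{bandwidth_gb_per_s / 1000:.1f} TB/s)")
# -> ~1254 GB/s, i.e. roughly the 1.2 TB/s Samsung cites
```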


Shinebolt is Samsung’s implementation of the HBM3E memory standard, the latest in a series of high-performance memory products the company has developed for the era of cloud computing and ever-increasing resource demands.

Shinebolt is the successor to Icebolt, Samsung’s HBM3 memory, which is available in variants of up to 32 GB and reaches speeds of up to 6.4 Gb/s per pin. These chips are designed specifically for use alongside the best GPUs for AI processing and LLMs, and the company is ramping up production this year as the nascent industry gains momentum.
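Those two per-pin figures are also where the “more than 50% faster” claim comes from. A quick comparison, again assuming the same standard 1,024-bit stack interface for both generations (an assumption, not a detail given in the article):

```python
# Generation-over-generation comparison using the per-pin speeds in the article.
ICEBOLT_GBPS = 6.4     # HBM3 (Icebolt) per-pin speed
SHINEBOLT_GBPS = 9.8   # HBM3E (Shinebolt) per-pin speed
PINS = 1024            # assumed standard HBM interface width

speedup = SHINEBOLT_GBPS / ICEBOLT_GBPS - 1
print(f"Shinebolt vs Icebolt: ~{speedup:.0%} faster per pin")            # ~53%
print(f"Icebolt stack bandwidth: ~{ICEBOLT_GBPS * PINS / 8:.0f} GB/s")   # ~819 GB/s
```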

Powering the next generation of AI chips

HBM3E will inevitably find its way into components developed by the likes of Nvidia, and there is every suggestion it could end up in the GH200, nicknamed Grace Hopper, in light of a recent deal between the two companies.

High bandwidth memory (HBM) is much faster and more energy efficient than conventional RAM, using 3D stacking technology in which layers of memory dies are stacked vertically on top of one another.

Samsung’s HBM3E stacks layers higher than previous iterations through the use of non-conductive film (NCF) technology, which eliminates gaps between the layers in the chips. This maximizes thermal conductivity, which in turn allows the chips to achieve much higher speeds and efficiency.

The unit will power the next generation of AI applications, Samsung claims, as it will accelerate AI training and inference in data centers and improve total cost of ownership (TCO).

Even more exciting is the prospect that it will be included in Nvidia’s next-generation AI chip, the H200. The two companies signed an agreement in September under which Samsung would supply the chipmaker with HBM3 memory units, according to the Korea Economic Journal, with Samsung set to provide around 30% of Nvidia’s memory by 2024.

Should this partnership continue, there is every possibility that HBM3E components will become part of this deal once they enter mass production.

