Do you want your server to have access to more than 100,000 DIMM slots at once? This Korean startup claims its CXL 3.1-based technology can help you scale to over 100 PB of RAM, but it will cost almost $5 billion

Have you ever imagined that you could use up to 100 petabytes of RAM? Well, this startup could be the key to unlocking groundbreaking memory capabilities.

Korean fabless startup Panmnesia unveiled what it described as the world’s first AI cluster built on CXL 3.1 switches at the recent OCP Global Summit 2024.

According to Panmnesia, the solution has the potential to significantly improve the cost-effectiveness of AI data centers by leveraging Compute Express Link (CXL) technology.

Scalable – but expensive

In an announcement, the startup revealed that the CXL-compatible AI cluster will be integrated into its flagship products, the CXL 3.1 switch and CXL 3.1 IP, both of which support the connections between the CXL memory nodes and GPU nodes responsible for storing large data sets and accelerating machine learning.

Essentially, this will allow companies to expand memory capacity by adding extra memory through CXL devices, without having to purchase expensive additional server components.

The cluster can also be scaled to the data center level, the company said, reducing overall costs. The solution also supports connectivity between different types of CXL devices and can connect hundreds of devices within one system.

The costs of such an undertaking could be unsustainable

While using 100 PB of RAM may seem like overkill, it’s not exactly out of the question in the age of increasingly demanding AI workloads.

In 2023, Samsung revealed that it planned to use its 32Gb DDR5 DRAM chips to create a DRAM module with as much as 1TB of capacity. The motivation behind this move was to help deal with ever-increasing AI workloads.

While Samsung has yet to release a development update, we do know that the largest RAM units Samsung has previously produced were 512GB in size.

These were first unveiled in 2021 and were intended for use in next-generation servers powered by high-end CPUs (at least by 2021 standards), including AMD EPYC ‘Genoa’ and Intel Xeon Scalable ‘Sapphire Rapids’ processors.

However, this is where cost could be a major inhibiting factor for the Panmnesia cluster. The price of a comparable product, the 512GB Dell 370-AHHL memory module, currently stands at just under $2,400.

By any standards, that would require significant investment from a company. If you were to use Samsung’s top-end 1TB DRAM module, the cost would simply skyrocket, as the expected price last year was around $15,000.
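As a back-of-the-envelope sketch of those figures (assuming 1 PB = 1,024 TB and the list prices quoted above, and counting DRAM only, not the servers, CXL switches, or networking that a real deployment would also need):

```python
# Rough estimate of the raw DRAM cost of a 100 PB memory pool.
# Assumed prices (from the article): 512GB module ~$2,400, 1TB module ~$15,000.
# These are list prices for the memory alone -- total cluster cost would be higher.

TB_PER_PB = 1024
target_tb = 100 * TB_PER_PB  # 100 PB expressed in terabytes

modules_512gb = target_tb * 2        # two 512GB modules per terabyte
cost_512gb = modules_512gb * 2_400

modules_1tb = target_tb              # one 1TB module per terabyte
cost_1tb = modules_1tb * 15_000

print(f"512GB modules: {modules_512gb:,} units, ~${cost_512gb / 1e9:.2f}B")
print(f"1TB modules:   {modules_1tb:,} units, ~${cost_1tb / 1e9:.2f}B")
```

Note that the 1TB-module count of roughly 102,400 lines up with the headline’s figure of more than 100,000 DIMM slots, while the memory bill alone runs into the billions before any other hardware is counted.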
