Compute Express Link (CXL) standardizes how CPUs and GPUs communicate with memory and accelerators, reducing latency and allowing systems to move and process large amounts of data more quickly. This is especially important for workloads that demand fast data processing, such as AI.
At a recent press conference, Jangseok Choi, vice chairman of Samsung’s new business planning team, revealed that the company will continue with its plans to produce and supply CXL memory modules.
“We plan to mass produce 256GB DRAM supporting CXL 2.0 this year. We expect the CXL market to flourish in the second half of the year and grow explosively from 2028 onwards,” Choi told the media.
A decade in the making
Samsung predicts that the adoption of CXL technology will result in an eight to ten times increase in memory capacity per server, which translates into a substantial leap in computing power. CXL, “expands the highway connecting the CPU and memory chips from two to three lanes to more than eight lanes,” a Samsung official explained to The Korean Economic Daily.
Samsung’s CXL 2.0 DRAM, due out in May 2023, will support memory pooling, a memory management technique that links multiple CXL memory blocks on a server platform to form a pool. Hosts can dynamically allocate memory from the pool as needed, resulting in more efficient use of memory capacity and optimized resource allocation.
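To make the pooling idea concrete in the abstract, here is a minimal Python sketch of how hosts might draw capacity from a shared pool of CXL memory modules and return it when done. The class and method names are purely illustrative and do not correspond to any Samsung product or real CXL software interface.

```python
# Illustrative toy model of memory pooling: several CXL memory modules are
# combined into one pool, and hosts allocate and release capacity on demand.
# All names here are hypothetical, not a real CXL or vendor API.

class CXLMemoryPool:
    """Tracks pooled capacity from CXL modules and per-host allocations."""

    def __init__(self, module_sizes_gb):
        self.total_gb = sum(module_sizes_gb)
        self.allocations = {}  # host_id -> GB currently allocated

    def free_gb(self):
        """Capacity not yet handed out to any host."""
        return self.total_gb - sum(self.allocations.values())

    def allocate(self, host_id, size_gb):
        """Grant a host `size_gb` from the pool if enough capacity remains."""
        if size_gb > self.free_gb():
            raise MemoryError(
                f"pool exhausted: requested {size_gb} GB, only {self.free_gb()} GB free"
            )
        self.allocations[host_id] = self.allocations.get(host_id, 0) + size_gb

    def release(self, host_id, size_gb):
        """Return capacity to the pool so other hosts can reuse it."""
        held = self.allocations.get(host_id, 0)
        self.allocations[host_id] = max(0, held - size_gb)


# Example: three 256 GB modules pooled; two hosts allocate and release on demand.
pool = CXLMemoryPool([256, 256, 256])
pool.allocate("host-a", 300)   # a single host can span more than one module
pool.allocate("host-b", 200)
pool.release("host-a", 100)    # freed capacity becomes available pool-wide
print(f"{pool.free_gb()} GB still free in the pool")
```

The point of the sketch is simply that capacity is tracked at the pool level rather than being fixed per server, which is what lets unused memory be reassigned instead of sitting stranded.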
Both Micron and SK Hynix are developing CXL-based memory products, but "as the only memory maker on the board of the CXL Consortium, Samsung is committed to further expanding the CXL ecosystem through partnerships with data center, server and chipset companies across the industry," Choi said.
“Samsung has been working on the development and mass production of high-end CXL for more than a decade,” he added. “We test our products together with our partners for performance verification.”