Could RAM replace SSD? Well, not really, but this plucky US startup may come very close
A US memory startup that has been lurking in the shadows for more than a decade has just emerged with a bang. Neo Semiconductor announced that it wants to produce DRAM chips that are 8x denser than the current crop of 16Gbit parts, using a technology it calls 3D X-DRAM.
As the name hints, the design stacks layers of material, 230 of them in the initial version, to produce a 128Gbit DRAM chip. In comparison, Samsung is aiming to launch a 32Gbit DDR5 DRAM chip in 2023, with 1TB memory modules on the horizon.
According to the company’s co-founder and CEO, Andy Hsu, the first prototypes are expected to land next year, should conversations to license the technology to memory manufacturers (Micron, Samsung Semi, SK Hynix, Kingston Technology) come to fruition.
Like 3D NAND, 3D X-DRAM should enable memory densities to increase exponentially, reaching up to 1Tb before 2024. In comparison, it took the DRAM industry more than a decade to make the far smaller jump from 4Gb to 16Gb memory chips.
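To put those figures side by side, here is a quick back-of-the-envelope check in Python, using only the numbers quoted above (and treating 1Tb as 1,024Gb):

    # Sanity-checking the density claims quoted in this article
    current_density_gb = 16     # Gbit per die for today's mainstream DRAM
    xdram_density_gb = 128      # Gbit per die claimed for the first 3D X-DRAM chip (230 layers)

    print(xdram_density_gb / current_density_gb)   # 8.0  -> the "8x denser" claim
    print(1024 / xdram_density_gb)                 # 8.0  -> further scaling needed to reach 1Tb
    print(16 / 4)                                  # 4.0  -> the 4Gb-to-16Gb step that took over a decade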
Cheaper and denser
Memory modules are relatively cheap at the lower end of the spectrum and become atrociously expensive at the very top end. A single 256GB DDR4 server RAM module, the largest capacity on the market, retails for around $2,500 (roughly $10 per GB). In comparison, you can get 32GB of RAM for less than $60 (under $2 per GB).
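For reference, those cost-per-gigabyte figures work out as follows (a quick Python sketch using the quoted retail prices, which will of course vary):

    # Back-of-the-envelope cost per gigabyte from the prices quoted above
    server_module_gb, server_module_price = 256, 2500   # 256GB DDR4 server module
    consumer_kit_gb, consumer_kit_price = 32, 60        # 32GB consumer kit

    print(server_module_price / server_module_gb)   # ~9.8 -> roughly $10 per GB
    print(consumer_kit_price / consumer_kit_gb)     # ~1.9 -> under $2 per GB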
Neo’s technology could bring down the cost of memory as dramatically as 3D NAND did for solid-state storage; imagine 256GB of RAM for $60. What makes it even more enticing is that the layering relies on existing manufacturing techniques, much like 3D NAND.
This comes at a time when server manufacturers are finding it very difficult to add more memory slots to their motherboards, forcing them to adopt exotic solutions such as CXL expansion cards. 3D X-DRAM would help here too, by boosting memory density.
Popular AI applications such as large language models (LLMs), the technology behind ChatGPT, require access to very large pools of memory, and that comes at a significant cost, both financially and in terms of power and latency.
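To give a feel for the scale involved, here is a rough, purely illustrative Python estimate of the memory needed just to hold a large model’s weights; the 70-billion-parameter figure and 16-bit precision are assumptions for the example, not figures from Neo or this article:

    # Illustrative memory footprint of an LLM's weights alone
    params_billion = 70          # hypothetical 70-billion-parameter model
    bytes_per_param = 2          # 16-bit (FP16/BF16) weights

    weights_gb = params_billion * 1e9 * bytes_per_param / 1e9
    print(weights_gb)            # 140.0 -> ~140GB before activations or caches are even counted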
The RAM drive
While I don’t expect it to reach end users anytime soon, 3D X-DRAM has the potential to reshape the memory-storage pyramid, especially since Intel’s 3D XPoint (AKA Optane) failed to establish itself as a tier between system memory and SSDs.
For end users, given that onboard storage has plateaued (most laptops ship with 256GB these days), one can imagine devices in the more distant future that run on RAM alone; a paradigm shift that could change the overall computing landscape and the way modern operating systems work.
Without the need for a “slow” SSD, and with every application drawing on a single pool of ultra-fast system memory, software would be faster and potentially more secure. After all, VPN providers like NordVPN and ExpressVPN have already rolled out RAM-only servers that load a fresh system image after every reboot.
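To illustrate the appeal of keeping everything in memory, here is a minimal Python sketch that times a large write to a RAM-backed path versus an ordinary on-disk path. It assumes a Linux machine where /dev/shm is a tmpfs (RAM-backed) mount; the file names are arbitrary:

    # Time a 256MB write to a RAM-backed path versus an on-disk path
    import os, time

    def time_write(path, size_mb=256):
        chunk = b"\0" * (1024 * 1024)          # 1MB buffer
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())               # push the data out of the page cache
        elapsed = time.perf_counter() - start
        os.remove(path)
        return elapsed

    print("RAM-backed (tmpfs):", time_write("/dev/shm/bench.bin"))
    print("On disk (cwd):     ", time_write("bench.bin"))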