End of the NAND layer race: innovation across vectors

NAND is an essential part of the future of electronics. It’s everywhere, boosting storage capacity, performance and energy efficiency in everything from data center servers to the smallest mobile devices, such as phones, drones, cameras and wearables.

As these systems and electronic devices add more features and take on more complex tasks such as AI workloads, data storage needs will continue to grow, making NAND flash memory a critical part of future innovations.

As a result, the race is on to build higher capacity NAND, with better performance and lower power. Many people believe that a higher number of layers is the only way forward. But the truth is that there are many vectors for NAND innovation and higher layer counts are not the only way to increase NAND flash bits and storage capacity.

This new era of NAND brings a period of change, where the layer-centric race is behind us. The emphasis shifts to strategically timing the introduction of new, more sustainable nodes optimized for specific use cases and applications. Not all applications need the newest node with the highest capacity or performance. Making each layer denser, rather than simply stacking more layers, improves energy efficiency, performance and capacity while managing costs for specific customer needs.

Dr. Luca Fasoli, Senior Vice President of Development Engineering at Western Digital

Traditional vertical scaling

The ‘layer race’ is the idea that more layers mean more bit density and capacity, which leads to a cost advantage – therefore the NAND with the highest number of layers should be the best. But with 3D NAND it’s no longer that simple.

Scaling NAND is similar to adding capacity to a hotel. Simply adding more floors may seem like a good idea, but building upward increases operational cost and complexity, from buying and moving equipment to constructing the floors themselves. At some point the return on each additional floor diminishes. Intuitively, adding ten floors to a hundred-story building delivers a better proportional gain than adding the same ten floors to a five-hundred-story building: a 10% increase in capacity versus just 2%. Yet the capital required to build those ten extra floors on top of a five-hundred-story building may well be higher.

By making each floor denser instead, reducing the size of rooms and using space more effectively, the same gain in capacity can be achieved in a far more efficient and cost-effective way.

The same logic applies to NAND architecture. Simply adding NAND layers on top of each other may not be the only way to build more bits or capacity. Like the floors of a hotel, it becomes more expensive and difficult to build usable NAND as the number of layers increases. For example, stacking layers leads to longer processing time and additional capital for the advanced tools needed to ensure we can reliably produce high-quality NAND dies.

Scale smarter by using multiple vectors

Although the number of layers will continue to grow, it is no longer the main driver of innovation. Instead, innovation spans multiple vectors and there are other ways to scale NAND architecture besides vertical scaling, including lateral, logical, and architectural scaling methods.

Lateral scaling works by packing more memory cells into each individual layer while removing some of the redundant support structures. It’s like squeezing more rooms onto the same floor of a hotel, or reducing the number of stairs and elevators in the building. By scaling laterally first, you can optimize the available space before adding a new layer. This phased approach is much more efficient, saving cost and reducing risk. It also allows customers to reach a specific capacity point at the right time, with consistent supply and quality. And when the decision is made to add more layers, the benefit is multiplied by the increased efficiency of the added layers.

Logical scaling increases the number of logical bits that can be stored in each physical memory cell, for example by moving from three bits per cell (TLC) to four (QLC). In the hotel analogy, this amounts to squeezing more guests into the same hotel room without causing disruption.

Finally, architectural scaling optimizes how the supporting circuitry is arranged around the memory array, such as positioning circuits next to the array, tucking them beneath it, or even deploying them on a separate wafer. In the hotel, this is the parking provided for guests: beside the building, underneath it, or above it (with, of course, a cost-effective way to move the cars).

A combination of all four

An approach that combines all four of these scaling vectors is a much smarter way to deliver NAND bit growth without sacrificing performance and energy efficiency across a wide variety of applications and devices. It has the added benefit of optimizing cost reduction from node to node and minimizing the capital required for transitions.
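To make the combination concrete, here is a minimal back-of-the-envelope sketch in Python. It is not Western Digital's model; it simply treats die capacity as the product of the four vectors discussed above: layer count (vertical), cell density per layer (lateral), bits per cell (logical) and array efficiency (architectural). Every figure in it is hypothetical and chosen only to illustrate the multiplication effect.

```python
# Illustrative sketch only, not Western Digital's model: rough bit capacity
# of a hypothetical NAND die as a product of the four scaling vectors.
# Every number below is made up purely for illustration.

def die_capacity_gbit(layers: int,
                      cells_per_layer_millions: float,
                      bits_per_cell: int,
                      array_efficiency: float) -> float:
    """Approximate die capacity in gigabits.

    layers                   -- vertical scaling (number of stacked layers)
    cells_per_layer_millions -- lateral scaling (cell density within a layer)
    bits_per_cell            -- logical scaling (e.g. 3 for TLC, 4 for QLC)
    array_efficiency         -- architectural scaling (share of the die that
                                stores data once peripheral circuits move
                                beside, under, or onto a separate wafer)
    """
    cells = layers * cells_per_layer_millions * 1e6 * array_efficiency
    return cells * bits_per_cell / 1e9


# Baseline node vs. two ways of growing it.
baseline    = die_capacity_gbit(200, 1700.0, 3, 0.70)
layers_only = die_capacity_gbit(300, 1700.0, 3, 0.70)  # +50% layers, nothing else
all_vectors = die_capacity_gbit(232, 1870.0, 4, 0.85)  # modest gains on every vector

print(f"baseline:         {baseline:7.0f} Gbit")
print(f"layers only:      {layers_only:7.0f} Gbit")
print(f"all four vectors: {all_vectors:7.0f} Gbit")
```

Even with these made-up numbers, the pattern described above holds: modest gains on every vector can deliver more bit growth than a much larger jump in layer count alone.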

And while NAND technology is complex, the manufacturing processes that turn a viable NAND node into products are even more so. That complexity is compounded by supply and demand dynamics in an emerging era where new applications, especially AI, will significantly increase the need for flash-based solutions across both compute- and storage-intensive workloads.

The AI Data Cycle framework, for example, describes the virtuous cycle in which storage fuels AI models and AI, in turn, demands more storage. This cycle will be a major incremental growth driver for the storage industry.

Performance, power and capacity

Performance, power and capacity are important considerations at each stage of that cycle, because each stage requires something different. The initial stages require massive capacity to hold as much data as possible for model training, while speed and performance become the most important factors as data moves through the cycle. And power is increasingly a crucial factor in any AI application.

In this new era of NAND, NAND node migration paths must also be based on customer needs, not the one-size-fits-all approach of the past.

Customer needs are starting to bifurcate, and the role of NAND suppliers in addressing them is becoming much more interesting. Ultimately, what a customer builds will determine how the flash inside it should work: how fast it should be, how much capacity it should have, and how much power it should consume. It’s not about how many layers the product has. Focusing on the features that matter most to customers – performance, capacity and power – is the winning strategy.


This article was produced as part of TechRadar Pro’s Expert Insights channel, where we profile the best and brightest minds in today’s technology industry. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, you can read more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
