Digital Storage And Memory Projections For 2023, Part 2
This is the second in a series of three blogs about projections for digital storage and memory for the coming year, a series we have been doing for some time. Our first blog focused on the latest developments and projections for magnetic recording (HDDs and magnetic tape). This blog focuses on various types of solid-state memory and storage, as well as DNA storage. We will discuss the latest developments in flash memory, DRAM, NVMe (including computational storage), NVMe-oF and CXL, and how they will change the way we do computing. In addition, we will discuss life after Optane and how it will affect the growth and development of non-volatile memory technologies.
At the close of 2022, demand for all storage and memory technologies is down for consumer, client and server applications. In addition, new DRAM and NAND manufacturing capacity has been coming on line, leading to excess product availability (apart from pandemic-related supply chain issues). This has led to lower prices for NAND flash and DRAM. In late November, TrendForce said that NAND flash revenue fell by 24% quarter over quarter in 3Q22.
Nevertheless, there are many drivers for greater storage demand in 2023 and beyond, and these will drive demand for storage devices. Indeed, in mid-December 2022 SEMI reported that the global chip industry is projected to invest more than $500B in new factories by 2024 (although new semiconductor facility starts are projected to be down about 15% in 2023 compared with 2022), with much of this investment going into memory chip factories.
NAND flash is the dominant primary storage (storage for active in-process data) in data centers and enterprise applications, and it is often the only storage used in consumer devices such as smartphones and in most personal computers. In larger facilities, active data lives on SSDs, with colder data stored on hard disk drives (HDDs) and magnetic tape.
NAND flash is now available with up to 232 layers from Micron (for a consumer SSD), and SK hynix's Solidigm (formerly Intel's NAND flash business) has announced that it is making 238-layer 512Gb TLC NAND flash die, with mass production scheduled for the first half of 2023. In 2023 we expect 200+ layer NAND flash to gain market share, and perhaps even the first NAND approaching 300 layers will be announced.
However, layer scaling is not the only way to reach higher memory densities in NAND flash. At the 2022 FMS, Kioxia and its partner WDC discussed NAND scaling. The image from Kioxia's keynote presentation showed that lateral scaling (the size of the cells and their spacing from one another) is another important attribute.
In addition to lateral and vertical scaling, there is also architectural scaling, where different types of semiconductor devices are placed on top of one another to save space, including bonding NAND cell die on top of one another (as YMTC from China has been advocating).
Logical scaling refers to how many bits are stored per cell, with TLC (three bits per cell) and QLC (four bits per cell) available for many applications today and PLC (five bits per cell) possible in the future. Note that this logical scaling trades density against cell retention time and wear. WDC projected 500+ NAND flash layers by 2032 and said that total NAND manufacturing capacity in 2021 was 765EB (exabytes), with over 2ZB (zettabytes) projected for 2025.
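To make these two numbers concrete, here is a small sketch (the cell count is a hypothetical figure chosen to give a 512Gb TLC die, not a vendor specification) showing how logical scaling multiplies die capacity, and the annual growth rate implied by WDC's 765EB-to-2ZB capacity projection:

```python
# Illustrative sketch: die capacity at each logical scaling level, and the
# compound annual growth rate implied by WDC's projection. The cell count
# below is a hypothetical number, not from any vendor datasheet.

def die_capacity_gb(cells_billion: float, bits_per_cell: int) -> float:
    """Die capacity in gigabits: physical cells times bits stored per cell."""
    return cells_billion * bits_per_cell

cells = 170.7  # billions of cells; chosen so TLC gives a ~512Gb die
for name, bpc in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4), ("PLC", 5)]:
    print(f"{name}: {die_capacity_gb(cells, bpc):.0f} Gb per die")

# Growth rate implied by 765 EB shipped in 2021 -> 2 ZB projected for 2025.
start_eb, end_eb, years = 765, 2000, 4
cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied NAND capacity CAGR: {cagr:.1%}")  # roughly 27% per year
```

The same physical array yields a 5x capacity range from SLC to PLC, which is why PLC remains attractive despite the retention and wear penalties noted above.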
Samsung is also looking to stack NAND die to create denser storage devices. The chart below shows a 32-die stack projection for 1PB devices in 10 years (2032). NVIDIA is interested in using PB-scale NAND devices with its GPUs.
DRAM scaling is also continuing. Samsung is the world's largest DRAM producer, and it shared its DRAM roadmap at its 2022 Tech Day, shown below.
Forthcoming DRAM solutions from Samsung include 32Gb DDR5 DRAM, 8.5Gbps LPDDR5X DRAM and 36Gbps GDDR7 DRAM. Samsung also talked about custom DRAM solutions such as HBM-PIM (high bandwidth memory-processing in memory), AXDIMM (Acceleration DIMM) and CXL.
There are various types of computational storage devices and architectures available. These include DPU-based network computational storage, such as products available from NVIDIA's Mellanox, as well as SSDs with computation built in from the major SSD companies. Putting computation close to or inside storage devices reduces data movement (and thus lowers system power requirements and latency), and it also off-loads some computational tasks from the CPU. We project that various computational storage devices will become more common in 2023 for a range of computational tasks.
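The data-movement benefit can be sketched in a few lines (the records and filter here are invented for illustration; real devices push down operations such as compression, scanning or transcoding via vendor-specific interfaces):

```python
# Minimal sketch of why computational storage reduces data movement:
# a host-side filter pulls every record across the bus, while an
# in-storage filter only ships the matching records. Data is hypothetical.

def host_side_filter(records, predicate):
    """Conventional path: every record crosses the bus to the host."""
    bytes_moved = sum(len(r) for r in records)
    result = [r for r in records if predicate(r)]
    return result, bytes_moved

def in_storage_filter(records, predicate):
    """Computational-storage path: only matching records cross the bus."""
    result = [r for r in records if predicate(r)]
    bytes_moved = sum(len(r) for r in result)
    return result, bytes_moved

records = [b"error: disk full", b"ok", b"error: timeout", b"ok", b"ok"]
wants_errors = lambda r: r.startswith(b"error")

_, host_bytes = host_side_filter(records, wants_errors)
_, dev_bytes = in_storage_filter(records, wants_errors)
print(f"host-side moved {host_bytes} bytes; in-storage moved {dev_bytes}")
```

The savings scale with selectivity: the more data the device-side computation can discard or reduce, the less crosses the interconnect, which is where the power and latency gains come from.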
NVMe is now the dominant flash memory interface, and NVMe-oF (NVMe over fabrics), where the fabric is often Ethernet, is becoming common in data centers. NVMe-oF is being used to create pools of solid-state storage that can be shared between CPUs and servers. This pooling and sharing of storage is called disaggregation (breaking the various elements of servers into shared pools), which software can then use to create composable infrastructure supporting virtual devices or containers that can be created or destroyed as needed. This sort of pooling and composability is being extended to memory with the Compute Express Link (CXL) interconnect.
CXL provides a switched network for memory of various types, and over the past few years it was promoted as a way to use Optane memory together with DRAM in a shared memory environment supporting different types of memory with different costs and performance. The CXL 3.0 specification, released in 2022, allows the creation of memory pools that can be shared between CPUs.
Intel introduced 3D XPoint technology with its then partner, Micron, in 2015, and began shipping NVMe Optane products (its trade name for 3D XPoint) in 2017 and DDR products in 2018. After subsidizing this phase-change memory product at a total cost probably close to $10B, Intel announced in July 2022 that it would phase out its Optane products. Although current-generation Optane products are still available from Intel, there are no Optane CXL products. Instead, the SSD companies are looking to provide CXL-based products using DRAM and NAND flash.
Several of the major NAND flash companies were showing NAND-based CXL devices in 2022. Samsung introduced what it called a memory-semantic CXL SSD for AI/ML applications. The device includes an internal DRAM cache backed by a larger amount of NAND flash memory. Small IOs are served from the DRAM, while normal IOs are served from the NAND flash. Samsung said it achieved a 20X improvement in random read performance compared to a regular PCIe 4.0 SSD. SK hynix was showing a CXL memory expander at the 2022 FMS (as were other companies), as well as what it called an elastic CXL FPGA prototype. They also said that samples of their DDR5-based CXL memory were available.
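The small-IO/normal-IO split can be illustrated with a toy model (the class name, 4KB threshold and dictionary tiers below are assumptions for illustration, not Samsung's actual design):

```python
# Toy sketch of a two-tier device in the spirit of a memory-semantic SSD:
# small IOs land in a DRAM tier, larger IOs go straight to NAND.
# The threshold and structure are hypothetical, not a vendor design.

SMALL_IO_THRESHOLD = 4096  # bytes; assumed cut-off for "small" IOs

class TwoTierDevice:
    def __init__(self):
        self.dram_cache = {}  # fast, byte-addressable tier
        self.nand = {}        # large-capacity, block-oriented tier

    def write(self, addr: int, data: bytes) -> str:
        """Route the write by size; return which tier absorbed it."""
        if len(data) <= SMALL_IO_THRESHOLD:
            self.dram_cache[addr] = data
            return "dram"
        self.nand[addr] = data
        return "nand"

    def read(self, addr: int) -> bytes:
        # A DRAM hit avoids NAND read latency entirely.
        if addr in self.dram_cache:
            return self.dram_cache[addr]
        return self.nand[addr]

dev = TwoTierDevice()
print(dev.write(0x0000, b"x" * 64))     # small IO -> dram
print(dev.write(0x1000, b"x" * 65536))  # large IO -> nand
```

Serving the many small random reads of AI/ML workloads from DRAM is plausibly where a large random-read advantage over a conventional PCIe SSD would come from.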
Marvell and other controller companies are supporting CXL (as well as NVMe) in their controllers as a means of achieving full data center composability, including memory pooling as well as storage pooling. The image below shows Marvell's vision of how CXL can drive the development of top-of-rack (TOR) switches that support CXL, with full disaggregation of compute, memory and storage.
We expect the first systems using CXL for memory expansion on existing CPUs to be available starting in 2023, with memory pooling systems supporting CXL version 3.0 available sometime in 2024.
Although Optane memory is winding down, various other non-volatile memory technologies are ramping up in embedded applications, initially replacing NOR flash and some SRAM. These include magnetic random-access memory (MRAM) and various resistive RAM (RRAM) technologies. TSMC, Samsung and other foundries have produced various embedded devices for wearable and automotive applications. In addition, the increasing popularity of chiplet technology and the new Universal Chiplet Interconnect Express (UCIe) interface may drive demand for discrete memory chiplets, both for DRAM and for emerging memories.
As the chart below from the Coughlin Associates and Objective Analysis Emerging Memories Enter the Next Phase report indicates, growth in both embedded and discrete non-volatile memory technology (represented by MRAM) could drive increasing capacity shipments and $44B of revenue by 2032.
Finally, let's take a quick look at the future of DNA-based storage. This has some relevance to solid-state memory, since several synthetic DNA storage startups are looking to use silicon-based devices as an important element of their approaches to storage. The image below, from a talk by Karin Strauss of Microsoft Research, shows the basic steps in using synthetic DNA for storage.
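The core encode/decode idea can be sketched in a few lines. The direct two-bits-per-base mapping below is purely illustrative; real DNA storage systems use constrained codes with redundancy and error correction, since synthesis and sequencing are error-prone:

```python
# Toy sketch of the encode/decode steps in DNA data storage.
# The 2-bits-per-base mapping is illustrative only; practical systems
# use constrained, error-corrected codes, not this direct mapping.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map each pair of bits to one of the four DNA bases."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping: four bases back to one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)  # 2 bytes become an 8-base sequence
assert decode(strand) == b"Hi"
```

At two bits per base, density is bounded by synthesis and sequencing cost rather than by the medium itself, which is why the startups mentioned above lean on silicon-based devices to speed up those steps.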
DNA storage is still at the laboratory and early-prototype storage system stage, but there are rumors of some upcoming demonstrations of DNA storage in 2023.
Although 2022 ended with demand down for all types of storage and memory technologies, we expect demand to recover in 2023 to meet growing storage needs and to gain efficiencies from the latest technologies, including advances in NAND, DRAM, CXL and emerging memories. In addition, expect some significant advances and demonstrations in DNA storage in 2023.