
3D NAND

TL;DR

3D NAND is vertically stacked flash memory · 300+ layers in 2026 · the storage layer for AI training datasets and model checkpoints.

Level 1

3D NAND stacks NAND flash memory cells vertically (like a 3D apartment building) instead of spreading them across a single plane. This dramatically increases density per mm². Leading-edge nodes in 2026: Samsung 286-layer V9 NAND, SK Hynix 321-layer, Micron 276-layer G9. Used in all modern SSDs · critical for AI training data storage, where tens of petabytes per training run are common.

Level 2

3D NAND density doubles every 2-3 years via layer-count increases and smaller cell sizes. PCIe 5.0 + NVMe 2.0 SSDs based on 3D NAND can read at 12-14 GB/s · sufficient for training data ingestion. Enterprise flash arrays (Pure Storage, NetApp, VAST) use 3D NAND to build petabyte-scale low-latency storage that feeds GPU training jobs. 3D NAND is distinct from HBM (AI's fast memory) · NAND is bulk storage, HBM is on-package working memory.

Level 3

2026 node leadership: Samsung (V9 at 286L), SK Hynix (4D NAND at 321L), Micron (G9 at 276L), Kioxia-WD (BiCS 8 at 218L). CMOS-under-Array (CUA) architecture pushes density further by placing peripheral logic beneath the memory stack. Interface: Toggle DDR 5.0 and ONFi 5.1 at 3.6 GT/s per pin. Enterprise pricing: $60-100/TB for high-endurance NAND · dropping 20-30% per year. AI training pipelines now routinely use 100+ PB for frontier model training.
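The pricing and capacity figures above make for easy back-of-envelope math. A sketch assuming the $80/TB midpoint and a ~25%/yr price decline (midpoint of the 20-30% range cited above):

```python
# Rough cost of a frontier-scale training corpus at enterprise NAND
# prices, and how that bill shrinks as prices decline. All inputs are
# assumptions taken from the ranges in the text.

def corpus_cost_usd(petabytes: float, usd_per_tb: float) -> float:
    """Total media cost; 1 PB = 1,000 TB (decimal units, as vendors quote)."""
    return petabytes * 1000 * usd_per_tb

def price_after_years(usd_per_tb: float, years: int,
                      annual_drop: float = 0.25) -> float:
    """Price per TB after compounding an assumed annual decline."""
    return usd_per_tb * (1 - annual_drop) ** years

print(corpus_cost_usd(100, 80))            # -> 8000000.0  ($8M for 100 PB)
print(round(price_after_years(80, 3), 2))  # -> 33.75  ($/TB after 3 years)
```

At these rates the media cost of a 100 PB corpus is single-digit millions of dollars · small next to the GPU bill for the same training run, which is why NAND stays the default bulk tier.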

The takeaway for you

If you are a Researcher
  • Vertical NAND stacking · 300+ layers
  • Samsung V9, SK Hynix 321L, Micron G9 lead 2026
  • PCIe 5 / NVMe 2 · 12-14 GB/s per SSD

If you are a Builder
  • Training data storage layer
  • Enterprise NVMe SSD arrays feed GPU clusters
  • 100+ PB is standard for frontier model training

If you are an Investor
  • Commodity vs HBM's premium · big volume, thin margin
  • Chinese producers (YMTC) catching up on layer count
  • Cyclical · oversupply in 2023 corrected by 2025 demand

If you are a Curious Normie
  • The memory in your SSD · stacked vertically to fit more
  • Used to store AI training data (lots of it)
  • Different from AI's fast HBM memory
Gecko's take

3D NAND is the unsung workhorse of AI training · stores the data, but the glamour goes to HBM.

NAND is bulk non-volatile storage (persistent). HBM is fast volatile memory directly on the AI chip. Complementary · not substitutes.