
LPDDR5X

TL;DR

LPDDR5X is a low-power, high-bandwidth JEDEC DRAM standard used in smartphones and, increasingly, in AI accelerators.

Level 1

LPDDR5X peaks at 8533 MT/s per pin, roughly 68 GB/s over a 64-bit package. Primary manufacturers: Samsung, SK Hynix, Micron. Used in: smartphones and AI accelerators.

Level 2

LPDDR5X is a low-power DRAM standard originally aimed at mobile devices and now deployed in AI and HPC systems. Peak per-pin data rate is 8533 MT/s, which works out to roughly 68 GB/s over a 64-bit package; systems gang many packages to reach hundreds of GB/s. Main manufacturers are Samsung, SK Hynix, and Micron. JEDEC published the LPDDR5X update in 2021. Used in flagship smartphones and a range of AI accelerators, including NVIDIA's Grace CPU.
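The bandwidth arithmetic above is simple enough to check directly: peak bandwidth is the per-pin data rate times the bus width in bytes. A minimal sketch (the 64-bit bus width is an illustrative package configuration, not a spec requirement):

```python
def peak_bandwidth_gb_s(data_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s (1 GB = 1e9 bytes).

    data_rate_mts: transfers per second in MT/s (8533 is the JEDEC
    LPDDR5X peak per-pin rate).
    bus_width_bits: width of the memory interface in bits.
    """
    transfers_per_s = data_rate_mts * 1e6
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_s * bytes_per_transfer / 1e9

# 8533 MT/s over a 64-bit package:
print(peak_bandwidth_gb_s(8533, 64))  # ≈ 68.3 GB/s
```

Real systems scale this by using many packages in parallel; delivered bandwidth is lower than this theoretical peak due to refresh, command overhead, and access patterns.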

Level 3

LPDDR5X provides up to 8533 MT/s per pin, roughly 68 GB/s per 64-bit package; multi-package systems such as NVIDIA's Grace CPU reach around 500 GB/s aggregate. Commercial yield varies by manufacturer: SK Hynix, Samsung, Micron. Supply allocation follows long-term agreements with major GPU and accelerator vendors. Memory bandwidth is the dominant constraint for LLM inference: the low arithmetic intensity of transformer attention and decode steps means memory-bound kernels gate throughput on all current accelerators.
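The memory-bound claim has a standard back-of-the-envelope form: during single-stream decode, every generated token must stream all model weights from DRAM once, so tokens/s is capped at bandwidth divided by model size in bytes. A sketch with illustrative numbers (the model size and bandwidth below are assumptions, not measurements):

```python
def max_decode_tokens_per_s(bandwidth_gb_s: float,
                            params_billions: float,
                            bytes_per_param: float) -> float:
    """Bandwidth-bound ceiling on single-stream LLM decode throughput.

    Assumes every token reads all weights once from memory and ignores
    KV-cache traffic, so this is an optimistic upper bound.
    """
    model_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# e.g. a 7B-parameter model in fp16 (2 bytes/param) on ~500 GB/s:
print(round(max_decode_tokens_per_s(500, 7, 2)))  # ceiling ≈ 36 tokens/s
```

This is why quantization (fewer bytes per parameter) and faster memory both raise decode throughput roughly linearly: the compute units are mostly idle waiting on DRAM.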

The takeaway for you
If you are a Researcher
  • LPDDR5X: up to 8533 MT/s per pin, ≈68 GB/s per 64-bit package
  • Made by SK Hynix, Samsung, Micron
  • Used in modern AI chips and flagship smartphones
If you are a Builder
  • More bandwidth = faster LLM inference on memory-bound workloads
  • Chips with newer memory (LPDDR5X) hit better throughput per dollar
  • Check which chips use LPDDR5X on /memory/lpddr5x
If you are an Investor
  • LPDDR5X supply is tightly allocated, which drives chip pricing power
  • SK Hynix / Samsung capture most of the margin
  • Watch memory shortage news as a leading indicator of GPU pricing
If you are Curious / a Normie
  • LPDDR5X is ultra-fast memory for AI chips
  • Without it, models can't feed data to the processor fast enough
  • Limited supply is part of why AI hardware is expensive
Gecko's take

LPDDR5X supply allocation is the hidden variable behind every GPU shortage. When SK Hynix sneezes, NVIDIA catches cold.
