
Low-Power DDR5X (LPDDR5X)

Low-power memory standard driving the on-device AI revolution. Powers Apple M-series, Qualcomm Snapdragon X, and Samsung Galaxy AI. Critical for on-device LLM inference where power efficiency matters more than raw bandwidth. Also used in some edge AI servers.

Bandwidth
34.1 GB/s (per 32-bit channel at 8533 MT/s)
Interface
64-bit
Max height
1-hi
Capacity/stack
8-32 GB
Process
1b nm DRAM
Voltage
0.5V
AI chips using
0
Suppliers
3
AI demand
25%
$/GB
$3.50
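The headline bandwidth follows directly from transfer rate and bus width. A quick sketch (8533 MT/s is the LPDDR5X spec maximum per pin; the 34.1 GB/s figure above corresponds to a single 32-bit channel at that rate, while a full 64-bit interface doubles it):

```python
def peak_bandwidth_gbps(transfer_rate_mtps: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s (decimal):
    transfers/s x bits-per-transfer / 8 bits-per-byte."""
    return transfer_rate_mtps * bus_width_bits / 8 / 1000

print(peak_bandwidth_gbps(8533, 32))  # ~34.1 GB/s -- one 32-bit channel
print(peak_bandwidth_gbps(8533, 64))  # ~68.3 GB/s -- full 64-bit interface
```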

3 companies producing LPDDR5X

Samsung
Market share
48%

Apple M4 primary supplier

SK hynix
Market share
32%

Qualcomm primary supplier

Micron
Market share
20%

Apple secondary supplier

Booming · on-device AI requires more memory per phone/laptop

AI · 25%
Consumer · 65%
Automotive · 10%
Sector detail
Mobile / smartphone · 45%
Flagship phones now require 12-16 GB for on-device AI
Laptop / PC · 35%
AI PCs (Apple M4, Snapdragon X) driving 32 GB standard
Automotive · 10%
In-vehicle infotainment and ADAS
Edge AI · 10%
Edge inference servers, robotics
Primary buyers
Apple · Samsung · Qualcomm · Google

Average selling price · year-over-year trend

Price per GB
$3.50
YoY change
+12%
Trend
Rising
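The ASP figures translate into rough per-stack costs. A back-of-envelope sketch, assuming the $3.50/GB price applies uniformly across the 8–32 GB capacity range listed above (real contract pricing varies by density and customer):

```python
def stack_cost(price_per_gb: float, capacity_gb: int) -> float:
    """Approximate cost of a single LPDDR5X stack in dollars."""
    return price_per_gb * capacity_gb

ASP = 3.50  # $/GB, per the table above
for cap_gb in (8, 16, 32):  # capacity-per-stack range from the spec table
    print(f"{cap_gb} GB stack -> ${stack_cost(ASP, cap_gb):.2f}")
# 8 GB -> $28.00, 16 GB -> $56.00, 32 GB -> $112.00

# The +12% YoY change implies a prior-year ASP of roughly $3.50 / 1.12.
print(f"Implied prior-year ASP: ${ASP / 1.12:.2f}/GB")
```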

AI PC and AI phone trends driving LPDDR5X demand