HBM · Volume · 2024

High Bandwidth Memory 3e (Enhanced)

The dominant HBM generation for 2024-2026 AI training. An enhanced version of HBM3 with a higher per-pin data rate (9.6 Gbps). Powers NVIDIA B200, GB200, and GB300, AMD MI325X/MI355X, and hyperscaler ASICs. SK hynix leads production, Samsung is struggling with yields, and Micron is validated for NVIDIA.

Bandwidth · 1180 GB/s
Interface · 1024-bit
Max height · 12-hi
Capacity/stack · 24-36 GB
Process · 1b nm DRAM
Voltage · 1.1 V
AI chips using · 8
Suppliers · 3
AI demand · 90%
$/GB · $25
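The card's headline numbers are related by simple arithmetic: peak per-stack bandwidth is the per-pin data rate times the interface width, converted from bits to bytes. A minimal sketch (the nominal 9.6 Gbps rate gives 1228.8 GB/s; the 1180 GB/s figure quoted above corresponds to a slightly lower effective pin rate of roughly 9.2 Gbps):

```python
# Peak per-stack bandwidth from per-pin rate and interface width.
pin_rate_gbps = 9.6      # HBM3e per-pin data rate (from the card)
interface_bits = 1024    # HBM interface width (from the card)

# bits/s across the interface, divided by 8 to get bytes/s
bandwidth_gb_s = pin_rate_gbps * interface_bits / 8
print(bandwidth_gb_s)  # 1228.8 (GB/s, nominal peak)

# Effective pin rate implied by the quoted 1180 GB/s figure
implied_rate = 1180 * 8 / interface_bits
print(round(implied_rate, 2))  # 9.22 (Gbps)
```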

3 companies producing HBM3e

SK hynix · Volume
Market share · 53%

Industry-leading yields; 8-hi and 12-hi stacks shipping in volume

Samsung · Ramping
Market share · 30%

Yield issues on 8-hi stacking; catching up

Micron · Volume
Market share · 17%

8-hi validated by NVIDIA; 12-hi sampling

Severe shortage · demand exceeds supply through 2026

AI · 90%
Networking · 5%
HPC · 5%
Sector detail
AI training · 65%
NVIDIA Blackwell + AMD CDNA4 consume majority of output
AI inference · 20%
Growing as reasoning models require more memory
HPC / scientific · 10%
National labs, weather modeling, molecular simulation
Networking · 5%
High-end switching ASICs (Broadcom)
Primary buyers
NVIDIA · AMD · Google · AWS · Microsoft · Meta

Average selling price · year-over-year trend

Price per GB · $25
YoY change · +45%
Trend · Rising

HBM3e ASP up 45% YoY driven by AI training demand
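The pricing figures above combine directly: $/GB times per-stack capacity gives an approximate per-stack cost, and the +45% YoY change implies last year's price. An illustrative sketch using only the card's numbers (costs are rough ASP math, not quoted contract prices):

```python
price_per_gb = 25.0  # current ASP from the card

# Approximate per-stack cost at the card's 24-36 GB capacity range
for capacity_gb in (24, 36):
    cost = capacity_gb * price_per_gb
    print(f"{capacity_gb} GB stack ~ ${cost:.0f}")
# 24 GB stack ~ $600, 36 GB stack ~ $900

# Prior-year price implied by the +45% YoY change
prev_price = price_per_gb / 1.45
print(round(prev_price, 2))  # ~17.24 $/GB a year ago
```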
