
DGX B200

The DGX B200 is NVIDIA's node-level AI system.

TL;DR

The DGX B200 is NVIDIA's node-level AI system, built around Blackwell-generation accelerators.

Level 1

The DGX B200 is a node-level AI compute system from NVIDIA. It packs ? accelerators and delivers ? PFLOPS of aggregate FP8 performance. It was released within the last few years.

Level 2

The DGX B200 is a node-level AI system. Configuration: ? accelerators, ? GB of HBM memory, ? PFLOPS aggregate FP8. Manufactured by NVIDIA beginning in recent years. Deployed in frontier datacenters by hyperscalers and specialized AI clouds. BenchGecko tracks this system on /systems/nvidia-dgx-b200 with TCO, power, and deployment signals.

Level 3

DGX B200 specifications: ?× accelerators, aggregate ? PFLOPS FP8, ? GB HBM, system cost approx $?M. System-level design choices (NVLink domain size, CPU-to-GPU ratio, network fabric topology) determine how much of the aggregate FLOPS is actually realizable. Real-world utilization typically lands at 40-70% of peak due to memory-bandwidth bottlenecks and network contention. TCO per PFLOP-year is the operative investment metric for large-scale buyers.
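The TCO-per-PFLOP-year framing above can be sketched numerically. All inputs in the example (system cost, power draw, amortization period, electricity price) are hypothetical placeholders, not DGX B200 figures; only the 40-70% utilization range comes from the text.

```python
def tco_per_effective_pflop_year(system_cost_usd, power_kw, amort_years,
                                 utilization, peak_pflops,
                                 electricity_usd_per_kwh=0.08):
    """Amortized cost per *realized* PFLOP-year for an AI system.

    Divides yearly cost (straight-line capex amortization plus 24/7
    energy spend) by effective throughput, i.e. peak PFLOPS scaled
    by the utilization fraction actually achieved in practice.
    """
    capex_per_year = system_cost_usd / amort_years
    energy_per_year = power_kw * 24 * 365 * electricity_usd_per_kwh
    effective_pflops = peak_pflops * utilization  # 40-70% of peak is typical
    return (capex_per_year + energy_per_year) / effective_pflops

# Illustrative inputs only (NOT real DGX B200 numbers):
# $500K system, 15 kW, 4-year amortization, 50% utilization, 72 PFLOPS peak.
cost = tco_per_effective_pflop_year(500_000, 15, 4, 0.5, 72)
```

Note how utilization enters the denominator: a buyer who sustains 70% of peak pays materially less per realized PFLOP-year than one stuck at 40%, even on identical hardware.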

The takeaway for you

If you are a Researcher
  • DGX B200: ?× chip · ? PFLOPS FP8
  • ? GB HBM · released recently
  • NVIDIA · tracked on /systems/nvidia-dgx-b200

If you are a Builder
  • DGX B200 is what hyperscalers buy to serve frontier models
  • Price ~$?M per unit at volume
  • Most usage is via API rental · raw system purchase is hyperscaler territory

If you are an Investor
  • DGX B200 orders telegraph hyperscaler AI capex intent
  • System-level sales drive NVIDIA AI revenue
  • Watch the NVL / rack-scale order book as a leading indicator

If you are a Curious Normie
  • DGX B200 is a single-box AI supercomputer
  • Packs multiple top-end accelerators in one chassis
  • Costs millions · hyperscalers buy them by the thousand
Gecko's take

DGX B200 is a generational leap over its Hopper-based predecessor, and every new system reshuffles the cost-per-FLOPS leaderboard.

The DGX B200 is NVIDIA's node-level AI system.