Maia 100 Rack
The Microsoft Maia 100 AI Accelerator Rack is Microsoft's rack-scale AI compute system.
Basic
The Maia 100 Rack is a rack-scale AI compute system from Microsoft. It packs ? accelerators and delivers ? PFLOPS of aggregate FP8 compute. The underlying Maia 100 accelerator was unveiled at Microsoft Ignite in November 2023.
Deep
The Maia 100 Rack is a rack-scale AI system. Configuration: ? accelerators, ? GB of HBM memory, ? PFLOPS aggregate FP8. Designed by Microsoft and first deployed in its own Azure datacenters in 2024; unlike merchant GPU systems, Maia hardware is not sold to outside buyers. BenchGecko tracks this system at /systems/microsoft-maia-100-rack with TCO, power, and deployment signals.
Expert
Maia 100 Rack specifications: n/a× accelerators, aggregate n/a PFLOPS FP8, n/a GB HBM, system cost n/a. System-level design choices (scale-up interconnect domain size, CPU-to-accelerator ratio, network fabric topology) determine realizable throughput far more than aggregate peak FLOPS; note that Maia 100 uses an Ethernet-based fabric rather than NVLink. Real-world utilization is typically 40-70% of peak due to memory-bandwidth bottlenecks and network contention. TCO per PFLOP-year is the operative investment metric for large-scale buyers.
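The TCO-per-PFLOP-year metric above can be sketched as a small calculation. Every input number here is an illustrative assumption, not a published Maia 100 figure; the 0.55 default utilization is simply the midpoint of the 40-70% range cited above:

```python
# Hypothetical sketch of TCO per delivered PFLOP-year.
# All inputs are illustrative placeholders, NOT vendor data.

def tco_per_pflop_year(capex_usd, annual_opex_usd, lifetime_years,
                       peak_pflops_fp8, utilization=0.55):
    """Amortized total cost of ownership per delivered PFLOP-year.

    `utilization` captures the gap between peak and realizable
    throughput (40-70% in the text); 0.55 is an assumed midpoint.
    """
    total_cost = capex_usd + annual_opex_usd * lifetime_years
    delivered_pflop_years = peak_pflops_fp8 * utilization * lifetime_years
    return total_cost / delivered_pflop_years

# Illustrative inputs (assumptions, not Maia 100 specs):
cost = tco_per_pflop_year(
    capex_usd=3_000_000,      # hypothetical rack price
    annual_opex_usd=400_000,  # hypothetical power + cooling + ops
    lifetime_years=4,
    peak_pflops_fp8=10.0,     # hypothetical aggregate FP8 peak
)
print(f"${cost:,.0f} per PFLOP-year")
```

The key design point is dividing by *delivered* (utilization-adjusted) compute rather than peak, which is what separates this metric from headline $/PFLOPS comparisons.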
Depending on why you're here
- Maia 100 Rack: ?× chips · ? PFLOPS FP8
- ? GB HBM · chip unveiled November 2023
- Microsoft · tracked at /systems/microsoft-maia-100-rack
- Maia 100 Rack is what Microsoft deploys in Azure to serve frontier models
- Unit price undisclosed · Maia hardware is deployed internally, not sold at volume
- Most usage is via API or cloud rental · the raw hardware stays hyperscaler territory
- Maia 100 Rack build-outs telegraph Microsoft's AI capex intent
- Rack-scale deployments underpin Microsoft's AI infrastructure strategy
- Watch the rack-scale build-out cadence as a leading indicator
- Maia 100 Rack is a rack-sized AI supercomputer
- Packs dozens of chips in one unit
- Costs millions per rack · Microsoft deploys them at datacenter scale
Maia 100 Rack is Microsoft's first-generation in-house AI accelerator system, so there is no prior Maia generation to compare against. Every new system in this class reshuffles the cost-per-FLOPS leaderboard.
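The cost-per-FLOPS leaderboard idea above can be sketched as a simple ranking. Every system name, price, and PFLOPS value below is a hypothetical placeholder, not tracked data:

```python
# Sketch of a cost-per-peak-PFLOPS leaderboard.
# Entries are hypothetical placeholders, not real systems or quotes.

systems = [
    # (name, unit_price_usd, peak_pflops_fp8) -- all illustrative
    ("system-a", 2_500_000, 8.0),
    ("system-b", 3_200_000, 12.0),
    ("system-c", 4_000_000, 13.0),
]

# Rank by dollars per peak PFLOPS; lower is better.
leaderboard = sorted(systems, key=lambda s: s[1] / s[2])

for name, price, pflops in leaderboard:
    print(f"{name}: ${price / pflops:,.0f} per peak PFLOPS")
```

A real version would rank on TCO per delivered PFLOP-year rather than sticker price over peak FLOPS, for the utilization reasons discussed in the Expert section.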