Learning path · 7 terms · ~3 min read

From Sand to Model

The AI supply chain in 7 terms · foundry, memory, chip, system, datacenter, provider, API.

Inference
Concepts · Chapter 1 of 1

What models are used for second (and permanently) · training comes first, inference runs forever after.

TL;DR

The process of running a trained model to generate predictions · every API call is inference.
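A minimal sketch of what "running a trained model" means: a forward pass with frozen parameters. The weights and inputs below are hypothetical, chosen only to illustrate that inference reads parameters and never updates them.

```python
def predict(weights, bias, features):
    """One inference call: a forward pass over fixed, already-trained parameters."""
    return sum(w * x for w, x in zip(weights, features)) + bias

# "Trained" parameters (hypothetical) — inference only reads them.
weights = [0.5, -1.2, 2.0]
bias = 0.1

# Every API call boils down to some version of this forward pass.
print(predict(weights, bias, [1.0, 2.0, 3.0]))
```

Real serving stacks run billions of these passes behind an HTTP endpoint, but the shape is the same: new input in, prediction out, parameters untouched.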

Inference optimization is where the next 10× cost reduction lives · every frontier lab is racing to ship the best serving stack.

What you learned

By the end you understand the full stack from silicon to inference API · and why each layer sets pricing for the one above it.

Keep learning
Next path · 7 terms
The AI Bubble Explained

Seven terms that decode whether AI is overpriced, fairly priced, or criminally underpriced. Read in order.