OpenAI at ~$500B: not a chatbot company, a compute-and-power company.

If the sticker is real, the math must be real.

Receipt math: a private valuation near $500B priced at a mature 15–20x EBITDA multiple implies ~$25–35B of EBITDA inside 3–4 years. If EBITDA margins settle in the 40–50% range once custom silicon and long-dated power contracts land, that points to ~$60–80B of annual revenue. Anything materially below that and the multiple is carrying narrative, not cash.
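The receipt math above can be run as a back-of-envelope sketch. Every number here is an illustrative assumption from the paragraph, not a reported figure:

```python
# Back-of-envelope valuation math: valuation / multiple -> implied EBITDA,
# then EBITDA / margin -> implied revenue. All figures in $B, all assumed.

def implied_financials(valuation_b, ebitda_multiple, ebitda_margin):
    """Return (implied EBITDA, implied revenue) in $B."""
    ebitda = valuation_b / ebitda_multiple
    revenue = ebitda / ebitda_margin
    return ebitda, revenue

valuation = 500  # the ~$500B sticker
for multiple, margin in [(20, 0.50), (15, 0.40)]:
    ebitda, revenue = implied_financials(valuation, multiple, margin)
    print(f"{multiple}x at {margin:.0%} EBITDA margin -> "
          f"EBITDA ~${ebitda:.0f}B, revenue ~${revenue:.0f}B")
```

The two scenarios bracket the range in the text: roughly $25–33B of EBITDA and $50–83B of revenue, consistent with the ~$60–80B midrange claim.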

What has to go right
• Chips: own or tightly controlled silicon and schedulers that cut inference cost per million tokens by an order of magnitude.
• Power: multi-gigawatt power purchase agreements (PPAs), steady latency under heavy load, and a path to cheap electrons at scale.
• Distribution: default placement at work and on devices so assistants graduate from chat to verifiable, liability-aware workflows.
• Regulation: high fixed safety and compliance costs that only a few players can afford.

What can break it
• Model parity compressing prices.
• Scarce GPUs and grid constraints keeping COGS high.
• Hyperscaler take-rates pinning margins.
• Agents that wow demos but fail enterprise audits.

Investor dashboard to watch
1. Inference cost per 1M tokens
2. Long-context latency SLOs
3. Enterprise ARR and net retention
4. Non-Microsoft compute share and chip tape-out milestones
5. Agent-driven transactions per DAU
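Dashboard item 1 ties directly back to the chips lever: an order-of-magnitude cut in cost per million tokens can come from throughput, hardware cost, or both. A minimal sketch, using illustrative numbers (not vendor quotes) for GPU rental price and serving throughput:

```python
# Unit economics for inference: cost per 1M tokens from hardware $/hour
# and sustained decode throughput. All input numbers are assumptions.

def cost_per_million_tokens(dollars_per_hour, tokens_per_second, utilization=1.0):
    """Serving cost in dollars per 1M output tokens."""
    tokens_per_hour = tokens_per_second * 3600 * utilization
    return dollars_per_hour / tokens_per_hour * 1_000_000

# Hypothetical scenarios: rented GPU vs. owned silicon on cheap power.
baseline = cost_per_million_tokens(2.00, 1_000)   # $2/hr, 1k tok/s
custom   = cost_per_million_tokens(0.80, 4_000)   # cheaper box, 4x throughput
print(f"baseline ~${baseline:.2f}/1M tok, custom ~${custom:.2f}/1M tok, "
      f"{baseline / custom:.0f}x cheaper")
```

Under these assumed inputs, a 2.5x cheaper machine running 4x faster compounds to the ~10x cost reduction the chips bullet calls for.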

The curve to justify ~$500B bends on three levers: chips, power, distribution. Nail them and this becomes the first true software-energy company. Miss them and gravity will test the multiple.