OpenAI turns down deals due to compute shortage
CFO Sarah Friar says demand outpaces capacity, and the AI boom starts to look like power and hardware rationing
OpenAI is turning away business because it cannot get enough computing capacity, according to comments by chief financial officer Sarah Friar reported by Business Insider. Friar said demand for OpenAI’s models is outstripping available “compute” in 2026, forcing “tough trade-offs” and a pullback from some projects as resources are redirected toward core products.
That is a notable shift in what has been marketed as a software-driven boom. When a leading model provider says it is rationing access, the binding constraint is no longer the cleverness of the code but the availability of chips, data-centre space, power, cooling and the contracting machinery that secures them. In practice, scarcity tends to be allocated through mechanisms that look less like app-store pricing and more like industrial procurement: longer-term commitments, prepayments, bundled cloud contracts, and prioritisation of customers who can absorb volatility.
The immediate consequence is that “AI demand” becomes a competition for physical inputs. Large buyers can lock up capacity by signing multi-year deals or financing build-outs; smaller firms are left with spot availability and throttled service tiers. When a provider pauses or delays a product such as Sora to protect flagship services, it is also signalling that internal compute budgets are being managed like factory output—one line runs only if another line slows.
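The "one line runs only if another line slows" dynamic is essentially priority scheduling over a fixed pool. A minimal sketch, assuming nothing about OpenAI's actual systems (workload names, priorities, and GPU counts below are all hypothetical):

```python
# Toy sketch of priority-based compute rationing. Entirely illustrative:
# workload names, capacities and priorities are hypothetical, not OpenAI's.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    priority: int      # lower number = served first
    demand_gpus: int   # GPUs requested

def allocate(total_gpus: int, workloads: list[Workload]) -> dict[str, int]:
    """Grant capacity in priority order; lower-priority work gets the remainder."""
    grants: dict[str, int] = {}
    remaining = total_gpus
    for w in sorted(workloads, key=lambda w: w.priority):
        granted = min(w.demand_gpus, remaining)
        grants[w.name] = granted
        remaining -= granted
    return grants

# Flagship products absorb most of the pool; experimental products are throttled.
grants = allocate(10_000, [
    Workload("chat_api", priority=0, demand_gpus=9_000),
    Workload("video_model", priority=1, demand_gpus=4_000),
])
# video_model asked for 4,000 GPUs but only the 1,000 left over are granted.
```

The point of the sketch is that under a fixed pool, a lower-priority product's capacity is whatever the flagship leaves behind, which is why pausing a product like Sora reads as a budgeting decision rather than a product decision.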
This kind of constraint pushes companies toward vertical integration. If model output depends on guaranteed access to GPUs and electricity, then owning—or at least controlling—data centres and energy supply becomes a strategic advantage rather than an operational detail. That logic has already been visible across the sector: hyperscalers tie AI access to their cloud platforms, chipmakers shape roadmaps around a handful of anchor customers, and energy projects get justified as “AI infrastructure.” The end-state is an AI market where the most valuable asset is not a model checkpoint but a queue position at the power substation.
For investors, a compute crunch also changes how growth should be interpreted. Revenue can rise simply because capacity is scarce and pricing tightens, not only because usage grows, even as the underlying market expands faster than supply can follow. For customers, it means "AI adoption" comes with a new line item: the cost of securing compute, not just paying per token.
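That new line item can be made concrete with a toy cost model. All prices, volumes, and capacity figures below are hypothetical assumptions for illustration, not market data; the point is only the shape of the trade-off between pay-per-token spend and a fixed capacity commitment:

```python
# Illustrative only: every number here is a made-up assumption.
# Reserved capacity is cheaper per token only if you actually use it.

def monthly_costs(
    monthly_tokens_m: float,       # tokens used per month, in millions
    spot_price_per_1m: float,      # hypothetical pay-as-you-go price per 1M tokens
    reserved_monthly_fee: float,   # hypothetical fixed commitment fee
    reserved_capacity_m: float,    # tokens/month the commitment covers, in millions
) -> dict[str, float]:
    """Compare pay-per-token spend with a fixed capacity commitment."""
    spot_cost = monthly_tokens_m * spot_price_per_1m
    # With a reservation you pay the fee regardless of usage; demand
    # beyond the reserved capacity spills over to spot pricing.
    overflow_m = max(0.0, monthly_tokens_m - reserved_capacity_m)
    reserved_cost = reserved_monthly_fee + overflow_m * spot_price_per_1m
    return {"spot": spot_cost, "reserved": reserved_cost}

# A heavy user who fills the reservation comes out ahead by committing...
busy = monthly_costs(500, spot_price_per_1m=2.0,
                     reserved_monthly_fee=600, reserved_capacity_m=500)
# ...while a light user pays the same fee for mostly idle capacity.
light = monthly_costs(50, spot_price_per_1m=2.0,
                      reserved_monthly_fee=600, reserved_capacity_m=500)
```

This is the sense in which scarcity favours large buyers: commitments that hedge a big, steady workload are pure overhead for a small or bursty one.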
Friar’s message was blunt: OpenAI has opportunities it cannot take because there is not enough compute to run them. In 2026, the most important feature of a frontier model may be whether it can be scheduled at all.