OpenAI winds down Sora after heavy compute burn
Wall Street Journal reporting points to costs of $1 million per day and a user base that halved, as GPU allocation shifts from viral video to enterprise products
OpenAI’s consumer-facing Sora app for generative video cost roughly $1 million per day to run while its user base halved shortly after launch, according to The Decoder, citing reporting by The Wall Street Journal. The app reportedly reached about one million users, fell to around 500,000, and did not recover. OpenAI will shut the app down in April, with the API scheduled to follow in September.
The numbers underline a basic constraint in the current AI boom: video generation is not just harder than text, it is ruinously expensive at scale. Even a popular app can become a financial sink if each minute of output consumes scarce GPU time that could be sold elsewhere at better margins. The Decoder reports OpenAI ultimately redirected limited compute capacity toward coding tools, enterprise products, and “agent” systems—areas where customers are more willing to pay recurring fees and where outputs are easier to integrate into existing workflows.
Sora’s decline also shows that demand is not the same as willingness to fund infrastructure. A burst of novelty usage can generate impressive sign-up charts while producing low-value content—“cheap engagement videos,” in the Journal’s description—whose business value is hard to capture. The reported figures make the squeeze concrete: $1 million a day spread across even the peak of one million users is roughly a dollar per user per day, or about $30 a month, and double that after the user base halved. If the product is priced low to keep it viral, it cannot cover its compute bill; if priced high enough to cover costs, the casual users disappear.
The story also points to internal brand and legal considerations. The Decoder says OpenAI faced copyright issues and worried that a flood of low-quality clips could damage the company’s reputation, turning the app into a liability rather than a showcase. Development costs rose as well: the Journal reporting cited by The Decoder says OpenAI cancelled training runs for new video models entirely, a notable signal in an industry that usually treats more training as the default response.
Competitive pressure appears to have accelerated the decision. The Decoder describes “increasing competitive pressure from Anthropic” as part of the backdrop for reallocating compute. Whether or not Anthropic is the direct cause, the mechanism is clear: when GPU supply and power contracts are the choke point, product lines compete internally for capacity, and the winners are the ones with clearer monetisation paths.
OpenAI is not abandoning the underlying research. The Decoder reports the Sora team will shift toward “world models” for robotics, a domain where simulation and physical interaction can justify heavy compute spending and where outputs are less exposed to consumer taste cycles.
Sora’s shutdown schedule is concrete: the app ends in April and the API in September.