Technology

Altman calls Musk's orbital data centers "ridiculous"

In vacuum, cooling means radiation and compute becomes a heat problem; permitting fights on Earth don't repeal thermodynamics

OpenAI CEO Sam Altman and SpaceX CEO Elon Musk have a long-running feud. Chip Somodevilla/Getty Images; Fabrice COFFRINI/AFP via Getty Images

Sam Altman, OpenAI’s CEO, used a stage in New Delhi to pour cold water on Elon Musk’s latest techno-utopian pitch: “a constellation of a million satellites that operate as orbital data centers.” According to Business Insider, Altman called the idea “ridiculous” in today’s “current landscape,” conceding only that it “could make sense someday.”

Altman’s remark lands as more than feud content because “orbital compute” has become a convenient PR escape hatch from the very terrestrial mess AI firms are creating: power-hungry server campuses that strain grids, water supplies, and local politics. Business Insider notes that US communities are increasingly resisting new data centers after a boom of approvals. If the ground is politically contested, why not move the racks into the sky?

The problem is that a data center is, in practice, a heat-management business with some compute attached. On Earth, operators rely on convection, evaporative cooling, and cheap maintenance logistics. In orbit, convection disappears; you dump heat through radiation. That means vast radiator area, careful thermal design, and a constant tradeoff between compute density and the ability to shed waste heat. A “rack” in vacuum becomes a pricey heat fin with a software problem.
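The radiator tradeoff can be put in rough numbers with the Stefan-Boltzmann law. A minimal sketch, where the 1 MW load, 300 K radiator temperature, and 0.9 emissivity are illustrative assumptions rather than figures from any actual proposal:

```python
# Back-of-envelope: ideal radiator area needed to reject waste heat in
# vacuum purely by radiation (Stefan-Boltzmann law). All inputs are
# illustrative assumptions, not a real satellite design.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(waste_heat_w, radiator_temp_k=300.0, emissivity=0.9):
    """Area of an ideal one-sided radiator needed to dump
    `waste_heat_w` watts at temperature `radiator_temp_k`."""
    return waste_heat_w / (emissivity * SIGMA * radiator_temp_k ** 4)

if __name__ == "__main__":
    # 1 MW of IT load -- small by data-center standards -- and every
    # watt of it must eventually leave as radiated heat.
    area = radiator_area_m2(1_000_000)
    print(f"~{area:,.0f} m^2 of ideal radiator for 1 MW at 300 K")
```

Even under these generous assumptions the answer is on the order of thousands of square meters per megawatt, before accounting for sun exposure, view factors, or the mass of the radiators themselves.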

Then there’s radiation. Commodity silicon is not designed for constant exposure to energetic particles. Single-event upsets (bit flips), latch-ups, and cumulative damage shift the reliability math from “swap failed drives” to “design for failure as a baseline.” You can harden hardware—at a cost in performance, availability, and vendor choice—or accept frequent faults and build heavy redundancy. Either way, the bill shows up as mass, power, and complexity.
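One classic shape of that "design for failure as a baseline" redundancy is triple modular redundancy: run the work three times and take a majority vote. A toy sketch, where the flaky adder and its upset rate are invented for illustration:

```python
# Toy sketch of triple modular redundancy (TMR): compute three times,
# return the majority answer. The flaky adder below simulates
# radiation-induced single-event upsets; names and rates are illustrative.

import random
from collections import Counter

def tmr(compute, *args):
    """Run `compute` three times and return the majority result.
    Tolerates at most one corrupted run out of three."""
    results = [compute(*args) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: more than one run corrupted")
    return value

def flaky_add(a, b, upset_rate=0.1):
    """Adder that occasionally suffers a simulated bit flip."""
    result = a + b
    if random.random() < upset_rate:
        result ^= 1 << random.randrange(8)  # flip one random low bit
    return result

random.seed(0)  # reproducible demo
print(tmr(flaky_add, 2, 3))
```

The point of the sketch is the cost, not the cleverness: every protected computation now consumes three times the silicon, power, and heat budget, which in orbit means three times the mass and radiator area too.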

Maintenance is the other unglamorous killer. Altman pointed to the difficulty of repairing a chip in space. That’s not a rhetorical flourish: modern data centers depend on rapid, routine replacement of components, firmware updates, and human intervention when automation fails. Even if you imagine robotic servicing, you’ve just reinvented an orbital logistics and industrial base to support what is, on Earth, essentially warehouse IT.

Latency and networking don’t rescue the concept either. Low Earth orbit helps compared with deep space, but real-time platforms still need predictable connectivity, and “edge” compute only matters if it’s close to users or sensors. For general-purpose AI training and inference—massive, centralized, bandwidth-hungry workloads—the constraint is energy and cooling, not the romance of altitude.
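The latency claim is easy to sanity-check with speed-of-light arithmetic. A rough sketch, where the 550 km orbital altitude and 50 km metro distance are illustrative assumptions:

```python
# Back-of-envelope one-way light travel time: a low-Earth-orbit satellite
# (e.g. ~550 km straight overhead) versus a nearby terrestrial data
# center (~50 km). Distances are illustrative assumptions.

C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

def one_way_ms(distance_km):
    """One-way propagation delay in milliseconds."""
    return distance_km / C_KM_PER_S * 1000

leo = one_way_ms(550)
metro = one_way_ms(50)
print(f"LEO ~{leo:.2f} ms vs metro ~{metro:.2f} ms, one way")
```

LEO propagation delay is tolerable for many uses, which is exactly the point: latency was never the binding constraint, so altitude buys nothing against the real bottlenecks of energy and cooling.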

Google’s reported “Project Suncatcher,” cited by Business Insider, suggests the idea is spreading. Space does offer abundant solar flux, but turning sunlight into stable, high-duty-cycle power for compute while exporting heat is not a magic trick; it’s engineering with brutal constraints.
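The "abundant solar flux" half of the pitch can also be roughed out. A sketch using the ~1361 W/m² solar constant above the atmosphere, with cell efficiency and duty/loss factors as illustrative assumptions:

```python
# Back-of-envelope: solar array area to supply a given compute load in
# orbit. Solar flux is the real above-atmosphere figure; the 30% cell
# efficiency and 0.8 duty/loss factor are illustrative assumptions.

SOLAR_FLUX_W_M2 = 1361.0  # solar constant above the atmosphere

def array_area_m2(load_w, cell_efficiency=0.30, duty_and_losses=0.8):
    """Array area so flux * efficiency * losses covers `load_w` watts."""
    return load_w / (SOLAR_FLUX_W_M2 * cell_efficiency * duty_and_losses)

print(f"~{array_area_m2(1_000_000):,.0f} m^2 of array for 1 MW")
```

Stack that against the radiator estimate above and a single megawatt of orbital compute already implies thousands of square meters of deployed structure on both the power and heat sides of the ledger.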

If you can’t get permits for a data hall in Texas, launching one is an expensive way to discover that physics is a stricter regulator than any zoning board.