Science

NASA-Boeing Starliner failure report spreads blame

Valves, thrusters, software, and test gaps recur; partnership model turns accountability into a rounding error

NASA's new head criticizes Boeing, NASA for botched Starliner flight that left 2 astronauts stuck (cbsnews.com)

NASA and Boeing have now produced what American bureaucracy does best: a “lessons learned” document that spreads responsibility so evenly that nobody is left holding it. The latest investigation into the problem-plagued Starliner test flight—whose propulsion and helium-system issues left two astronauts stuck at the International Space Station for months—assigns blame to both the contractor and the customer, according to the New York Times. CBS News reports that NASA’s new administrator publicly rebuked Boeing and NASA alike, an unusually blunt admission that the failure was not a single bad valve but a system.

The proximate technical story is familiar to anyone who has watched complex spacecraft fail in slow motion: stuck or unreliable valves, misbehaving thrusters, software and control logic that did not gracefully handle off-nominal conditions, and a test regime that did not fully explore the corners of the operational envelope. The Times describes investigators pointing to gaps in verification and validation—less about whether components “worked” in isolation, more about whether the integrated vehicle behaved predictably under real flight timelines, thermal conditions, and fault cascades.

But the deeper indictment is organizational. NASA’s “commercial crew” model was sold as a way to buy transportation services rather than micromanage design. In practice, it can create a no-man’s-land of accountability: NASA retains certification authority and safety sign-off, Boeing owns the hardware, and both can plausibly argue that the other was responsible for catching systemic risk.

That ambiguity shows up precisely where failures like Starliner’s breed: requirements traceability, test coverage, and flight-readiness reviews. If requirements are written to be contractually defensible rather than operationally exhaustive, a program can pass every audit while still missing failure modes. If test plans are negotiated artifacts—constrained by schedule, cost, and what each side agrees is “in scope”—risk accumulates quietly until it is rebranded as “unexpected anomalies.”

Investigators, per the Times, faulted NASA for accepting insufficiently rigorous testing and oversight in key areas, and Boeing for engineering and program-management missteps. That is the predictable output of a partnership structure where “shared responsibility” often means “shared deniability.”

The administrator’s critique, reported by CBS, implicitly concedes that the agency’s procurement and certification incentives can drift: the same organization that wants competition, congressional goodwill, and schedule milestones is also supposed to be the hard-nosed safety regulator. When Starliner finally flies again, it will almost certainly do so with more paperwork, more process, and more meetings—because bureaucracies respond to engineering failures by adding governance layers. The failure mode the report describes is already governance: a system designed so no one owns the whole vehicle.