
Pentagon blacklists Anthropic as supply chain risk

Defense R&D chief cites fears that AI access could be cut mid-crisis and that vendor guardrails become a switch in someone else’s hand


Pentagon R&D chief Emil Michael (right) said he was so alarmed about Anthropic that he brought his concerns to Defense Secretary Pete Hegseth. (Win McNamee/Getty Images)


Undersecretary of Defense for Research and Engineering Emil Michael said the Pentagon has formally designated Anthropic a supply chain risk after talks over the use of its AI models collapsed, according to Business Insider. In an interview on the “All-In Podcast,” Michael described “holy cow” moments in which he concluded the department could not rely on a vendor that might restrict access during a crisis.

The flashpoint, Michael said, was the idea that exceptions to Anthropic’s usage policies could be handled by calling CEO Dario Amodei if urgent circumstances arose. In Michael’s telling, that answer did not fit military operations where decisions are made on compressed timelines and communications are not routed through a private CEO. He framed the problem as operational: if a “guardrail” or refusal triggered mid-operation, commanders would inherit a failure mode they did not design and could not override.

Michael also described a second incident after a US raid in Venezuela to capture President Nicolás Maduro. He said an Anthropic executive contacted a Palantir executive to ask whether Anthropic models had been used in the operation. The Pentagon accesses Anthropic models through a government cloud hosted by Amazon Web Services and operated by Palantir, Michael said. Palantir officials, alarmed by the inquiry, escalated the issue to him.

In practice, the dispute highlights how “AI safety” is becoming a contract feature rather than a research agenda. The buyer wants predictable uptime, defined performance envelopes, and clear authority over how a system behaves under stress. The supplier wants public commitments that its models will not be used for fully autonomous weapons or domestic surveillance, and it wants to preserve the right to enforce those commitments in software.

That collision is not abstract. Modern air and missile defense already uses machine learning to sift sensor data and recommend responses faster than human staff can. Michael cited “Golden Dome” scenarios, referring to President Donald Trump’s missile defense initiative, where AI might be asked to do more than triage alerts. Even if humans remain in the loop, the system’s value depends on it being available, predictable, and not subject to last-minute policy escalation.

Once a vendor’s internal policy can effectively throttle a government user, compliance becomes a form of control. A model provider can advertise strict restrictions as a safety credential to win contracts, while also using the same restrictions to limit liability when operations go wrong. The customer, meanwhile, can point to vendor-imposed guardrails as evidence of responsible use while still pursuing increasingly automated targeting and decision support.

Michael said the argument became a public relations fight, with Anthropic raising concerns that the Pentagon’s terms would not sufficiently constrain misuse. He has previously called Amodei a “liar” with a “God-complex,” and the department has now moved from negotiation to exclusion.

The Pentagon’s position is that a critical supplier should not be able to renegotiate access during a crisis. Anthropic’s position is that a critical supplier should not be compelled to support uses it considers unacceptable.

The disagreement is now being settled the way procurement disputes often are: by deciding who gets to sell to the state.