Pentagon sets Friday deadline for Anthropic to drop AI ethics rules
Defense Secretary Pete Hegseth has given Anthropic until Friday to remove restrictions on military use of its AI, threatening to invoke the Defense Production Act. Anthropic CEO Dario Amodei is holding firm, refusing to allow autonomous weapons targeting or mass surveillance. The Pentagon is pursuing parallel deals with Google, xAI, and OpenAI.
Defense Secretary Pete Hegseth has given Anthropic until 5:01 p.m. Friday to remove restrictions on how its AI can be used by the U.S. military, or face contract termination and designation as a supply chain risk. In a Tuesday meeting with Anthropic CEO Dario Amodei, Hegseth warned the company to set aside its usage policy limits or have the Cold War-era Defense Production Act invoked against it, a law that gives the federal government broad authority to compel private companies to act in the national interest.
Anthropic maintains two specific restrictions it refuses to lift: its AI cannot be used for mass domestic surveillance of U.S. citizens, and it cannot make autonomous targeting decisions in physical attacks without human oversight. Both are published usage policies that Amodei reaffirmed directly to Hegseth on Tuesday. The Pentagon, meanwhile, is not waiting: it is pursuing parallel military AI deals with Google, xAI, and OpenAI for autonomous drone swarms and cyberattack capabilities, and xAI has already secured Pentagon approval for deployment on classified networks, adding competitive pressure on Anthropic.

The standoff is significant because it is one of the first direct, documented cases of a U.S. government agency using legal coercion to push a major AI developer to weaken its own safety guardrails. If the Pentagon succeeds, it would set a precedent that national security priorities can override an AI company's self-imposed ethical limits. Legal experts warn that invoking the Defense Production Act this way would be unprecedented and could trigger lawsuits.
Sources
- T2