There’s a moment in every startup founder’s journey when you have to choose between money and values. Anthropic just had that moment in front of the entire world. They said no to a $200 million Pentagon contract because the Department of Defense wanted unrestricted control over their AI models, potentially including use in autonomous weapons and mass domestic surveillance.
Then OpenAI said yes. And now we’re watching ChatGPT uninstalls spike by 295%.
This isn’t just a corporate drama story. This is what happens when the defense industry meets the AI gold rush, and the stakes couldn’t be higher.
The Contract That Changed Everything
Let’s back up. The Pentagon was looking for a partner to develop AI capabilities that could strengthen U.S. military operations. That’s not inherently sinister. Every major power is working on military AI. But here’s where it gets messy: the DoD wanted what amounts to a blank check. Full control over how the AI could be used. No guardrails. No restrictions on autonomous weapons or surveillance applications.
Anthropic looked at that contract and said no thanks. They got designated a “supply-chain risk” for their trouble. That’s Pentagon-speak for “you’re unreliable,” which is not exactly a badge of honor in federal contracting.
OpenAI, meanwhile, decided the $200 million was worth it. They accepted the deal. They probably ran the math and figured the upside outweighed the PR risk.
Except the PR risk turned out to be massive.
When Users Vote With Their Devices
Within days of the news breaking, ChatGPT uninstalls surged. We’re not talking about a small bump. A 295% spike is the kind of number that gets attention in San Francisco. People started voting with their feet, or, more accurately, with their delete buttons.
This reveals something important about how technology companies are perceived right now. Users don’t just care about functionality anymore. They care about values. They care about how companies operate behind the scenes.
OpenAI’s bet was that the military contract represented strategic positioning and revenue. What they didn’t fully account for was that their user base might have strong opinions about military AI applications. A lot of people, it turns out, do.
The Founder’s Dilemma Becomes Real
For any startup founder watching this play out, there’s a brutal lesson here. Federal contracts are lucrative, the kind of money that could accelerate your entire business trajectory. But they come with costs that aren’t always obvious upfront.
When Dario Amodei, Anthropic’s CEO, rejected the deal, he wasn’t just turning down money. He was making a statement about what kind of company Anthropic wants to be. Whether that statement sticks or whether it’s just good marketing is something history will judge. But in the moment, his team had to believe in it enough to walk away from nine figures.
OpenAI chose differently. That choice is now reshaping how millions of people think about the company.
The Question Nobody’s Really Answering
Here’s what keeps me up at night about this whole situation: how much access should any military have to advanced AI? Not just the U.S. military. Any military.
The Pentagon’s desire for unrestricted control isn’t crazy from their perspective. They’re thinking about national security. But unrestricted access to AI that could power autonomous weapons or enable mass surveillance? That’s the kind of thing that seems fine in a policy meeting until it’s actually deployed somewhere and people are dealing with the real-world consequences.
Anthropic essentially argued that there needed to be guardrails. OpenAI apparently decided those guardrails weren’t necessary or weren’t their responsibility to enforce.
This is what happens when you give military budgets to companies that usually care about quarterly earnings and user growth. The incentives don’t always line up with the public interest.
The real question isn’t whether military AI is coming. It is. The question is whether the people building it should have any say in how it’s used, or whether they should just take the check and move on.