Claude, Anthropic’s AI Model, Gains Popularity Following Dispute with U.S. Military

Anthropic’s AI model Claude has surged in popularity after the Pentagon blacklisted it over the company’s ethical restrictions on military use. The spike in interest carried Claude to the top of Apple’s chart of free apps in the United States. Just a day earlier, the Pentagon had opted for OpenAI’s ChatGPT for its classified military networks; by Saturday, Claude had dethroned ChatGPT at the top of the chart.
In the UK, Claude also climbed the iPhone app rankings, though it did not surpass ChatGPT. It surged on the Android charts in both the US and UK as well, where ChatGPT likewise held the top spot, according to Sensor Tower data.
Claude’s meteoric rise was not without issues, however. Early Monday, Claude and other applications from the startup Anthropic experienced outages, which the company attributed to “unprecedented demand for Claude” over the previous week. More than 1,400 users reported service disruptions shortly after 6 AM ET, according to Downdetector, a platform that tracks service outages. By 11 AM ET, Anthropic announced that the issue had been resolved.
Despite its ongoing dispute with the Pentagon, Anthropic thrived. According to the company, “Every single day last week was an all-time record for Claude sign-ups.”
The dispute began when US Defense Secretary Pete Hegseth labeled Anthropic a supply-chain risk after CEO Dario Amodei refused to drop the company’s ethical restrictions on using its AI for mass surveillance and fully autonomous weapons. Amodei argued that current AI models aren’t reliable enough for such purposes and warned about the implications of mass surveillance for constitutional rights. He has also disputed the government’s classification of Anthropic as a supply-chain risk, saying that clients and Pentagon contractors can continue using the company’s models unaffected.
Tensions escalated further as the federal government accused Anthropic of overstepping. Donald Trump took to his Truth Social platform to assert: “The Leftwing nut jobs at Anthropic have made a DISASTROUS MISTAKE trying to STRONG-ARM the [Pentagon], and force them to obey their Terms of Service instead of our Constitution.” Following this, the Trump administration brought OpenAI’s ChatGPT into the fold.
Sam Altman, CEO of OpenAI, announced on Friday that his company had reached an agreement with the federal government, mere hours after Anthropic’s negotiations with the Pentagon fell apart. Altman said the military would not use ChatGPT for autonomous killing systems or mass surveillance. Many AI experts, lawyers, tech workers, and users remain skeptical, however, questioning why the US government would drop Anthropic in favor of OpenAI when OpenAI has pledged similar safeguards.
The controversy has also shaped public perception: some ChatGPT users, including pop singer Katy Perry, have openly announced their switch from ChatGPT to Claude and encouraged others to cancel their subscriptions as well.
As a result of the recent developments, Anthropic has started the year on a high note, reporting a massive increase of over 60% in free active users and a fourfold increase in daily signups. Additionally, Claude’s paid subscriber base has more than doubled.
To ease the transition for new users, Anthropic has introduced a memory feature, available on all paid plans, that lets users import their previous conversations so their first chat with Claude feels seamlessly continuous. As the company notes on its website, “With one copy-paste, Claude updates its memory and picks up right where you left off.” A detailed guide with prompts is also available to help users switch from their former AI provider.
