Report: US Military Deployed Anthropic’s AI Model Claude in Venezuela Operation

Claude, an AI model developed by Anthropic, was reportedly used by the US military during a covert operation aimed at the abduction of Nicolás Maduro from Venezuela. The operation, first reported by the Wall Street Journal, underscores the US Department of Defense's growing reliance on artificial intelligence in military operations.
The operation involved extensive airstrikes across the capital, Caracas, killing 83 people, according to the Venezuelan defense ministry. Notably, Anthropic's terms of use explicitly prohibit using Claude for violence, weapons development, or surveillance.
According to sources cited in the report, Anthropic is the first AI developer known to have supported a classified US military operation. How Claude was used in the mission, however, remains unclear. The model's capabilities range from processing complex documents to assisting in the autonomous operation of drones, leaving room for speculation about its actual role.
An Anthropic spokesperson declined to comment on whether Claude was used in the mission but said that any use of the company's technology must comply with its usage policies. The US Department of Defense has not commented on the allegations.
According to the WSJ's anonymous sources, Claude was deployed through Anthropic's partnership with Palantir Technologies, a major contractor for the US military and federal law enforcement agencies. Palantir also declined to comment.
Military use of artificial intelligence is not confined to the US; many countries are integrating AI into their defense strategies. The Israeli military, for instance, has deployed autonomous drones in Gaza and relies heavily on AI to build its targeting database there. The US military has likewise used AI-driven targeting systems for airstrikes in Iraq and Syria in recent years, part of a broader shift toward technologically advanced warfare.
Critics have raised significant concerns about integrating AI into weapons systems, particularly the potential for erroneous targeting decisions by machines entrusted with life-and-death choices. This raises the question of accountability when human oversight is reduced or absent.
In light of these concerns, many AI developers are navigating a complex relationship with the defense sector. Anthropic's CEO, Dario Amodei, has called for regulatory measures to prevent harmful consequences from military applications of AI, and has repeatedly expressed apprehension about AI's role in autonomous lethal operations and surveillance in US military engagements.
This cautious approach, however, appears to have caused friction with US Department of Defense officials. Secretary of War Pete Hegseth said in January that the department would not adopt AI models perceived as hindering military effectiveness or operational capability.
Also in January, the Pentagon disclosed a collaboration with xAI, the company founded by Elon Musk. The department additionally uses a tailored version of Google's Gemini and systems developed by OpenAI to support its research and operational capabilities. These arrangements mark a growing intersection of advanced AI and military applications, raising questions about ethical use, safety, and oversight.
