{"id":46860,"date":"2026-03-09T21:32:44","date_gmt":"2026-03-09T17:32:44","guid":{"rendered":"https:\/\/azat.tv\/en\/?p=46860"},"modified":"2026-03-15T21:52:04","modified_gmt":"2026-03-15T17:52:04","slug":"anthropic-sues-us-government-national-security-risk","status":"publish","type":"post","link":"https:\/\/azat.tv\/en\/anthropic-sues-us-government-national-security-risk\/","title":{"rendered":"Anthropic Sues U.S. Government Over National Security Risk Designation"},"content":{"rendered":"<div style=\"background: #f7fafc; padding: 15px;\">\n<p><strong>Quick Read<\/strong><\/p>\n<ul>\n<li>Anthropic has filed federal lawsuits against the Trump administration challenging its designation as a supply-chain risk.<\/li>\n<li>The AI firm alleges the government&#8217;s actions violate its First Amendment rights and bypass proper legal procedures.<\/li>\n<li>The dispute centers on the Pentagon&#8217;s use of Anthropic&#8217;s AI model, Claude, and the company&#8217;s restrictions on its application.<\/li>\n<\/ul>\n<\/div>\n<p><b>WASHINGTON (Azat TV) \u2013<\/b> Artificial intelligence company Anthropic has filed federal lawsuits against the Trump administration, challenging the unprecedented designation of the firm as a risk to the Defense Department\u2019s supply chain. The AI startup alleges the move violates its First Amendment rights, exceeds the scope of relevant statutes, and bypasses proper procedures for canceling government contracts.<\/p>\n<h2>Anthropic Alleges Unlawful Retaliation and Rights Violations<\/h2>\n<p>In its filings in the U.S. District Court for the Northern District of California, Anthropic\u2019s legal team stated that the company is turning to the judiciary \u201cas a last resort to vindicate its rights and halt the Executive\u2019s unlawful campaign of retaliation.\u201d The lawsuit names several federal agencies and cabinet officials, including the Defense Department and Defense Secretary Pete Hegseth, as defendants. 
Anthropic also indicated plans to file a separate suit in the U.S. Court of Appeals for the D.C. Circuit.<\/p>\n<p>The dispute stems from a disagreement over the Pentagon\u2019s use of Anthropic\u2019s AI model, Claude. Anthropic CEO Dario Amodei had previously informed Secretary Hegseth that the company would not permit Claude\u2019s use for surveilling American citizens or for autonomous weapons. In response, Hegseth reportedly threatened to label Anthropic a supply-chain risk, a designation typically reserved for entities with ties to U.S. adversaries.<\/p>\n<p>The Pentagon formally designated Anthropic a supply-chain risk on Wednesday, following a social media post from President Donald Trump directing federal agencies to cease using Claude. The model has reportedly been employed by the Pentagon in military operations, including in Iran.<\/p>\n<h2>Legal Challenges to Supply-Chain Designation<\/h2>\n<p>Anthropic\u2019s lawsuit argues that statements made by President Trump and Secretary Hegseth indicate the government\u2019s intent to suppress constitutionally protected speech. \u201cThe Constitution confers on Anthropic the right to express its views \u2014 both publicly and to the government \u2014 about the limitations of its own AI services and important issues of AI safety,\u201d the company\u2019s lawyers asserted.<\/p>\n<p>Furthermore, the company contends that the government has overstepped the legal boundaries of the supply-chain risk designation statute. Anthropic argues that this law is intended for situations where foreign adversaries might sabotage national security systems, and that the government has not established such a risk concerning Anthropic. 
The firm also claims that Trump and Hegseth exceeded their authority by attempting to terminate government contracts without adhering to established procurement procedures.<\/p>\n<h2>Government and Company Responses<\/h2>\n<p>The White House did not immediately respond to initial requests for comment, and a Pentagon spokesperson stated that the department does not comment on ongoing litigation. White House spokesperson Liz Huston later released a statement asserting that the president \u201cwill never allow a radical left, woke company\u201d to dictate military operations, adding, \u201cUnder the Trump Administration, our military will obey the United States Constitution \u2013 not any woke AI company\u2019s terms of service.\u201d<\/p>\n<p>Anthropic stated it wishes to continue negotiations with the government, with a spokesperson noting, \u201cWe will continue to pursue every path toward resolution, including dialogue with the government.\u201d However, the company also alleged in its suit that the punitive actions are \u201charming Anthropic irreparably.\u201d This statement appears to contrast with earlier remarks by CEO Dario Amodei, who reportedly told CBS News that the impact of the designation was \u201cfairly small\u201d and the company would \u201cbe fine.\u201d Claude has been integrated into the Department of Defense over the past year and was reportedly the only AI model approved for classified systems, with extensive use in military operations.<\/p>\n<p><em>The legal battle highlights the growing tension between national security concerns, the ethical development of artificial intelligence, and the government&#8217;s ability to regulate or restrict the use of powerful AI technologies by domestic companies.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>AI firm Anthropic has filed federal lawsuits against the Trump administration after being designated a supply-chain 
risk.<\/p>\n","protected":false},"author":1,"featured_media":-1,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow5Nm1DA:productID":"","footnotes":""},"categories":[5],"tags":[447,2177,1564,1253,3024,8095],"class_list":["post-46860","post","type-post","status-publish","format-standard","hentry","category-legal","tag-anthropic","tag-claude-ai","tag-national-security","tag-pentagon","tag-trump-administration","tag-us-government"],"featured_image_url":"https:\/\/azat.tv\/wp-content\/uploads\/2026\/03\/anthropic-ai-lawsuit.jpg","_embedded":{"wp:featuredmedia":[{"id":-1,"source_url":"https:\/\/azat.tv\/wp-content\/uploads\/2026\/03\/anthropic-ai-lawsuit.jpg","media_type":"image","mime_type":"image\/jpeg"}]},"_links":{"self":[{"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/posts\/46860","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/comments?post=46860"}],"version-history":[{"count":0,"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/posts\/46860\/revisions"}],"wp:attachment":[{"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/media?parent=46860"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/categories?post=46860"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/azat.tv\/en\/wp-json\/wp\/v2\/tags?post=46860"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}