• 'Orwellian Notion': Federal workers can access Claude AI again after judge ditches Trump's Anthropic ban

    From TechnologyDaily@1337:1/100 to All on Thu Apr 9 22:45:25 2026
    'Orwellian Notion': Federal workers can access Claude AI again after judge ditches Trump's Anthropic ban

    Date:
    Thu, 09 Apr 2026 21:35:00 +0000

    Description:
    Federal workers regain Claude access after a judge blocked the Trump administration's unprecedented decision to label Anthropic a supply chain threat.

    FULL STORY ======================================================================

    Federal workers regain Claude access after court blocks controversial designation
    Judge calls government move unconstitutional retaliation against AI company
    Anthropic rejects military use, triggering federal access shutdown and backlash

    Federal employees can now log back into Anthropic's Claude for Government
    service after a California federal judge blocked the Trump administration
    from designating the AI company as a supply chain risk.

    US District Judge Rita Lin issued a preliminary injunction, granting Anthropic's motion to prevent Defense Secretary Pete Hegseth and the administration from declaring the company a threat. Federal workers at agencies such as the Department of Health and Human Services received emails informing them that access to Claude had been restored, along with any previous conversation history and data.

    Cause of the dispute

    The conflict erupted in early 2026 after Anthropic refused to allow its Claude AI model to be used for developing lethal autonomous weapons or for mass surveillance of the US population.

    The company stepped away from partnership discussions with the US military over these concerns.

    In response, the Trump administration designated Anthropic as a "supply chain risk," a move that Anthropic CEO Dario Amodei described as legally unsound.

    This decision by the Trump administration did not stop millions of users from signing up for Claude daily.

    The US government has never applied this designation to a domestic company,
    as it is typically directed at foreign intelligence agencies, terrorists, and other hostile actors.

    Judge Lin used striking language in her 43-page order granting the
    preliminary injunction.

    "Nothing in the governing statute supports the Orwellian notion that an American company may be branded a potential adversary and saboteur of the US for expressing disagreement with the government," Lin wrote.

    She labeled the administration's actions as "classic First Amendment retaliation."

    Lin noted that the designation has never been applied to a domestic company and is directed principally at foreign intelligence agencies, terrorists, and other hostile actors.

    The Department of Defense, which the Trump administration has dubbed the Department of War, has appealed Lin's order to the US Court of Appeals for the Ninth Circuit.

    The administration did not ask the appellate court to stay the district court injunction, allowing it to go into effect.

    Anthropic is also asking the US Court of Appeals for the DC Circuit to issue an emergency stay of the Defense Department's supply chain designation.

    The company argues that the administration violated the First and Fifth Amendments of the Constitution.

    This preliminary injunction allows federal workers to access Claude again,
    but the legal fight is far from over.

    Via MLex




    ======================================================================
    Link to news story: https://www.techradar.com/pro/orwellian-notion-federal-workers-can-access-claude-ai-after-judge-ditches-trumps-anthropic-ban


    --- Mystic BBS v1.12 A49 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)