I hope that Gary reads this comment because an important correction is needed. The Block Nuclear Launch by Autonomous AI Act was passed as part (last sentence) of Section 1638 of the FY2025 NDAA:

SEC. 1638. Sense of Congress with respect to use of artificial intelligence to support strategic deterrence.

(a) Sense of Congress.--It is the sense of Congress that--

(1) the considered use of artificial intelligence and machine learning tools presents opportunities to strengthen the security of critical strategic communications and early warning networks, improve the efficiency of planning processes to reduce the risk of collateral damage, and enhance U.S. capabilities for modeling weapons functionality in support of stockpile stewardship; and

(2) even with such applications, particular care must be taken to ensure that the incorporation of artificial intelligence and machine learning tools does not increase the risk that our Nation's most critical strategic assets can be compromised.

(b) Statement of policy.--It is the policy of the United States that the use of artificial intelligence efforts should not compromise the integrity of nuclear safeguards, whether through the functionality of weapons systems, the validation of communications from command authorities, or the principle of requiring positive human actions in execution of decisions by the President with respect to the employment of nuclear weapons.
The Dune reference earlier in the thread is spot on. We often think of the Butlerian Jihad as a fight against sentient robots, but Herbert's core warning was actually about humans delegating their thinking and agency to machines. We are seeing that play out now not through some sci-fi uprising, but through the quiet erosion of accountability. When we let an algorithm decide who gets a loan or who is targeted in a conflict, we are not just using a tool. We are essentially offloading our moral responsibility to a black box that cannot be held accountable.
> We are essentially offloading our moral responsibility to a black box that cannot be held accountable.

We already did it with companies, buddy!
Except that companies are *not* a black box; at every step there is a human making a comprehensible decision (probably with a paper trail). Yes, they dilute accountability to nearly nothing in some cases, but LLMs are sufficiently opaque to claim (ingenuously) that “nobody is responsible”.
We don't need to look to science fiction for lessons about where this will lead. Fact has already caught up.

The IDF used tools like Lavender and "Where's Daddy?" to analyze communications, identify members of Hamas, and learn when they were home so they could be killed with bomb strikes.

This has long been possible for humans to do, but it's a laborious process. In the past, only people high up in chains of command received such bespoke treatment. AI tools permitted the IDF to grant the same treatment to raw recruits who had been given the sum total of a pep talk and a pistol.

The result was widespread destruction and indiscriminate killing of civilians. The IDF didn't spend much time scrutinizing AI recommendations and was willing to act on false positives. Every bomb strike, by design ("Where's Daddy?"'s purpose was to determine when targets were in their family homes), took out civilians. Just taking a pizza order from a Hamas member years before the war might have been enough to get entire families *and their neighbours* killed.

If humans hate another group of humans enough and an AI says "Kill", they'll kill. Without thought or remorse. We don't merely need to be worried about murderous robots on battlefields; we also need to worry about humans implementing the recommendations of AI without thinking for themselves.
“A computer can never be held accountable, therefore a computer must never make a management decision.”
To be fair, AI will likely be more moral than the Hegseth, Vance and Trump combo. At worst, it will be as bad as them.
The US, and probably every country on earth, has a law where it can claim something as necessary for defense and take it (with or without compensation; in the US it's with).

So I don't really think this spat proves anything at all.

Honestly, trying to throw into the article that AI may be responsible for nuclear weapons also puts this whole article into junk-opinion tier. Just trying to rile up support.

Though it is funny that another commenter already found they passed a law last year saying AI can't launch nuclear weapons.
The fact that there's a law that says they can do it doesn't make it just or moral.
They have Grok, why do they need to bully Anthropic?
Literally the worst administration possible during the AI singularity :).
The title could have been a line out of Dune.

It's interesting that many of us, myself included, once thought that the Butlerian Jihad was silly until now. Frank Herbert wrote something that is particularly prescient.

(Usually writers are just a decade ahead of their time. Whatever podcasters are talking about today has usually already been discussed in literature a decade ago. Prediction markets come to mind. Socially, over- vs. under-population, as discussed in popular books like The Rational Optimist or The Accidental Superpower.)
Not exactly relevant, but I couldn't read this without verifying my age. First time that has happened.

I'm really surprised Substack thinks Australia's social media laws apply to them.

(And no, I'm not willing to do that just to read an article.)
metalman's comment is great as well, but if you are interested, here is the archive.org link to the article.

<a href="https://web.archive.org/web/20260226214404/https://garymarcus.substack.com/p/america-and-probably-the-world-stands" rel="nofollow">https://web.archive.org/web/20260226214404/https://garymarcu...</a>

I think this should just work. Hope this helps.
I did not have to verify anything, so here is the text

Marcus on AI

America, and probably the world, stands on a precipice.
Call your Senators and Representatives, right now.
Gary Marcus
Feb 26, 2026
As I wrote here yesterday, Anthropic’s showdown with the US Department of War may literally be life or death for all of us. If Pete Hegseth forces Dario Amodei to fold, not one but two monstrous precedents will be set.

The first is obvious, and terrible in itself. The second is subtle but no less important.

The first is that what Secretary Hegseth is demanding, backed by heavy threats, is that the US military have full, unrestricted access to Anthropic’s AI software, for applications such as military surveillance and autonomous weapons without humans in the loop. This could well extend to nuclear weapons.1

Nothing that I have read convinces me that Secretary Hegseth has a nuanced understanding of the strengths and limits of current AI, or that he will show restraint in how he applies it. Rather, he is trying to define his career in part around deploying AI as broadly and as quickly as possible.

The second is that Hegseth’s maneuver is an audacious power grab that aims to circumvent Congress. By setting a deadline of 5:01 PM Eastern tomorrow, Hegseth aims to cut everybody else – even Congress – out of the loop.

A reader of this newsletter, a tech writer who describes himself as a political independent, just wrote to me, rightfully panicked:

Today the Pentagon will force Anthropic to change their corporate goal of responsible AI. This is not something to be decided in the marketplace by a bully with deep pockets; it must be decided in Congress. Senators and Congressmen must take a position and deliberate in public about whether it is OK to use AI for surveillance of Americans and to launch lethal strikes controlled by AI with no “human in the loop”. Please say something today, before Amodei has to surrender.

He is right.

Please call or write your Senators and Representatives right now.

§

AI policy, especially of this magnitude, is something that the American people should have a say in. Congress should deliberate.
Mass surveillance and AI-fueled weapons, possibly nuclear, without humans in the loop are categorically not things that one individual, even one in the Cabinet, should be allowed to decide at gunpoint.

But that is exactly where we are headed.

1
Hegseth’s demand would in principle extend to applying Anthropic’s software to nuclear weapons without humans in the loop. In that connection, people should probably be aware of the fact that S. 1394, the Block Nuclear Launch by Autonomous Artificial Intelligence Act of 2023, failed to pass. Which means we might not have a Stanislav Petrov next time around.
Richard Self
7h

Given the unreliability of GenAI in everything that it does, the use in unsupervised warfare will be catastrophic.
Roman's Attic
7h

From what I’ve heard, these calls are especially valuable if you emphasize that this is an issue that will determine how you vote in the future.
58 more comments...
© 2026 Gary Marcus · Privacy ∙ Terms ∙ Collection notice
This is a reminder that we do not need super intelligence to risk catastrophic outcomes. All that we’ve needed is someone willing to delegate the decision to kill to a machine.
GM is worth less than old ladies writing the Enquirer about their visits with Elvis and Bigfoot on alien spaceships. At least the Enquirer can be entertaining.<p>Don't get suckered by his takes on things; he's nothing but a credentialed blowhard pimping his pedigree for clout and cash. He's been furiously slinging arrows from the peanut gallery for years and it's gross.
This is a prescient warning about yet another abuse of power by the Trump administration that we would all do well to heed, so I'm sure it will be flagged to death within an hour or two. Many people here like to consider themselves "above politics" and "apolitical", thinking that if they just ignore all noise they can keep living their lives as if this is all normal. But it is not normal. I think the priority of the "Secretary of War" Pete Hegseth here is likely political surveillance and suppression at home, not autonomous nuclear weapons, but he is a known alcoholic who has repeatedly shown terrible judgement, so I can't really rule anything out.

This administration is criminal and has repeatedly shown disdain for the fundamental rights enshrined in the Constitution. Our democratic system is only as strong as the will of the people to protect it.

The call to action at the end of the article is good. I think it's worth calling your senators and house reps to speak up about this. Politicians in general are slimy and spineless, those seem to be traits required to succeed in politics, but as a consequence they do tend to notice when people really speak up and care about issues. As a survival instinct if nothing else.
1. An oracle delivers a prophecy.
2. The hero takes action to avoid it.
3. Those very actions are what make the prophecy come true.

Peter Thiel has all these paranoid delusions about the future. (Dude, I know you are listening; if you are scraping my Hacker News posts we seriously need to have a conversation about this!) Yet his actions trying to prevent this future he is terrified of are the very things that will bring it to fruition. Hamlet is correct -- we defy augury.
Now you see the real reason behind changing the name from Department of Defense to Department of War. It's what Republicans really want.
I don’t like most of the content from this guy, but he’s right about this one. If you can abuse your political position to turn private corporations into your slaves to abuse others, our political system is broken. Especially when you outsource actions that the government would itself be restricted from.
> *our political system is broken*

I mean, I agree with you, but if you've only realized that now, you've missed out on some really weird stuff going on for the past couple of years.
Serious question: do you think that the government shouldn't have authority over corporations in its jurisdiction? Not that this is necessarily the right way to do it, but this is rhetoric that would make a devout libertarian blush. Ultimately the government is threatening to nationalize what is now a key wartime industry in preparation for war, and that does not seem to me like an abuse of power.
Welcome to the return of history. This is hardly the first time or industry where the US government has forced compliance that wasn't necessarily in the public interest.

And the corporations won't fight this. They're in it for the money and they're willing to bring actual gold bars to the White House to ensure it keeps rolling in. They know what they're doing is corrosive and debasing; the more conscientious of them probably want to vomit on the inside. But they mostly suck it up and do it anyway, for their investors will discipline them if they don't.

Either people run candidates and vote for the ones that campaign on stopping this, or it happens.
> for their investors will discipline them

Worse: they'll be sacked and replaced with someone who will.

Like Trump's FCC chair was saying he'll revoke the license of stations that make Republicans look bad. Those stations will then be replaced with more copies of Newsmax. CBS either toes the line or it gets shut down and replaced by a station that will.
> the return of history

The idea that somehow the current actions are 'real' history and what people were doing before is fake just feeds the claim of inevitability, a basic psyops maneuver: you can't win; our victory is inevitable.

People have *made* history through centuries of Enlightenment. The whole idea is that we can control our fates as individuals through reason and compassion (humanism), and we have done it. We have transformed the world. The only problem is people giving up, despite the incredible success of this idea over centuries, and accepting that they can't control their fate. Certainly MAGA-ish conservatives believe they can make history.
It’s not the type of behavior that you find in nations operating based on the rule of law. It’s emblematic of where things are heading for the United States. A rapid descent into fascism (call it what it is).
This seems like a classic case of "the boy who cried wolf". Almost everything Gary Marcus says is trivially dismissable, and often soon proven wrong.

If someone like that wants to be in a position to warn society of *actual* harms, they'd have to behave differently.
The real AI has been in action for centuries now.<p>It's capitalism.<p>Capitalism is a functioning AI that controls the world, that has had humans serving it. AI is the tool that capitalism will use to remove humans from the equation.<p>The end state of capitalism is slave labor. The end state of technocapitalism removes humans from the labor equation.
Why do you bring up communism? It's not true that a society either has to organize as capitalist or communist.
Are people also slaves under capitalism then? "Their work is owned by their employer, not themselves".
I hate communism as much as the next guy, but you couldn't describe it more incorrectly even if you tried I think.
That is literally the definition: a society based on common ownership of the means of production.

They try to hide that away by showering you with "everything is free" and "communal this and that", but when it comes down to it, the exact definition is that you don't own your labor.
So like my apartment complex would have a power drill in the basement that anyone can borrow to drill holes in their walls and that means I'm a slave?
No.

Communism distinguishes between private and personal property. Your power drill is personal property. Private property is used to generate profit; in other words, it is "the means of production." Private property is communal, personal property is not.

As an aside, just *once* I wish I could see a criticism of communism on this forum made in good faith by someone who actually bothered to read a word of theory. Actual goddamn communists and socialists are more than capable of criticizing themselves; it's kind of what they do (and why they never get anywhere). But capitalists can never come with anything deeper than "communism is slavery lol."
Do you know what "the means of production" actually is?
Common ownership of the means of production literally means you own the place you work for, communally.
Which part of "common ownership" means that you somehow don't own it? The whole idea is that you do own it - in part, but enough to get a say in what to do with it.

"It" being the means of production - factories, data centers, farms. Not your personal belongings.
So you're just upset that you can't own it alone, then? Jeff, is that you? Be honest :)

Seriously though: I don't know what the latest numbers are, but I think we are around 15 people owning ~50% of the global wealth, iirc. So I don't know about you, but unless you are a billionaire, common ownership of the means of production sounds pretty great to most people.
>>> In communism the beginning state is slave labor. (your work is owned by the state, not yourself)

> That is Literally the definition: society based on common ownership of the means of production

So, you're saying in communism, the state is owned by you, and your work is owned by the state. And that somehow is more slave-like than when your work is owned by an employer-organization that's owned by some boss?

> They try to hide that away by flowering you with "everything is free" and "communal this and that" but when it comes down to it, the very exact definition is you don't own your labor.

It's not any worse than a capitalist company town.

And honestly: a nation that banned private business and mandated that all workplaces be worker-owned cooperatives sounds both pretty communist to me *and* in no way worse for labor than our current system.