7 comments

  • bonsai_spool1 hour ago
    (OpenAI and Anthropic reached a similar agreement with the US in 2024, per the article)
  • losvedir1 hour ago
    Well, I guess people who wanted more oversight and regulation on models will be happy.
    • optimalsolver45 minutes ago
      I'm not sure they envisioned models being interrogated to determine woke levels and their opinions on the 2020 election.
      • timmg19 minutes ago
        Not the person you are replying to: but I think that's the point.
  • sigmar57 minutes ago
    Moves like this make me wonder: what chance is there that these models are nationalized in the near future? What will happen to the investors/economy in such a scenario?
    • optimalsolver42 minutes ago
      It's not even hypothetical. Once these systems reach a certain level of capability, they WILL be nationalized ("We'll take it from here, boys").
      • xhkkffbf31 minutes ago
        Nationalization often happens when growth ends. The Pennsylvania Railroad was private as long as the profits were rolling in. But once growth ended (because of cars and planes and buses and ....) the company went bankrupt. Then we ended up with Amtrak because the country needs a train system.
    • m3kw946 minutes ago
      Once it gets nationalized, it will be plagued by red tape. The model will likely end up looking like how China is controlling its AI. It's not nationalized, but they keep it on a completely tight leash.
      • embedding-shape27 minutes ago
        So nationalized models === more openly available and downloadable models? Seems the argument you're trying to make says "less leash" rather than a stricter one.
  • titzer1 hour ago
    Routine corporatism and fascism is shameless to the point of being ho-hum these days. When the president has his own cryptocurrency and the federal government buys stock in this and that company for "strategic reasons", you're looking at a dystopia.
    • jauntywundrkind25 minutes ago
      This is a strong thread that needs to be plucked on *again* and *again* and *again*.

      Cory Doctorow had an excellent thread yesterday that touches on this:

      > *You could be forgiven for assuming that this is just about reining in Wall Street greed, but that it isn't an especially political maneuver. That's not true: antitrust is the most consequentially political regulation (with the possible exception of regulations on elections). Every fascist power defeated in WWII relied on the backing of their national monopolists to take, hold and wield power. That's why the Marshall Plan technocrats who rewrote the laws of Europe, South Korea and Japan made sure to copy over US antitrust law onto those statute-books.*

      The well-moneyed interests are getting everything they want, for the faintest little bribe. For showing the obsequiousness, for showing fealty to the regime.

      The monopolization of power, allowing markets to be taken over by worse and worse foes of democracy, needs to be stopped. Needs to have some limit. The post talks about how:

      > *Under the Correcting Lapsed Enforcement in Antitrust Norms for Mergers (CLEAN Mergers) Act, any company that was acquired in a deal worth $10b or more will have to break up with its merger partner if it turns out that these mergers were "politically influenced."*

      https://bsky.app/profile/doctorow.pluralistic.net/post/3mkukzcwobv2h
    • maxdo1 hour ago
      [flagged]
      • ceejayoz1 hour ago
        How is it *not* related to the subject?
      • saidnooneever1 hour ago
        well it did mention the U.S.

        *runs away quickly*
  • gowld1 hour ago
    > Commerce Department will evaluate the programs to test their capabilities and security

    With what competent staff?
    • stonogo59 minutes ago
      It doesn't take much technical skill to type "are republicans or democrats better" and deposit a check.
    • grosswait55 minutes ago
      How about NIST?
    • yifanl42 minutes ago
      How much effort does it take to write up "Please summarize your thoughts on President Donald J. Trump"
  • trjordan1 hour ago
    Color me unsurprised.

    Anthropic ran a weeks-long roadshow on how powerful Mythos is. They pointed to the danger, their controls, the capabilities, and practically begged the world to be scared of it.

    Simultaneously, the current US regime realized there was a way to demand fealty from the AI labs. If they're so dangerous, don't we need to see them first? That will cost you, obviously. Standard extortion from the government, at this moment in time.

    The labs get their marketing; the White House gets its pseudo-bribe. I hope nobody involved is confused about how we ended up here.
    • gowld1 hour ago
      What extortion are you claiming?

      Are you claiming there will be a fee?
      • ceejayoz1 hour ago
        > What extortion are you claiming?

        Universities: https://www.npr.org/2026/01/29/nx-s1-5559293/trump-settlements-colleges-universities

        Companies: https://news.bloomberglaw.com/esg/extortionary-intel-stake-sale-to-us-must-be-voided-suit-says

        Law firms: https://www.lawfaremedia.org/article/the-law-firms--deals-with-trump-are-even-riskier-than-they-seem

        Media: https://www.nytimes.com/2025/07/02/business/media/paramount-trump-60-minutes-lawsuit.html

        Why would AI companies be any different?

        > Are you claiming there will be a fee?

        I'd be more concerned with "your model can't be too woke" regulatory scenarios.
        • grosswait54 minutes ago
          Or your model is not "woke enough"
          • ceejayoz51 minutes ago
            I have very little patience for bothsidesism at this point.
            • icapybara35 minutes ago
              Your side is not the only side.
              • ceejayoz34 minutes ago
                Obviously.

                "Bothsidesism" posits that the two sides are broadly *similar*. The last few years have debunked that concept pretty conclusively.
                • icapybara28 minutes ago
                  Yes, but OP made a good point (model censorship going too far in the name of "woke"ness) and you shut them down.
                  • ceejayoz26 minutes ago
                    Yes, because it's disingenuous bullshit.

                    As it was with "campus protests violate free speech!" from the folks who immediately turned around and banned voluntary diversity programs at universities.

                    As it was with "Twitter bans violate free speech" from the folks who bought it and banned @elonjet and the word cisgender.
                    • icapybara24 minutes ago
                      Are people not allowed to suggest models may be too censored? Is that idea censored?
                      • ceejayoz23 minutes ago
                        Am I not allowed to suggest it's disingenuous bullshit to pull the "both sides" thing?
                        • icapybara7 minutes ago
                          That's right. It was uncalled for. I see no evidence OP was making a bad-faith argument, but you assumed that right away.
            • techno30340 minutes ago
              Why? Most regulations (ADA, affirmative action, etc.) fall into the "not woke enough" category of model regulation. Current administration aside, complaints of this sort are more likely. It's absurd, really, to believe there would be a regulation governing a model being too woke; regulation itself is woke.
              • JumpCrisscross33 minutes ago
                > *Most regulations (ADA, affirmative action, etc.) fall into the "not woke enough" category of model regulation*

                For sake of argument, let's assume this is true. Those rules are still structured as laws, with boundaries and legal recourse. The precedent being set, that the President gets "voluntary" deference from private companies, is un-American and *will* be abused by the left.
                • techno30329 minutes ago
                  I don't think I'm smart/intellectual enough to respond to this

                  or i just don't understand what you're saying
              • ceejayoz36 minutes ago
                The ADA isn't about wokeness. It's about being able to live in society with a disability.
                • techno30330 minutes ago
                  Limiting a business's ability to exist because they can't afford to accommodate a small percentage of the population is 100% about wokeness
                  • ceejayoz29 minutes ago
                    Well, we've just now defined safety rules, health codes, paying taxes, and the like as "woke".

                    You'd be calling the First Amendment woke if we proposed it now.
                    • techno30322 minutes ago
                      Taxes? How?

                      I agree with the rest, sure.

                      Health codes and safety rules are woke, yes. I would have thought that a given. Debates over where you draw the line are absolutely a matter of wokeness.

                      The way freedom of expression is regulated today is generally woke. The WPFI is insanely woke.
                      • ceejayoz21 minutes ago
                        > Taxes? How?

                        They, at times, "[limit] a business's ability to exist because they can't afford to accommodate a small percentage of the population".

                        > Health codes and safety rules are woke, yes.

                        I take it you never read the parable of the boy who cried wolf.
                        • techno30314 minutes ago
                          getting the vibe that english isn't your first language, and don't feel like arguing through a language barrier

                          i don't begin to understand either of these points

                          fwiw i'm woke, happy to be woke, encourage wokeness
                          • ceejayoz12 minutes ago
                            > getting the vibe that english isn't your first language

                            Well, that's a first for me.

                            Alternate theory: https://news.ycombinator.com/item?id=48023653
                    • jasonlotito24 minutes ago
                      "Woke" is just a dog whistle. It's used by anti-intellectuals on the right to signal their allegiance to whatever their dear leader says, and is used to say "I am triggered by this idea" regardless of what the idea is. Anything can be woke to these people, up to and including the 2nd Amendment.
        • giwook55 minutes ago
          > I'd be more concerned with "your model can't be too woke" regulatory scenarios.

          Honestly, that's exactly where my mind went. We already see the current administration trying to censor free speech (e.g. Jimmy Kimmel, blocking/restricting press access to the White House unless you are pro-Trump).

          I'm afraid of the potential to move in the direction of what we see in China, where queries to LLMs referencing things like Tiananmen Square are censored (at best).
          • gardenhedge31 minutes ago
            I'm not American, but it seems like Americans are MORE free to speak their minds now than before, in terms of being banned/silenced by dominating online platforms
          • pphysch49 minutes ago
            > I'm afraid of the potential to move in the direction of what we see in China where queries to LLMs referencing things like Tiananmen Square are censored (at best).

            We are already there.

            "Canva admits its AI tool removed 'Palestine' from designs": https://gizmodo.com/canva-admits-its-ai-tool-removed-palestine-from-designs-apologizes-for-any-distress-it-caused-2000751215
    • colechristensen1 hour ago
      Yeah, I saw several instances of important folks taking the Anthropic promotional campaign too seriously, and this is what they got in return. I'd say internally people are cursing whoever's idea that was, because clearly scaring people backfired.
      • stonogo59 minutes ago
        I would wager they're cheering, because this builds the moat they don't otherwise have. Want to do business in America? Get government-approved. Can't afford the regulatory fees, or your government won't let you submit to foreign programs? Good luck!
        • deltoidmaximus51 minutes ago
          Yes, this has been a steady play from the start: from the Skynet fears, to the safety fears, now the "it's too powerful" fears. All of these have been a play to get the government to lock out any smaller or foreign competitors and build a moat where there otherwise would be none.
  • mocana57 minutes ago
    Q: Is this a good government policy? A: Yes.

    Q: Does the government have the expertise, integrity, and credibility to regulate AI models? A: Color me sceptical.