2 comments

  • paxys · 1 hour ago
    I bet shareholders of SpaceX are *thrilled* to be exposed to this for no reason
    • bigyabai · 1 hour ago
      SpaceX owners do not care. If they were risk-averse, they would have dumped SpaceX like it was toxic waste.

      In a broader sense, this "I bet Oracle shareholders *hate* their bad PR" attitude is really zero-sum. It's pervasive on HN, and we rarely ever see bad PR snowball beyond niche discussions. I want $BIGCORP to collapse as much as the next guy, but the outrage-derived comments don't seem to reflect the market's response.
    • manoDev · 1 hour ago
      You would lose this bet.
  • charcircuit · 51 minutes ago
    This is the equivalent of suing Photoshop when people use it to make something you don't like.
    • happytoexplain · 27 minutes ago
      I don't expect this kind of dramatic internet-forum-style false equivalence on HN.

      Argue your opinion on its own terms, rather than simply pointing to something else and saying "they're the same".
    • catgirlinspace · 34 minutes ago
      Does Photoshop have a "make porn of this person" button? Does it have a "make me CSAM" button?
      • charcircuit · 25 minutes ago
        Nor did Grok. You had to prompt what you wanted to have happen.
        • happytoexplain · 15 minutes ago
          Typing a description of an image is comparable to finding a button in a menu in terms of ease and enablement. The difference is actually that Grok explicitly decides not to add even simple (if imperfect) guards, which would be no real burden on their part, and that is morally and spiritually hideous.

          I hope they experience consequences.
        • jdiff · 12 minutes ago
          And it had blatantly insufficient safety guarding at that. I saw photos of 4th graders, labeled accordingly, that Grok happily modified. Given that the model has access to the text content of the posts it's replying to, that's wildly negligent. This is not a simple brush tool we're talking about, and refusing requests to strip clothes off people is one of the most basic measures that could be taken here.
      • nofriend · 31 minutes ago
        It's called the "brush" tool.
    • ktoo_ · 21 minutes ago
      Do you really think that people who make tools with risks associated with their use or misuse have no responsibility to mitigate those risks or prevent that misuse?
    • anildash · 32 minutes ago
      Adobe doesn't provide tools explicitly designed to enable the creation of child pornography — in fact, their tools try to prevent its creation — and they don't profit from the sale of it. But, of course, Musk fanboys can be reliably counted upon to support profiteering from child sexual abuse in any form.

      "Something you don't like" as a description for *the deliberate sexualization of children for profit*, as if it's not an objective moral harm, is telling on yourself here. Just because the loudest leaders in Silicon Valley have been trying to convince every one of their sycophants that sexually abusing kids is no big deal doesn't mean the rest of us who are normal have bought into it.
      • charcircuit · 24 minutes ago
        Reducing safety filters doesn't mean the tool is explicitly designed for child pornography. That's like thinking free speech is designed for child pornography.
        • happytoexplain · 20 minutes ago
          You seem to be leaning heavily on analogy, which is inherently flawed. The entire point of analogy is that you are comparing two *different* things without actually comparing them - just declaring them equal. It's a weak rhetorical tool for petty arguments.