25 comments

  • 6thbit23 minutes ago
    Thought it was clickbait/circumstantial, but they are quoting an actual spokesperson saying they are doing it on purpose!!

    > "We're actively defending ourselves against these lawsuits and are removing ads that attempt to recruit plaintiffs for them," a Meta spokesperson tells Axios.
  • bilekas6 hours ago
    > "We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."

    Wow... that is quite a statement. Am I right in saying that for the class action lawsuit, in which Facebook has been 'found negligent', the victims need to act collectively in order to claim? I.e., they need to be reached somehow to be informed of the possibility?

    Seems the most obvious place to advertise would be Meta.

    I understand Meta can basically do whatever they like with their ToS, but the statement from the Meta spokesperson seems like an extremely bad idea.
    • pixl975 hours ago
      Tobacco lawyers: "Putting that cigarettes are harmful on the box would be devastating to our profits!"
      • akersten5 hours ago
        It would be a better analogy if tobacco companies sold ad space on their packs and chose not to do business with a private for-profit anti-smoking solicitation group.
        • adi_kurian4 hours ago
          No it would not. Meta is an advertising company that sells ad space. More specifically, Meta is the dominant firm in the social advertising market, which is an oligopoly.

          It is "the business", not an imagined side revenue stream.
        • gowld2 hours ago
          And that would be a blatant admission of guilt.
      • reactordev5 hours ago
        Literally every ceo
        • deaux5 hours ago
          You missed an adjective: literally every *megacorp* CEO. Plenty of small companies have transparent and honest CEOs.

          Also why we need far fewer megacorps than there are now.
      • roysting4 hours ago
        I understand the impulse, but there are significant differences: the requirement to add warning labels to cigarettes was mostly a judicial or legislative action. There is also the rather perverse fact that the kind of litigation people are championing is often funded by profit and greed, just like the harm being sued over.

        The article even mentions that at least one of the suits is private equity funded. That generally means the partners and/or investors of the private equity firm and the suing attorneys (often all one and the same, in what is just a financial and legal shell game) net tens of millions of dollars, while the supposed victims end up with nothing but pennies on the dollar of harm and injury.

        I get the impulse to "cheer" for the lawsuits, but if you thought Meta, etc. were bad, you really don't want to look into the vile pestilence that is these law firms, which are basically organized crime too, by the core definition of crime as an offense and harm upon society.

        I don't really know a solution for this problem because it is so rooted in the core foundation of this rotten system we still call America for some reason, but for the time being, I guess the only moderately effective remedy for harm and injury is to combat it with more harm and injury.
      • _doctor_love1 hour ago
        "But Black Dynamite! I sell drugs to the community!"
      • bko5 hours ago
        Imagine the NYT banning an ad in its newspaper telling people how to cancel and sue the NYT.

        Wild stuff.
    • giancarlostoro6 hours ago
      Would be really entertaining if all the lawyers affected banded together and made a class action lawsuit full of lawyers as the plaintiffs.
    • stronglikedan5 hours ago
      > the statement from the Meta spokesperson seems like an extremely bad idea.

      All corporate CYA ideas sound that way, but they ultimately end up benefiting the company. Meta is right to do this. That's not to say it's the right thing to do, but it's right for the company.
    • HumblyTossed6 hours ago
      The judge should have ordered Meta to place a banner on FB so that everyone can see it and join if they&#x27;re a victim.
      • shimman5 hours ago
        Wow, this is a really good idea. I wonder if the various state trials happening as well should use this for remediation too.

        It's not a hard thing to implement on their end and should be mandated by a judge, as you said.

        Filing this away for later use.
        • miki1232114 hours ago
          Europe (Poland) loves this kind of stuff.

          It often comes up in (anti) free-speech trials, where the government compels the perpetrator to issue a public apology to the victim. Forcing them to buy an ad in a newspaper, for example, is not unheard of.

          As far as I understand, Americans consider this to be "compelled speech" and hence prohibited, but I might be wrong on this.
          • dcrazy4 hours ago
            The same thing happens here. Courts are allowed to compel speech as a method of remedy, but my recollection is that this is sometimes successfully challenged.

            An interesting variant I've seen on anti-smoking banners at convenience stores is "A federal court has ordered Philip Morris USA to say: …"
      • smsm425 hours ago
        Not likely to survive 1st Amendment challenge - it is possible to compel somebody to certain speech as a result of losing a case, but doing this as a prerequisite when the case has just started is not likely to fly. Otherwise I could force Facebook (or any other platform) to publish anything just by suing them - and anybody could sue anybody else on virtually any grounds.
        • gowld2 hours ago
          https://about.fb.com/news/2025/01/meta-more-speech-fewer-mistakes/

          "We will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations."
    • 3form6 hours ago
      "A lawyer benefiting from prostitution cases equals a pimp" kind of argument.
    • bwestergard6 hours ago
      They wouldn&#x27;t profit if the cases didn&#x27;t have merit.
    • draw_down6 hours ago
      [dead]
    • mchusma6 hours ago
      [flagged]
      • malfist5 hours ago
        How do you know that? How could you know that?

        These are some of the few people holding Meta accountable for its evil acts, and because of that you call them the "scummiest people in the US".

        That's nonsense.
        • which5 hours ago
          If you read the settlements that come out of these lawsuits, you will pretty much always find an 8 to low 9 figure settlement (that the lawyers get a third of), maybe some superficial policy changes, and $12 checks to the supposed victims who only became victims when they randomly got an email telling them they should join the lawsuit. The only people who benefit are the lawyers.
          • malfist5 hours ago
            $12 is $12 people wouldn't have without them. You can always opt out of a class action settlement and sue on your own if you're not happy with the terms.

            But at the end of the day, the lawyers did real work, took on real risk, and achieved something. They held a big tech company accountable, and that is a meaningful difference from the status quo. I don't care that they made money doing that; they should.
            • amethyst4 hours ago
              Is it really holding them "accountable" when the settlements are for laughably small amounts, like <1% of a single day's profit?
              • malfist3 hours ago
                No one else is doing anything. So it's something.
          • reaperducer5 hours ago
            *The only people who benefit are the lawyers.*

            My special savings account, where I deposit the settlement checks from the various tech companies that have violated my privacy or other rights, disagrees.

            Sometimes it's 43¢. Sometimes it's $400.

            In the last three years, I've put… checking… $5,351.83 in that account because tech companies think laws and morals don't apply to them.

            Saying that these lawsuits only benefit lawyers is both false and yet another lazy tech bubble cliche.

            Yes, the lawyers get way more than I do. They also did 99% of the work, so I don't hold it against them.

            Just read the newspaper. Every time you see an article about one of these suits, check it out to see if it applies to you.
            • nslsm5 hours ago
              Hey, at least you get to pocket all of that. Here in Europe the government keeps the money and then distributes it to the scum of the Earth. I'd rather give the money to lawyers; at least they did _something_.
              • duskdozer5 hours ago
                > distributes it to the scum of the Earth

                Who?
        • raw_anon_11115 hours ago
          And the lawyers will make millions and the people will make nothing. Facebook won’t make any significant revenue affecting changes.
          • bdangubic5 hours ago
              That is a problem of the system that needs to be solved. Meta is among the purest forms of evil, inflicting irreversible effects on our society (and youth in particular), and the fact that you are quite right isn't really an issue with the lawyers but with a system that allows the punishment to not fit the crime.
        • dec0dedab0de5 hours ago
            There are many lawyers that gather up victims for class action payouts and take most of the money for themselves.

            They don't even bother trying to get more when they can, because they're just bottom feeding.
      • reaperducer5 hours ago
        *You may think Meta is bad. But plaintiff counsel like this are generally the scummiest people in the US. (Maybe not universal, but 90% are morally repugnant).*

        As they say, "95% of lawyers give the remaining 5% a bad name."

        At the same time, 99% of social networks give the remaining 1% a bad name.
    • boringg6 hours ago
      I mean, those class action lawsuits enrich trial lawyers and maybe force companies to behave better (though I bet empirical evidence would show that it's more a cost of doing business).

      The $20 people get is nothing but a guise that the trial lawyers are helping people.
      • bilekas6 hours ago
        I'm not sure the lower payout means that class actions shouldn't still be taken.

        The point is to spare companies from having to deal with individual claims for each person. I see that the ranges can be substantial, though, several thousands in some cases, but it seems to depend on the criteria.

        > Nearly nine months later, Mark received a notification that his claim had been approved. Two weeks after that, $186 was deposited into his bank account. While the amount wasn’t substantial, it covered a grocery run and a phone bill—and more importantly, it reminded him that companies can be held accountable, even in small ways. [0]

        [0] https://peopleforlaw.com/blog/how-much-do-people-typically-get-in-a-class-action-lawsuit/

        If fines don't dissuade companies from bad practices, class actions with theoretically no upper limit might be a better option to enforce proper behaviour.
        • boringg4 hours ago
          I can agree with that -- however, the amount of money the trial lawyers make is wildly disproportionate by comparison. I think that $186 figure is an example on the high side of payouts to individuals.
  • Xeoncross5 hours ago
    As an aside, class-action lawsuits seem less than ideal for the public. The awards benefit the lawyers and perhaps a small handful of others, while the actual plaintiffs only get $0.05. In addition, successful class-action suits prevent further litigation from being allowed for the same issue.

    Individuals bringing their own lawsuits seem like they would effect better change, as 1) the award money would be better distributed instead of concentrated, and 2) the amounts levied against the companies would be higher and more of a concern than the class-action slap-on-the-wrist they currently get.
    • rurp4 hours ago
      How does this address the most common case where many people were harmed a modest amount? Causing $100 of harm to a million people is a huge amount of damage that should be punished, but nobody is going to launch a full independent lawsuit for $100.
    • bityard4 hours ago
      > successful class-action suits prevent further litigation from being allowed for the same issue.

      Only if you don't opt out. Individuals who opt out of being part of the class can still file their own suits. (Although it's not clear how successful you will be if your situation/harm is not substantially different from the other members of the class.)
    • gradientsrneat1 hour ago
      I've opted in/not opted out to several class actions, and without saying the exact number, I'll say it was a lot more than that. Tech companies wouldn't be putting binding arbitration clauses/class action waivers/etc. in their TOS if they weren't scared of being held accountable.
    • rokkamokka5 hours ago
      A hundred million identical court cases might not be too good for the legal system
      • ed3124 hours ago
        1. Why should harming a million people identically reduce their right to a fair legal evaluation and possibly compensation for damages? (Maybe it makes sense for large corporations to carry insurance to pay for the potentially massive legal costs they could impose on governments?)

        2. Shouldn't we be able to quickly resolve these cases, assuming there are no substantially different pieces of evidence?
        • CrazyStat4 hours ago
          > 1. Why should harming a million people identically reduce their right to a fair legal evaluation and possibly compensation for damages?

          It doesn't. You can almost[1] always opt out of class action lawsuits to pursue your own suit. This would be expensive and unwise for most people, but you have the right.

          [1] There are rare exceptions.
      • wongarsu4 hours ago
        Isn't that trivially fixed by raising court costs (which should go to whoever loses the suit) to cover the cost of judges, jury, admin expenses, etc.? I don't get the impression that this would make the justice system that much more prohibitively expensive than it already is, and it would allow the legal system to scale to the case load.
        • ethanrutherford0 minutes ago
          No, because the limitation is not money. More money does not magically make the humans in the profession able to handle higher case-loads, nor magically produce new lawyers and judges. The bottleneck is the time each case takes to be properly and thoroughly adjudicated, and neither "more money" nor "more people" can accelerate that. While it's certainly correct to say that more staff could handle a larger number of cases, (a) more staff means more cases handled, but more money doesn't speed up individual cases, so there's still not really anything to be gained in efficiency by increasing individual case costs; and (b) if the solution was as simple as "hire more judges", it would have happened already.

          Courts aren't lacking in budget to hire more people. They're lacking in *people* available to hire, with the specific expertise they need to fill any gaps. The legal profession, at least in the US, consistently has some of the lowest unemployment rates across the board. Unlike over here in the tech sector, the scarcity is in available talent, rather than available jobs.
      • SecretDreams4 hours ago
        Agreed. Naturally, the solution is to get Meta to compensate for the actual and cumulative damage they've done to mankind. Then plaintiffs might actually benefit.

        This is humanity vs. Mark Zuckerberg.
    • doctorpangloss4 hours ago
      Okay, what if the plaintiffs got "$50,000"? Then, to you, would class actions be ideal for the public?

      The flaw with class actions is not that they don't pay enough money (or too much, to the wrong people). It's that they're reactive, which is to say, it's the same tradeoff as with nearly all US commercial policy.
  • arendtio2 hours ago
    I love it, because it shows that advertisement is communication as well.

    Communication is highly regulated for good reasons, and advertisement is not. This is as if telecommunication companies disconnected calls when what is being said does not fit their agenda.

    This should be illegal for advertising companies as well.
    • finghin2 hours ago
      I rarely say this, but very fitting username.
  • elAhmo3 hours ago
    We can effectively trace all of the problems we have today on a global scale back to social media.
    • mrweasel1 hour ago
      Assuming that we'll come to our senses, I think we'll be looking back at social media, in its current form, the same way we now look at the Victorians using opium as cough medicine. It works, but holy shit are you doing it wrong.
    • lurk245 minutes ago
      Inane Reddit comment.
    • nutjob228 minutes ago
      I'd say 'social problems', but otherwise entirely true.

      One of the strangest things I find is that people will spend serious money to buy a house somewhere they don't have to be exposed to the great unwashed, yet will happily expose their brains to the worst of the worst via social media.
    • ElijahLynn2 hours ago
      I'd say the root is circadian rhythm disruption. Artificial lighting, social media, etc.
    • Legend24402 hours ago
      *All* the problems? Really?
      • engeljohnb2 hours ago
        The words "scale back to" are vague, but I'm struggling to think of any current global problems that weren't at least exacerbated by social media.
        • Legend24408 minutes ago
          Then you need to think a little harder.

          You have one thing on your mind - "social media bad" - and it is poisoning your ability to see the complexity in the world. There is rarely a single cause for anything, and there is never a single cause for everything.
        • nickff1 hour ago
          "[G]lobal" is doing a lot of work in this sentence if I'm reading it as intended; this seems to exclude international conflict and intra-national strife (which are very big issues).
    • NERD_ALERT2 hours ago
        I don’t disagree that social media has played a massive role in changing the world in a negative way. This is a very far-reaching claim though, and one that kinda misses the forest for the trees. The problem is that fundamentally capitalism demands that companies find more ways to siphon more money from customers every quarter or they fail.

        Social media is a perfect storm for the elites in this system. It’s a CIA wet dream. It’s literally a globalized and hyper-personalized propaganda distribution platform. This is the inevitable outcome of capitalism and human behavior. Meta's whole purpose is to create the most optimized pipeline for accepting money from 3rd parties in exchange for convincing as many people as possible of what they want those people to believe.

        Social media is evil, but it's also the natural course of what happens with current technology and the incentives of capitalism.
      • hkpack2 hours ago
          I don’t know why it’s a CIA wet dream, when it’s mostly used against western democracies.

          Are the people in the CIA incompetent?
      • themafia1 hour ago
        > fundamentally capitalism demands

        Wall Street makes those demands. Those demands are backed up by court cases and precedent. Nothing about this is synonymous with "capitalism."

        > It’s a CIA wet dream.

        And they spend a significant amount of money. Is this "capitalism" still? Or are there more specific terms that would apply more directly to this arrangement?

        > Social media is evil

        The US is the largest manufacturer and seller of weapons in the world.
    • AmericanOP2 hours ago
      So thank the ~80,000 employees at Facebook working tirelessly to make the platform as shoddy as possible.
    • TiredOfLife2 hours ago
      Exactly. Without social media there would have been no nazis.
      • petre2 hours ago
        The Bierhalle, the social media of the 1920s, only without the personal data hoarding.
    • bryan_w1 hour ago
      I'm pretty sure you're thinking of tuberculosis, not social media.
  • crazygringo1 hour ago
    I mean, I don't like Meta *at all*, but what do you expect? If you want to run a full-page ad in the New York Times that criticizes the New York Times, they're going to refuse to run it as well. Private companies generally don't publish things that run counter to their interests.

    It would certainly be interesting if we wanted legislation forcing private companies who provide paid ad space to publish the ads that paid the most regardless of content, but that opens up a whole other can of worms. What if the ad offering the most money is racist and horrible, or disgustingly obscene? At that point you start needing the government to decide what is allowed to be banned and what isn't, and then it's meddling in speech, which is prohibited by the first amendment.

    So this just seems like an obvious non-story to me. *Of course* Meta is removing these ads, because pretty much *any* advertising platform would do the same with ads that criticized it.
    • indymike39 minutes ago
      > New York Times that criticizes the New York Times,

      This has happened.

      > the government to decide what is allowed to be banned and what isn't,

      This is a civil lawsuit where people are trying to a) prove they were harmed and b) be compensated for that harm. The government is just the referee.

      > Meta is removing these ads, because pretty much any advertising platform would do the same about ads that criticized it.

      Which was not very smart, because the next step will be a court order requiring them to put a banner on every page with a link to sign up to join the lawsuit - for free.
      • crazygringo22 minutes ago
        > *This has happened.*

        Source?
    • xuhu39 minutes ago
      Online stores don't remove 1-star reviews of their products from their store. Do they?
      • alex4357837 minutes ago
        Yes, they very frequently do.
    • xorcist36 minutes ago
      This may sound reasonable, but it isn't at all how newspapers are run. You can absolutely buy an ad in the New York Times criticizing the New York Times. Within reason, of course; as you said, they are private entities allowed to take on any customers they want, but in general newspapers hold journalistic integrity as an ideal and will allow most things as long as they aren't defamatory, unethical, or downright illegal.

      The ad sellers and the journalists are normally separate and will not interfere much with one another's work. It also helps that they never say no to money. I don't know about the New York Times specifically, but similar things have happened many times in other newspapers, and there is such a thing as a paid editorial. Those are usually clearly marked as such, but it's basically the same thing.

      (However, there may be other reasons why you might want to go with a competitor instead, and not pay the newspaper you hate $100k.)
  • jerf1 hour ago
    At the risk of going against the gestalt, Facebook openly and publicly rejecting the ads is actually one of the better outcomes. They could have just put their thumbs on the scale: deprioritizing the ads, serving them to the people they think are least likely to bite, lying about the number of times they were served because, after all, who can check? Many of us suspect the ad platforms already do this pretty routinely through one mechanism or another anyhow.

    It isn't reasonable to ask a platform to host content that is literally about suing them, not because of "freedom" concerns or whether or not Facebook is being hypocritical, but because in the end there isn't a "fair" way for them to host it. The constraints people want to put on how Facebook would handle that end up solving down to the null set by the time we account for them all. Open, public rejection is actually a fairly reasonable response and means the lawyers at least know what is up and can respond to a clear stimulus.
    • lurk234 minutes ago
      > It isn't reasonable to ask a platform to host content that is literally about suing them

      Explicit rejection is better than opacity, but better still is public accountability. Meta’s properties have a combined userbase that amounts to just over 1 in 4 people on earth; these platforms should have been regulated as utilities a long time ago. Suppose I wanted to run ad campaigns advocating for antitrust legislation targeting social media companies and ended up getting booted off of all of the major platforms; what feasible method is there for me to advance these ideas that could possibly compete with the platforms’ own abilities to influence public opinion?
  • teunispeters29 minutes ago
    I mean, it probably shows that "common carrier" protection should emphatically not apply to them, but what do I know?
  • bcjdjsndon4 hours ago
    Hang on a minute: Meta apparently didn't have time to check the content of the adverts they got paid to serve when it was child porn, so what's changed all of a sudden?
    • henry20234 hours ago
      The crypto-“investing” deep fakes impersonating recognizable names are up and running too.
    • mekdoonggi4 hours ago
      This one actually cost them money.
    • heresie-dabord4 hours ago
      Excellent point. Suddenly Corporatron finds it easy to censor content in its product.

      But why must we limit ourselves to simplistic, false dichotomies such as "Good vs Evil", "Education vs Ignorance", "Community Well-Being vs Disinformation and Arrant Nonsense", "Democracy and Social Confidence vs Propaganda and Conspiratorial Mayhem", and "Mental Health vs Despair and Self-Harm"? We really are focused on *building apps that people love*.
  • bastard_op5 hours ago
    I wonder what would happen posting these ads to Truth Social and Twitter.
  • fdeage4 hours ago
    "Anxiety. Depression. Withdrawal. Self-harm. These aren't just teenage phases — they're symptoms linked to social media addiction in children."

    Seems like they couldn't write even three lines without an LLM.
    • WesolyKubeczek4 hours ago
      LLMs love this style, but they love it because it's just about every single piece of advertisement writing for the last aeon or so, and it's a mighty chunk of their training corpora.
    • boelboel4 hours ago
      Maybe being unable to write is another symptom.
  • shevy-java3 hours ago
    I think it is time to disband Facebook. Ever since they attempted to infiltrate the Linux ecosystem via age sniffing, they really need to go. Corporate systemd can also go; we should really clean up the whole ecosystem. Whatever happened to "privacy first"?
  • varispeed4 hours ago
    I wonder when they'll tackle literal porn showing up in Instagram shorts. If you want to browse Instagram in public, forget it.
  • guywithahat5 hours ago
    There is some humor in the fact that these law firms won a case against Meta, and the first thing they did was give them advertising money won from the court case. That said, the ads sound pretty aggressive, and from what I've read it sounds like it wasn't a very fair decision. I understand the conflict of interest, but I have some sympathy for Meta here.
    • mekdoonggi3 hours ago
      I think these ads were meant to bait Meta into banning them (which it has done), and now the firm will follow up with a lawsuit, because Meta is suddenly able to decide very rapidly which ads get shown when its bottom line is at stake.

      I do not have any sympathy for Meta.
  • pcardoso6 hours ago
    Reminds me of Carl Sagan’s Contact, where Hadden, the millionaire funding Ellie’s work, made a TV ad blocker and then sued the TV companies when they refused to play ads for his product.

    I wonder if that is what will happen next.
  • mrwh6 hours ago
    Meta wants to be an impartial platform only and exactly when it suits them to be.
    • rurp4 hours ago
      Yeah, glad to see Zuck is sticking with those strong free speech principles he couldn't wait to get back to last year.
      • alex11384 hours ago
        Free speech which apparently includes https://news.ycombinator.com/item?id=42651178

        I am actually more at odds with HN than many people might be, because I think the lies and censorship surrounding covid were absolutely wrong, and after things like that platforms could genuinely lay claim to being unfairly treated. But you can tell Zuck doesn't actually care, because he immediately started doing that himself.
    • PaulHoule4 hours ago
      Wow.

      Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

      Sure, this will slow down the personal injury lawyers finding clients, but it won't stop them; meantime it is more ammunition for Facebook's enemies to use against it.

      It is one thing to do shady business; it is another thing to incriminate yourself. If you were involved with weed and somebody sent you an email asking if they could come around and pick up a Q.P. next Saturday, I'd expect you to give that person a correction in person so they wouldn't do it again.

      Not to say you should be like Epstein, but I mean he and the people he corresponded with had some sense, so there is very little evidence of criminal activity in millions of emails.

      At Facebook, on the other hand, all the time people sent emails about things that could just as easily have been left as "dark matter": unexplained and minimally documented decisions. But no, it is like that M.F. Doom song "Rapp Snitch Knishes", like a bunch of children with no common sense at all.
      • matheusmoreira3 hours ago
        > Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

        Yeah, it's called having-too-much-money-to-careitis.
      • Lord_Zero4 hours ago
        Damn, who's buying a Q.P.??
      • jjtheblunt4 hours ago
        language failure on my end: what's a "Q.P."?
      • hn_acker3 hours ago
        > Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

        Cory Doctorow describes Mark Zuckerberg's and Elon Musk's attitude toward other people as billionaire solipsism [1].

        [1] https://pluralistic.net/2026/01/05/fisher-price-steering-wheel/
        • nielsbot2 hours ago
          Good writing. Thanks for sharing.
      • NickC254 hours ago
        > Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

        Not sure he cares. He's literally got hundreds of billions of dollars to his name, and the corporation he founded is worth trillions.
      • glitchc4 hours ago
        When you have f.u. money, you get to say f.u., otherwise what&#x27;s the point?
      • jmye4 hours ago
        &gt; Does Zuckerberg have some kind of clinical condition where he just can&#x27;t imagine how other people might see him?<p>Nah, he just doesn&#x27;t care. <i>Nothing</i> he does will ever get people (en masse, onesie, twosies don&#x27;t matter) to stop using Meta products.<p>People can&#x2F;will complain about him forever, but shitty people will continue to help him build things, and shitty people will continue to use them.
        • PaulHoule4 hours ago
          Well Meta products are decaying like Cobalt 60; people <i>are</i> stopping<p><a href="https:&#x2F;&#x2F;trends.google.com&#x2F;explore?q=facebook&amp;date=all&amp;geo=US" rel="nofollow">https:&#x2F;&#x2F;trends.google.com&#x2F;explore?q=facebook&amp;date=all&amp;geo=US</a><p>and maybe Zuck doesn&#x27;t think he can do anything about it. There are different theories but I like this one:<p>-- originally you would put some imagination and elbow grease into using Facebook and get some attention in return, which made it very attractive and interesting to people around 2010<p>-- then it found a business model which was dependent on your <i>not</i> being able to use imagination and elbow grease to get attention, which made it less interesting in general but still somewhat interesting because now you could put cash into the slot machine and get cash out<p>-- over time they lowered the payout of the slot machine, which made the game less interesting and more dependent on 100% profitable scams which could function no matter how bad the payout was; people lose trust in the platform and stop engaging with ads, real advertisers don&#x27;t want to be seen next to scam ads (lest they be seen as scams), which further lowers the payout and makes the game less interesting over time<p>-- and now they won&#x27;t even take your money... so who cares?
          • felixg33 hours ago
            Halving every 5.2-or-so years?
            • PaulHoule2 hours ago
              Within that order of magnitude. I picked Co60 in particular because it is common in commerce; I even used Co60 sources in my cond-mat theory trainee days.<p>I think it is more like radioactive decay than, say, cheese going bad, but maybe I&#x27;m wrong. You can&#x27;t smell the radioactive decay!
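The half-life arithmetic behind the Co-60 analogy above is plain exponential decay. A minimal sketch (the 5.27-year figure is the real Co-60 half-life; the function name and the traffic framing are just illustrative):

```python
def fraction_remaining(t_years: float, half_life_years: float = 5.27) -> float:
    """Fraction of the original quantity left after t_years of exponential
    decay with the given half-life: N(t)/N0 = (1/2)^(t / T_half)."""
    return 0.5 ** (t_years / half_life_years)

# One half-life leaves half, two half-lives leave a quarter, and so on.
print(fraction_remaining(5.27))   # 0.5
print(fraction_remaining(10.54))  # 0.25
```

On that model, roughly a decade of "decay" leaves about a quarter of the original engagement, which is the order of magnitude the comment is gesturing at.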
        • alex11384 hours ago
          They both captured the early market (the inconsistent page styles of Myspace, the slowness of Friendster; then they acquired FriendFeed) in an early internet. Anyone who captures the early market will have THE network effect for decades (plus shadow profiles), as person x joins because person y is there because person z is there, and that effect persists to this day. They also apparently used to censor links to their competition.<p>The game is rigged. Also Instagram and WhatsApp (yeah, companies get acquired, but WA&#x27;s Acton was very explicit: &quot;delete Facebook&quot;. He was pissed off at what happened). Also, ever tried deleting FB? Almost impossible; more network effects.
          • PaulHoule4 hours ago
            ... the other theory of Facebook&#x27;s decline is that Eternal September gets you every time. I mean in the 1970s the CBGB and Mudd Club were really cool and they folded up and the scene moved on.<p>Once I started using the social features on my MQ3 I found it really was Zuck&#x27;s worst nightmare. I met all these nice retirees who were fun to play <i>Beat Saber</i> with and who would go on cruises and post YouTube links to pano videos they take of the ship.
      • ModernMech4 hours ago
        Yes, it&#x27;s called being a billionaire. I&#x27;m sure if clinicians actually studied this group of people, they would find strains of delusions of grandeur, paranoia, extreme risk-taking behavior, lack of self-control and self-awareness, inability to deal with adversity and setbacks without emotional outbursts, and inability to contain and dismiss intrusive antisocial thoughts.<p>I feel that the emotional maturity of most billionaires is probably at the toddler level or below, and I mean that quite seriously and literally.
        • PaulHoule3 hours ago
          <a href="https:&#x2F;&#x2F;tvtropes.org&#x2F;pmwiki&#x2F;pmwiki.php&#x2F;Main&#x2F;AcquiredSituationalNarcissism" rel="nofollow">https:&#x2F;&#x2F;tvtropes.org&#x2F;pmwiki&#x2F;pmwiki.php&#x2F;Main&#x2F;AcquiredSituatio...</a><p>?
    • tiberius_p6 hours ago
      That&#x27;s exactly what they&#x27;re saying.
    • Lihh273 hours ago
      One ToS clause and neutrality disappears. Now Meta decides which claims get reach.
    • stronglikedan5 hours ago
      Name one platform that doesn&#x27;t, and I&#x27;m not just talking about lip service.
      • mbesto4 hours ago
        Signal?
      • pocksuppet2 hours ago
        Hacker Ne.... no wait, not that one either.
      • bloqs4 hours ago
        It&#x27;s not unilateral but if it is a commercial interest, then I&#x27;ll agree that it usually is
      • _moof4 hours ago
        There are degrees.
      • RobotToaster4 hours ago
        4chan?
      • latexr4 hours ago
        Not an excuse. We shouldn’t turn a blind eye to bad behaviour because “everyone does it”.<p><a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Whataboutism" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Whataboutism</a>
    • kotaKat6 hours ago
      I mean, they spun up a bullshit &quot;Oversight Board&quot; that they can fully 100% choose to ignore and decline to implement their demands when they&#x27;re made.
    • analog83743 hours ago
      Reddit is the same way. Poke a few sacred cows and suddenly you&#x27;re banned for something you did 6 months ago that we aren&#x27;t going to tell you about and no we don&#x27;t want to discuss it.<p>Kafkaism is natural and organic.
    • 2OEH8eoCRo06 hours ago
      [flagged]
      • LocalH4 hours ago
        Throwing the baby out with the bathwater?<p>I believe we need to <i>strengthen</i> 230, but with the added caveat that affected platform owners must stop gaming the algorithms, that it must require <i>user-driven</i> curation. Let me curate my own feed, stop shoving shit in front of my eyes. When you do so, you&#x27;re making heavy editorial decisions, and should be open to liability.
        • deeponey4 hours ago
          This is really the essence of it. Section 230 is critical to a healthy internet, but there is a large grey area between editorial and platform. Places like YouTube, Meta, X, etc. are pretending to be platforms when really they are algorithmic editors, gatekeepers, and curators. They are much more like traditional newspapers than, say, your ISP, and they need to be treated as such.
          • freejazz2 hours ago
            What about the internet today is healthy such that anyone could point to Section 230 as the reason why?
            • LocalH2 hours ago
              The internet is unhealthy today <i>specifically because</i> the law elevates platform editorializing to the same level as individual freedom of speech.
              • freejazz2 hours ago
                I agree, but I&#x27;m not sure the person I&#x27;m responding to would. I cannot imagine how anyone could describe today&#x27;s internet as healthy.
      • kelnos4 hours ago
        That phrase does not mean what you think it means.
      • epistasis4 hours ago
        A few years ago this seemed a bit too extreme for me. Now, with the web mostly burned down anyway, I see little to lose and lots to gain in a section 230 repeal. My, how the Overton Window changes on some ideas. And when it&#x27;s changing on some things it tends to accelerate on others too, like a social momentum on reconsidering past norms.
        • Pxtl4 hours ago
          My compromise pitch, since the &quot;You need ID from your users&quot; ship has sailed:<p>Companies are not liable if they have proper ID of the person who submitted the content and can provide that to a plaintiff. If they have not made a good-faith effort to know who submitted this info (like taking ID, not just an email address) then they&#x27;re taking responsibility for the submitted content.<p>Which means sites that have responsible moderation can still allow anonymous contributions.<p>The real problem is the inherent asymmetry of legal battles, where the wealthiest can fight forever with endless motions and have near-total impunity while a legal action would basically nuke a normal person&#x27;s life. Not to mention the fact that an international border can often make this whole conversation moot.
          • epistasis4 hours ago
            &gt; Which means sites that have responsible moderation can still allow anonymous contributions.<p>Anonymous contributions, up to the point of somebody compromising the service? With the quantity of password hash thefts, I suspect we&#x27;ll see even more ID thefts this way.<p>I can&#x27;t imagine using any service that asks for ID, except perhaps from the well-established giants, so an exception for identifiability would effectively be a gigantic moat granted to the largest internet companies to keep out competition. Anything like that would need to be paired with massive anti-trust changes, as well as perhaps government take-over of the giants as utilities, none of which sounds very appealing...<p>That said, don&#x27;t take any of my rambling as discouragement, your type of thinking is exactly what we need, we need massive amounts of policy discussion and your suggestion is very innovative.
          • pocksuppet2 hours ago
            That&#x27;s basically how things used to work in Germany. It used to be that if someone torrented movies on your internet connection, you were fined. No ifs, no buts, they monitored 100% of the public torrents and courts agreed with 100% of the fines. And they didn&#x27;t care who did it - if they didn&#x27;t know (which is almost always true) they fined the owner of the internet connection. It was a really really bad law. For 10-15 years after every other country had public wifi hotspots, Germany didn&#x27;t because the owner would get fined for every torrent. After a very long time, they eventually passed a law saying public wifi operators didn&#x27;t have to pay.
          • 2OEH8eoCRo04 hours ago
            I like this compromise.<p>One of my issues is the lack of liability in practice. The poster is technically liable but they&#x27;re anon, behind proxies, foreign, etc. and unaccountable. It results in people being harmed online without recourse.<p>These companies should have a duty to know who their users are.
      • ndiddy4 hours ago
        The main problem with 230 is that the courts have decided to treat it as if it removes all legal liability from online platforms, rather than just publisher liability. The way the text was written seems to be intended to protect platform operators from publisher liability but still have them under distributor liability. For example, if you own a bookstore and carry a book that says something defamatory, you can be held liable if you don&#x27;t remove the book after being informed about its contents. However, a court case soon after 230 passed created the precedent that it absolves online platforms of all forms of liability. This means that if a platform knows it hosts illegal or defamatory content and doesn&#x27;t take it down, they aren&#x27;t liable and any legal cases against them will get thrown out due to 230. One of the authors of section 230 later said that &quot;the judge-made law has drifted away from the original purpose of the statute.&quot;
        • krapp4 hours ago
          &gt;For example, if you own a bookstore and carry a book that says something defamatory, you can be held liable if you don&#x27;t remove the book after being informed about its contents.<p>I don&#x27;t think you can in the US. Maybe elsewhere, but in the US AFAIK the author is responsible for the content they publish, not the bookstores carrying the books.<p>&gt;This means that if a platform knows it hosts illegal or defamatory content and doesn&#x27;t take it down, they aren&#x27;t liable and any legal cases against them will get thrown out due to 230.<p>No it doesn&#x27;t. Section 230 doesn&#x27;t allow sites to host illegal content, of course only &quot;legality&quot; within the framework of US law matters.<p>All it says is that the liability for user posted content lies with the user posting the content, not the platform hosting it. Which to me seems appropriate.
    • zeroonetwothree5 hours ago
      I think there’s a clear difference in restricting advertising vs organic posts.
      • thimabi5 hours ago
        Meta does both. It has long been said that businesses have little organic reach in Meta’s platforms, as an incentive for them to use ads.
        • lazarus014 hours ago
          I’m not a big poster at all, but ran into this precise issue.<p>They analyze the video posts on instagram. If they detect the video has even a small amount of commercial value, they classify it as branded content and you need to pay for it to get promoted.
        • alex11384 hours ago
          For all the creepy People You May Know stuff they don&#x27;t even bother connecting people properly even on people&#x27;s personal pages <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=14147719">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=14147719</a>
      • HWR_145 hours ago
        What difference is that?
  • ginkgotree4 hours ago
    Social media, and specifically Facebook&#x2F;Meta, will go down in history as one of the worst developments in technology in the 21st century. As Frances Haugen stated in her testimony, Mark Zuckerberg needs to be removed from the helm at Meta.
    • vachina4 hours ago
      They started out good and then cranked the engagement trap to the max when they realized the value of a captive audience.
    • wvenable4 hours ago
      I think television has done more harm, politically.
      • boelboel2 hours ago
        Radio did plenty of harm as well (especially post 1987). Rush Limbaugh had a peak audience of 20-30 million listeners a week in the 90s. The current state of politics might&#x27;ve been unpreventable, at least in the US.
        • MandieD1 hour ago
          Radio did plenty of harm in 1930s Germany - Hitler was a master of the new medium.
  • neilv2 hours ago
    Idea of something that undergraduate colleges could do, to encourage reflection about ethics in careers:<p>Annually poll all the students, to get rankings of how the ethics of well-known companies&#x2F;brands are perceived by the students.<p>Then publish the results to students, in a timely fashion, before they&#x27;re deciding on job offers and internships.<p>I speculate that effects of this could include:<p>1. Good hiring candidates modifying what offers they pursue and accept, influenced by awareness, self-reflection, and&#x2F;or peer pressure.<p>2. Students thinking and talking about ethics, when they didn&#x27;t before. Then some of them carry this influence with them, as part of their character and intellect, going forward (as is one of the ideals of a college education).<p>Also, maybe by the second year of the poll, the sentiments are better informed, because a lot more people have started paying attention to the question of a company&#x27;s ethics.<p>The perception breakdowns by college major would also be interesting, but maybe don&#x27;t publish those, to reduce internal incentives to game the results. (Everyone knows some majors tend a bit more towards the sociopathic than others, but some would rather that not be official.)
    • taormina1 hour ago
      They already have ethics classes in college. The unethical already don’t care. Students already do basic research but when the market is so shit, do you really expect them to permanently hamstring their careers by torpedoing their only offer? Especially given that as a fresh college grad, it’s not like anyone cares about your opinion anyways. So let’s guilt the vaguely ethical ones into never getting involved and leaving all the sociopaths to run the show. This is an idea destined for success.
  • HumblyTossed6 hours ago
    Do photogs do that on purpose, or does Zuck really always have that sociopath stare?
    • SpicyLemonZest6 hours ago
      Zuckerberg is a rich and high profile guy, so photographers capture many pictures of him, and news editors often find that choosing unflattering pictures of people their readers don&#x27;t like is helpful for reach. This picture in particular was taken after he&#x27;d just finished testifying for 8 hours in a February trial, which I think would wear down the best of us, and even among Getty&#x27;s extensive gallery of pictures taken then (<a href="https:&#x2F;&#x2F;www.gettyimages.com&#x2F;detail&#x2F;news-photo&#x2F;mark-zuckerberg-chief-executive-officer-of-meta-platforms-news-photo&#x2F;2261841504" rel="nofollow">https:&#x2F;&#x2F;www.gettyimages.com&#x2F;detail&#x2F;news-photo&#x2F;mark-zuckerber...</a>) this one is particularly unflattering IMO.
    • username2233 hours ago
      It’s less unflattering than the legless avatar from his $80 billion waste of money.<p><a href="https:&#x2F;&#x2F;cdn.mos.cms.futurecdn.net&#x2F;yUEJgQzunhbnYYtsckup7i.jpg" rel="nofollow">https:&#x2F;&#x2F;cdn.mos.cms.futurecdn.net&#x2F;yUEJgQzunhbnYYtsckup7i.jpg</a>
    • folkrav5 hours ago
      Both.
    • alex11385 hours ago
      Keep in mind Zuckerberg is someone who supports things like this <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=10791198">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=10791198</a><p>Zuckerberg was told about gay people being added to groups and it outed them by posting to their wall, and he ignored it <a href="https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=nRYnocZFuc4" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=nRYnocZFuc4</a><p>And obviously <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=1692122">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=1692122</a> (guess we don&#x27;t get access to his other messages, though <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=16770818">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=16770818</a>)<p>His stare isn&#x27;t the only thing about him that&#x27;s sociopathic<p>Edit: oh yeah and <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=42651178">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=42651178</a>
      • gorbachev3 hours ago
        Zuck and his minions are also responsible for <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Rohingya_genocide" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Rohingya_genocide</a><p>Your examples pale in comparison.
      • alex11385 hours ago
        Guys, there&#x27;s no need to insta-downvote. I provided substantive evidence. Look in the mirror, and evaluate who you work for
    • IncreasePosts5 hours ago
      I&#x27;m sure if people were taking 500 pictures of you, they would capture you in a state like that. Are you a sociopath?
  • josefritzishere5 hours ago
    So they remove class action lawsuits but not pedos. Got it.
    • stronglikedan5 hours ago
      Since literally everyone is calling everyone they don&#x27;t like a pedo nowadays, it&#x27;s pretty much impossible for any platform to get rid of the pedos.
    • mekdoonggi2 hours ago
      I suspect the lawyers will use this as evidence as well. Meta can very quickly remove ads when it&#x27;s going to cost them money.
  • neuroelectron6 hours ago
    Reminds me of ChatGPT insisting all news about OpenAI is unverified speculation.
  • k33n7 hours ago
    The idea that Meta is obligated to be so impartial that it must allow lawsuits against itself to be promoted on its own platform is a bit naive and utopian.<p>Its own TOS states that they won’t allow that.
    • schubidubiduba6 hours ago
      TOS are not laws. In fact, they often partially violate laws and those parts are then void. In some countries, anything written in TOS that is not &quot;expected to be there&quot; is void.
      • zeroonetwothree5 hours ago
        Ok but I don’t really see why this specific term would violate any law? Do we really want a society where platforms are forced to present speech that is harmful to them? If you own a store and I put a sign up on your wall advertising a rival store wouldn’t it be reasonable for you to disallow that?
        • quantum_magpie5 hours ago
          An alternative reply, with analogy, if you like them:<p>You own a restaurant, where you sell poisoned (intentionally and knowingly) food. A group of people band up for class action lawsuit for poisoning them, and have the lawyers post a sign at your restaurant, that everyone poisoned there should reach out and get some compensation.<p>Should you be allowed to take the sign down?
          • ronsor3 hours ago
            They shouldn&#x27;t be allowed to put the sign up unless it&#x27;s court ordered.<p>I know this answer doesn&#x27;t pass the vibes test, but it&#x27;s how the law actually works. If you post a sign on someone&#x27;s property without permission, you&#x27;ll get in trouble for trespassing, vandalism, or both.<p>So get a judge to issue an order. In a serious situation, they very well might.
        • quantum_magpie5 hours ago
          It’s not a rival store, or speech against them.<p>It’s a lawsuit, with the users of the platform as the damaged party, against the platform. Removing the possibility to reach the users should result in a default judgement with maximum damages immediately.
      • raincole6 hours ago
        No one says ToS are laws and especially not the parent commenter.
        • Fraterkes5 hours ago
          The parent comment brings up the ToS as an example of why it&#x27;s naive to believe Meta is obligated to do something, but what Meta is <i>obligated</i> to do depends on the law.
          • raincole5 hours ago
            And which laws state that Meta is obligated to show ads like this?
            • Fraterkes2 hours ago
              Irrelevant. My point is that the parent comment <i>did</i> imply that the ToS created obligations for Meta in the way that laws do, which means your first comment was incorrect.
      • mywittyname5 hours ago
        I kind of wish countries would just define, &quot;terms of service&quot; for everyone and not allow companies to modify them further.
    • nkrisc7 hours ago
      Fair enough. If they&#x27;re not impartial then lets hold them accountable for the content published in their platform.
      • k33n6 hours ago
        I’m not against these companies losing their Section 230 immunity. Social media platforms are, in my personal opinion, publishers in their current form.<p>If they went back to operating as “friends and family feed providers” then letting them keep their 230 immunity would be easier to justify.
        • TheCoelacanth5 hours ago
          Yes, if they went back to being chronological feeds of people you follow, then they should get to keep Section 230 immunity.<p>When they are making editorial decisions about what content to promote to you and what content to hide from you, then they should lose it.
        • pocksuppet2 hours ago
          Section 230 doesn&#x27;t say anything about publishers. That was entirely made up by chronically online arguers.<p>What it does say is that you aren&#x27;t liable for something someone else wrote.<p>It doesn&#x27;t create liability for things not covered by it.<p>Guess who decides the order and contents of Facebook feeds? Facebook does. So they wouldn&#x27;t be liable for someone writing a post saying &quot;gas the jews&quot;, but they could still be liable for choosing to show it at the top of everyone&#x27;s front page, if the front page was choice-based rather than chronological.
        • wbobeirne6 hours ago
          You are relying on the wrong people to be able to understand that nuanced distinction.
      • wnevets6 hours ago
        [flagged]
      • mc326 hours ago
        To me that’s how it should be. They shouldn’t have to run ads against themselves yet they should be liable or accountable for harm they are found guilty of.
        • pixl976 hours ago
          &gt;They shouldn’t have to run ads against themselves<p>This is not how it works when you&#x27;re found guilty of committing harm. Tobacco companies are a good example of this.
          • mc326 hours ago
            If the government mandates them then yes. If it’s not mandated they have the right to refuse service.
            • pixl975 hours ago
              The bigger you get the more iffy it gets refusing service to others. Also it can and will be used against you in future civil and criminal cases.
    • iinnPP6 hours ago
      I tend to agree with you on this. I wanted to add however that Meta itself lets so many TOS violating ads in, that it seems like special treatment for ads that are much less undesirable than the ads normally pushed.<p>It&#x27;s not just a Meta issue either.
    • hansvm6 hours ago
      Companies have to inform affected individuals of data breaches, especially when HIPAA gets involved. Brokers have to inform clients of transaction errors. Auto manufacturers have to inform owners of recalls. Retirement funds have to inform plan participants of lawsuits involving those funds.<p>You don&#x27;t even have to invoke the idea that Meta is big enough to be regulated as a public utility for this to have broad precedent in favor of forcing a malicious actor to inform its victims that they might be entitled to a small fraction of their losses in compensation.
      • zeroonetwothree5 hours ago
        Well we aren’t discussing the government requiring meta to inform users. We are discussing whether meta can choose which private actors’ ads to allow. It would seem silly that a platform would be forced to allow all ads.
        • hansvm3 hours ago
          Aha, how clever. We aren&#x27;t discussing whether they can be forced to display messaging; we&#x27;re discussing whether they&#x27;re going to later get slapped down for blocking that messaging.<p>I get that the distinction matters a bit from time to time (court cases keep blurring the line in the US though), but:<p>1. With all the other shit that makes it through the filter, this was pretty clearly a targeted, strategic takedown rather than some sort of broad &quot;we don&#x27;t allow bad ads on the platform&quot; policy. Allowing &quot;all ads&quot; isn&#x27;t the thing being argued; it&#x27;s allowing &quot;this ad.&quot;<p>2. The non-offensive idea that &quot;abusers shouldn&#x27;t be allowed to deceive and gaslight their victims&quot; is pretty strongly in favor of this being a bad move on Meta&#x27;s part if it was an intentional act. Maybe it shakes out fine for them legally in this particular instance, but the fact that as a society we routinely require companies and individuals to behave with more appearance of moral standing than this suggests that blocking this particular ad is over the line, and it&#x27;s neither naive nor utopian to think so. Even if it&#x27;s legally in the light grey, it&#x27;s an abuse of power worth talking about, and hopefully it inspires more people to leave the platform.
    • mirashii6 hours ago
      That idea was not expressed in the article, only the fact that the ads were removed. This is worth covering, especially when coupled with the context for what ads Meta regularly does allow. One does not have to believe that they&#x27;re obligated to do so while also believing that it&#x27;s incredibly scummy behavior that consumers should be aware of and question.<p><a href="https:&#x2F;&#x2F;www.reuters.com&#x2F;investigations&#x2F;meta-is-earning-fortune-deluge-fraudulent-ads-documents-show-2025-11-06&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.reuters.com&#x2F;investigations&#x2F;meta-is-earning-fortu...</a>
    • dcrazy5 hours ago
      This is why courts are empowered to infringe upon the rights of parties to the case.
    • Zigurd6 hours ago
      There are so many ads for nostrums, cults, get rich quick scams, and other junk that violate TOS, that Meta has a legitimacy problem with their TOS.
    • freejazz6 hours ago
      Okay? They&#x27;re exactly the assholes everyone says they are. That&#x27;s the point.
    • gilrain7 hours ago
      Let’s force them to be obligated to do that, then. “Just let them hurt people, and then let them hide that hurt” kind of sucks for society.
    • 3form6 hours ago
      Maybe, but so what? Your remark lacks a conclusion.<p>Mine is that it could then well be required to do so by law. Companies are not individuals, so I don&#x27;t think they are owed any freedoms beyond what is best for utility they can provide.
    • Larrikin6 hours ago
      [flagged]
    • pixl976 hours ago
      [flagged]
    • streetfighter646 hours ago
      The idea that a company can override laws via its TOS is a bit strange.
      • BeetleB5 hours ago
        Genuinely curious. By not allowing a specific type of ad, what law are they breaking?
    • hashmap6 hours ago
      At certain scales, reality has to win out over whatever ideal you have in your head about how things should be. Facebook is massive, a lot of society is on it, and it&#x27;s a problem to make recourse invisible to the people most affected by the thing stealing their attention.
    • swiftcoder6 hours ago
      &gt; The idea that Meta is obligated to be so impartial<p>Is their defence of Section 230 protections not in part rooted in that claim of impartiality?
      • nradov6 hours ago
        No. Section 230 doesn&#x27;t mention anything about impartiality.
        • swiftcoder5 hours ago
          It indeed doesn&#x27;t, but conservative lawmakers signalled repeatedly that they were unhappy about Meta&#x27;s protection under section 230 if their moderation policies were not politically neutral
  • glaslong5 hours ago
    Thus begins another Streisand Effect meme campaign of<p>&quot;MZ Is A Punk-Ass B<p>paid for by Person &amp; Guy LLP&quot;
  • skeeter20204 hours ago
    Can&#x27;t we all just agree there are no GOOD people in this situation? Meta, class-action lawyers, PE and big money that funds the lawsuits as a profit venture... The one thing they all appear to share: parasites extracting resources from their host.