14 comments

  • firefoxd5 hours ago
    Without even looking at the AI part, I have a single question: Did anybody investigate? That&#x27;s it.<p>Whether it&#x27;s AI that flagged her, or a witness who saw her, or her IP address appearing in the logs. Did anybody bother to ask her &quot;Where were you the morning of July 10th between 3 and 4pm?&quot; But that&#x27;s not what happened; they saw the data and said &quot;we got her&quot;.<p>But this is the worst part of the story:<p>&gt; And after her ordeal, she never plans to return to the state: “I’m just glad it’s over,” she told WDAY. “I’ll never go back to North Dakota.”<p>That&#x27;s the lesson? Never go back to North Dakota. No, challenge the entire system. A few years back it was a kid accused of shoplifting [0]. Then a man dragged away while his family was crying [1]. Unless we fight back, we are all guilty until cleared.<p>[0]: <a href="https:&#x2F;&#x2F;www.theregister.com&#x2F;2021&#x2F;05&#x2F;29&#x2F;apple_sis_lawsuit&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.theregister.com&#x2F;2021&#x2F;05&#x2F;29&#x2F;apple_sis_lawsuit&#x2F;</a><p>[1]: <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=23628394">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=23628394</a>
    • rcvassallo831 hour ago
      The thing about the legal system is there&#x27;s no incentive to investigate to find the truth.<p>The incentive is to prosecute and prove the charges.<p>Speaking from the experience of being falsely accused after calling 911 to stop a drunk woman from driving.<p>The narrative they &quot;investigated&quot; was so obviously false that bodycam evidence directly contradicted multiple key facts. Officials are interested only in proving the case. Thankfully the jury came to the right verdict.
      • hnuser12345625 minutes ago
        I would absolutely never call the police on a woman. Simply walk far away and let her be someone else&#x27;s problem.
    • latexr1 hour ago
      Yes, of course someone should have investigated, but the larger point here is that people don’t because they are being sold a false narrative that AI is infallible and can do anything.<p>We could sit here all day arguing “you should always validate the results”, but even on HN there are people loudly advocating that you don’t need to.
      • harshreality19 minutes ago
        Where are you seeing people being told that AI is infallible? AI is being hyped to the moon, but &quot;infallible&quot; is not one of the claims.<p>To the extent people trust AI to be infallible, it&#x27;s just laziness and rapport (AI is rarely if ever rude without prompting, nor does it criticize extensive question-asking as many humans would, it&#x27;s the quintessential <i>enabler</i>[1]) that causes people to assume that because it&#x27;s useful and helpful for so many things, it&#x27;ll be right about everything.<p>The models all have disclaimers that state the inverse. People just gradually lose sight of that.<p>[1] This might be the nature of LLMs, or it might be by design, similar to social media slop driving engagement. It&#x27;s in AI companies&#x27; interest to have people buying subscriptions to talk with AIs more. If AI goes meta and critiques the user, that&#x27;s bad for business.
        • jmalicki15 minutes ago
          &gt; Where are you seeing people being told that AI is infallible? AI is being hyped to the moon, but &quot;infallible&quot; is not one of the claims.<p>I see all kinds of people being told that AI-based AI detection software used for detecting AI in writing is infallible!<p>You want to make sure people aren&#x27;t using fallible AI? Use our AI to detect AI! What could possibly go wrong?
      • dpkirchner1 hour ago
        We can barely convince the powers that be that eye-witness testimony is unreliable, after all.
    • bl4ckneon4 hours ago
      I think you missed many important points.<p>&quot;The trauma, loss of liberty, and reputational damage cannot be easily fixed,&quot; Lipps&#x27; lawyers told CNN in an email.<p>That sounds a LOT like a statement you make before suing for damages, not to mention they literally say &quot;Her lawyers are exploring civil rights claims but have yet to file a lawsuit, they said.&quot;<p>This lady probably just wants to go back to normal life and get some money for the hell they put her through. She has never been on an airplane before; I doubt she is going to take on the entire system like you suggest. &quot;Challenge the entire system&quot; is easier said than done; what does that even mean, exactly?
      • 3eb7988a16634 hours ago
        It was worse than that, per the reporting from an earlier story[0]:<p><pre><code> ...Unable to pay her bills from jail, she lost her home, her car and even her dog. </code></pre> There is not a jury in the country that will side against this woman. I am not even sure who will make the best pop culture mashup - John Wick or a country songwriter?<p>(Also, what happened to journalism - no Oxford comma?)<p>[0] <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=47356968">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=47356968</a>
        • cguess3 hours ago
          As an aside, AP Style is to not use an Oxford comma, and that&#x27;s been the rule for 50+ years: <a href="https:&#x2F;&#x2F;www.prnewsonline.com&#x2F;explainer-how-to-use-oxford-commas-per-ap-style&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.prnewsonline.com&#x2F;explainer-how-to-use-oxford-com...</a>
          • 3eb7988a16633 hours ago
            This is upsetting.
            • erikerikson2 hours ago
              Yes, finding out how badly wrong you were is never fun. Of course, the lack of ubiquitous Oxford comma use is itself, and separately, displeasing.
          • krferriter2 hours ago
            AP Style is simply wrong on this, then.
        • segmondy3 hours ago
          You have more faith in the country than I do.
        • frankharv3 hours ago
          Indeed: let out on Christmas Eve with no money, 1,000 miles from your home.<p>Where your home was lost to foreclosure because one JUDGE did not look at the paperwork.<p>There should be a way to personally sue somebody when they don&#x27;t do their job of protecting the innocent. The JUDGE failed badly here.<p>Flimsy evidence would mean no warrant. Do your basic investigation, please... A rubber-stamping JUDGE caused this.<p>Why are they not named? As if they were a spectator. In fact, they are the cause.
        • redeeman1 hour ago
          anyone in the chain of responsibility should be punished so severely that they will be still crying about it in 2030
    • themafia33 minutes ago
      &gt; Whether it&#x27;s AI that flagged her<p>It absolutely was. There&#x27;s no question of this. Now we need to ask: how was the system marketed, what did the police pay for it, and how were they trained to use it?<p>&gt; anybody bothered to ask her &quot;where were you the morning of july 10th between 3 and 4pm.<p>Legally that amounts to &quot;hearsay&quot; and cannot have any value. Those statements probably won&#x27;t even be admissible in court without other supporting facts entered in first.<p>&gt; we are all guilty until cleared.<p>This is not a phenomenon that started with AI. If you scratch the surface, even slightly, you&#x27;ll find that this is a common strategy used against defendants who are perceived as not being financially or logistically capable of defending themselves.<p>We have a private prison industry. The line between these two outcomes is very short.
      • LocalH4 minutes ago
        &gt;Legally that amounts &quot;hearsay&quot; and cannot have any value.<p>How is that hearsay if she&#x27;s directly testifying to her own whereabouts?<p>Hearsay would be if someone <i>else</i> were testifying &quot;she was in X location on july 10th between 3 and 4pm&quot;, <i>without</i> the accused being available for cross-examination.
    • tmpz221 hour ago
      IANAL but AFAIK custodial interrogation triggers Miranda, lawyers, and those awful awful civil liberties we’re trying to get rid of.<p>Better just to apply Musk or Altman software to the problem and avoid it entirely.
  • garethsprice2 hours ago
    The vendor they used, Clearview AI, does not allow you to request data deletion unless you live in one of the half-dozen states that legally mandate it.<p><a href="https:&#x2F;&#x2F;www.clearview.ai&#x2F;privacy-and-requests" rel="nofollow">https:&#x2F;&#x2F;www.clearview.ai&#x2F;privacy-and-requests</a><p>I have suddenly become very interested in New York&#x27;s S1422 Biometric Privacy Act.
    • dawnerd2 hours ago
      Sadly this is really the only tool we have right now. Just have to keep spamming them with delete requests, because once they delete it, it&#x27;ll end up back in their database eventually.
    • KomoD1 hour ago
      related <a href="https:&#x2F;&#x2F;noyb.eu&#x2F;en&#x2F;criminal-complaint-against-facial-recognition-company-clearview-ai" rel="nofollow">https:&#x2F;&#x2F;noyb.eu&#x2F;en&#x2F;criminal-complaint-against-facial-recogni...</a>
    • guelo1 hour ago
      To get your data deleted in the states that require it you have to submit a photo of yourself which I really don&#x27;t want to do for a sketchy company with ties to evil billionaire Peter Thiel.
  • tlogan5 hours ago
    This is a weak or misleading story about AI.<p>First, the detective used the FaceSketchID system, which has been around since around 2014. It is not new or uniquely tied to modern AI.<p>Second, the system only suggests possible matches. It is still up to the detective to investigate further and decide whether to pursue charges. And then it is up to the court to issue the warrant.<p>The real question is why she was held in jail for four months. That is the part that I do not understand. My understanding is that there is a 30-day limit (the requesting state must pick up the defendant within 30 days). Regarding the individual involved, Angela Lipps, she has reportedly been arrested before, so it is possible she was on parole. So maybe they were holding her because of that?<p>Can someone clarify how that process works?
    • suzzer994 hours ago
      In the US there are no consequences for people in power failing to follow procedures, laws or regulations - except for being told to stop doing whatever illegal thing they&#x27;re doing, and possibly getting sued way down the line, which gets paid by taxpayers.
      • tlogan4 hours ago
        From reading more into the case, it seems the issue may be related to how her lawyer handled the case.<p>They probably did an “identity challenge”, arguing that she is not the right person. But from Tennessee’s perspective, she was considered the correct person to be arrested, so there was no “mistaken identity” in their system. In other words, North Dakota wanted person X, and here is person X.<p>Once a judge in North Dakota reviewed the full evidence (and found that the person they issued the arrest warrant for is not the one they want), the case was dismissed.
        • frankharv3 hours ago
          Yes but a judge issued the warrant in the first place.<p>Cops did not do a proper investigation and the judge green-lighted it.<p>It is all on the JUDGE or possibly a magistrate who approved a faulty warrant.<p>The judge failed the poor woman. FIRE him.<p>Then sue Clearview for big bucks.
          • tlogan1 hour ago
            The judge likely issued the warrant based on the detective’s sworn testimony. In most cases, a judge does not have the ability or detailed knowledge to independently verify whether the detective completed all necessary checks.<p>This situation likely resulted from either sloppy investigative work or an honest mistake: the detective believed her booking photo matched the individual captured on camera.<p>Her booking photo from a prior arrest can be found here: <a href="https:&#x2F;&#x2F;mugshots.com&#x2F;US-States&#x2F;Tennessee&#x2F;Carter-County-TN&#x2F;Angela-Robin-Lipps.173032553.html" rel="nofollow">https:&#x2F;&#x2F;mugshots.com&#x2F;US-States&#x2F;Tennessee&#x2F;Carter-County-TN&#x2F;An...</a><p>Do we have a recording of the suspect they used for the match?
            • natebc1 hour ago
              There&#x27;s a screenshot from a security camera with the _actual_ suspect in the article that was posted to HN last week.<p><a href="https:&#x2F;&#x2F;www.grandforksherald.com&#x2F;news&#x2F;north-dakota&#x2F;ai-error-jails-innocent-grandmother-for-months-in-north-dakota-fraud-case" rel="nofollow">https:&#x2F;&#x2F;www.grandforksherald.com&#x2F;news&#x2F;north-dakota&#x2F;ai-error-...</a>
          • contrast2 hours ago
            what&#x27;s with the weird obsession all over the thread that it is the JUDGE who is the only person at fault here?
            • wl12 minutes ago
              Especially considering the judge is the only person involved in this who is completely immune from being sued.
            • lovich47 minutes ago
              It’s the same poster, I assumed they were ai at first but the account is from 2017.<p>Some people are just weird
      • AnthonyMouse2 hours ago
        [deleted]
        • everforward2 hours ago
          This isn’t how it works; you can invoke your right to a speedy trial at any point you want. You can spend 2 months waiting and then invoke it if you want.<p>The timer starts from when you invoke it, though.<p>The 2 issues, which she may be caught in, are that it’s “speedy” from the perspective of a court, and that it really means “free from undue delays”.<p>There is no general definition of a speedy trial, but I think the shortest period any state defines is a month (with some states considering several months to still be “speedy”).<p>A trial can still be speedy even past that window if the prosecution can make a case that they genuinely need more time (like waiting for lab tests to come back).<p>It’s basically only ever not speedy if the prosecution is just not doing anything.
        • gamblor9561 hour ago
          <i>You get charged with something and if you want to have the trial right now, before you have any idea what&#x27;s going on, then you can insist, which basically nobody does because it&#x27;s pretty crazy to go in blind</i><p>Actually most criminal defense attorneys recommend not waiving your speedy trial rights. Yes, the defense goes in blind. But so does the prosecution, and they&#x27;re the ones that have to make a case.<p>The usual result for defendants that don&#x27;t waive their speedy trial rights is an acquittal if the case goes to trial (between 50-60%), which doesn&#x27;t sound like a lot but prosecutors are expected to win &gt;90% of their trials. Additionally, in many counties they don&#x27;t have sufficient courtrooms to handle all the criminal trials within the speedy trial timeframe, so if the trial date comes and a courtroom is not available the case is dismissed <i>with prejudice</i>. Nonviolent misdemeanors are the lowest priority for a courtroom (and by that I mean even family law cases have priority over nonviolent misdos in most counties), so those cases are frequently dismissed a day or two before the trial date. Consequently, most prosecutors will offer better and better plea bargains as the trial date approaches.<p>This is even more true for murders, which is why murder suspects don&#x27;t usually get charged for a year or two after the crime.
          • AnthonyMouse1 hour ago
            Apparently I set up a unit test for Cunningham&#x27;s Law today.
    • SpicyLemonZest58 minutes ago
      &gt; The real question is why she was held in jail for four months. That is the part that I do not understand. My understanding is that there is 30-day limit (the requesting state must pick up the defendant within 30 day). Regarding the individual involved, Angela Lipps, she has reportedly been arrested before, so it is possible she was on parole. So maybe they were holding her because of that?<p>As the article gestures towards, challenging the extradition can greatly extend the timeline, from 30 days after the arrest to 90 days after a formal identity hearing. Which isn&#x27;t fair and isn&#x27;t intuitive, but is unfortunately a long-standing part of the system. (Even worse, this kind of mistaken identity <i>can&#x27;t</i> be challenged in an extradition hearing; the question isn&#x27;t whether she&#x27;s the person who committed the crime but whether she&#x27;s the person identified in the warrant.)
      • tlogan40 minutes ago
        That matches my assumption. I assume whoever was representing her made a mistake by challenging the warrant, and that caused the delay in extradition.
    • strictnein3 hours ago
      I wish I could find the link, but I believe she was in jail on parole violation, unrelated to anything that the &quot;AI&quot; flagged her on.
      • Supermancho3 hours ago
        Her picture was used as part of a fake id card, in the commission of a crime. The fuzzy camera footage looked like her (from stills I&#x27;ve seen) and her picture was on the fake ID. Those 2 circumstantial items were, apparently, enough to have a warrant issued.<p>They picked her up in TN and held her for 4 months, even after:<p>The ND police knew the ID was fake and the person using it was not her. The ND police knew she had been in TN before, during, and after the crime.<p>She is still technically a suspect, even after all of this has come out.
        • tlogan48 minutes ago
          Ok. The mistake was made by North Dakota police (and they blame AI - the AI just gave them a possible match. Whatever.).<p>What I still do not understand is why she spent nearly six months in a Tennessee jail. That part remains unclear and needs further explanation.
          • p_l19 minutes ago
            From the first time the story surfaced: for spurious reasons[1] she was booked as a fugitive, and that meant there was &quot;no need&quot; for the normal hearing timeframe.<p>[1] The reason being that she was found in Tennessee while being sought for a crime in another state, thus allowing them to treat her as an interstate fugitive from a crime scene
      • zoklet-enjoyer3 hours ago
        She was not<p>Source: I live in Fargo and have been following this story closely. Everyone here is pissed
        • frankharv3 hours ago
          Thanks for clarifying.<p>I wonder who is slandering her more... WOW<p>Maybe the city&#x27;s insurance carrier hired a FIRM...<p>They will be taking a hit.
      • frankharv3 hours ago
        That is the first I have heard of that; a small unexplained blurb in this article. Already in jail on a parole violation...<p>Maybe she objected to the extradition order without good counsel.<p>&quot;I ain&#x27;t never been to N. Dakota&quot;. She found out the hard way how the law works.<p>What about the banks being hit? Surely they have good cameras. This was bad mojo. I would think a Wells Fargo&#x2F;BoA has a unit for this stuff.<p>Financial crimes, handled like this. The banks will be sued too, I suspect. Deep pockets settle out.
    • georgemcbay2 hours ago
      &gt; It is still up to the detective to investigate further and decide whether to pursue charges. And then it is up to court to issue the warrant.<p>This is how it should work, but I still think it is important to discuss these failures in the context of AI risks.<p>One of the largest real-world dangers of AI (as we define that now) is that it is often confidently wrong and this is a terrible situation when it comes to human factors.<p>A lot of people are wired in such a way that perceived confidence hacks right through their amygdala and they immediately default to trust, no matter how unwarranted.
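[Editor's note] The "confidently wrong" point above has a simple arithmetic core worth making explicit. A hedged sketch, where the gallery size and error rate are illustrative assumptions rather than figures from this case or any vendor's system: even a tiny per-comparison false-match rate yields thousands of innocent "hits" when one probe image is searched against a large gallery.

```python
# Illustrative base-rate arithmetic behind one-to-many face search.
# Numbers below are assumptions for the sketch, not real vendor figures.

def expected_false_positives(gallery_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged when a single probe
    image is compared against every face in a gallery."""
    return gallery_size * false_match_rate

# A system wrong only 0.1% of the time per comparison, searched against
# a gallery of 10 million faces, still flags roughly 10,000 innocent
# people for every single probe image.
print(expected_false_positives(10_000_000, 0.001))
```

Which is why a "match" from such a system can only ever be a lead to verify, never evidence on its own.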
  • mitchbob6 hours ago
    Earlier discussion (405 comments):<p><a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=47356968">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=47356968</a>
  • oopsiremembered5 hours ago
    Money quote from someone quoted in the article:<p>&quot;[I]t’s not just a technology problem, it’s a technology and people problem.&quot;<p>I can&#x27;t. I just can&#x27;t.
    • bryanrasmussen4 hours ago
      I&#x27;ve been hearing &quot;it&#x27;s not just... it&#x27;s a&quot; touted as a sign of AI writing recently. Personally I think it&#x27;s an AI sign because it&#x27;s a human thinking-shortcut sign, and AI copies it. But it would be funny if AI wrote the article and then hallucinated this specific money quote.
      • oopsiremembered4 hours ago
        I doubt this happened here, but FWIW, AI does have a habit of &quot;cleaning up&quot; (read: hallucinating) interview transcript quotes if you ask it to go through a transcript and pull quotes. You have to prompt AI very specifically to get it to not &quot;clean up&quot; the quotes when you ask it to do that task.
        • b1123 hours ago
          And why not?<p>If you look at examples of people quoting on the internet, lots are out of context, paraphrased, or made up.<p>AI is just mimicking what it has seen.
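[Editor's note] The "cleaned up quotes" failure mode described above admits a simple mechanical guard. A minimal sketch, using a hypothetical transcript and quotes (not the article's actual text): rather than trusting extracted quotes, check that each one appears verbatim in the source transcript and flag anything that does not.

```python
# Guard against hallucinated "cleaned up" quotes: verify that every
# extracted quote is a verbatim substring of the transcript.
# The transcript and quote strings here are hypothetical examples.

def unverified_quotes(transcript: str, quotes: list[str]) -> list[str]:
    """Return the quotes that do NOT appear verbatim in the transcript."""
    return [q for q in quotes if q not in transcript]

transcript = "Well, it's not just a technology problem, it's a technology and people problem."
pulled = [
    "it's not just a technology problem, it's a technology and people problem.",
    "This is fundamentally a technology and people problem.",  # "cleaned up"
]
print(unverified_quotes(transcript, pulled))  # flags only the second quote
```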
  • mememememememo1 hour ago
    Wow, I thought the bar for probable cause for an arrest warrant would be much higher. Especially to drag someone from another state.
  • indigodaddy2 hours ago
    A lot of dumb shit happens in this arena, where if you had just one smart cop, it could have been prevented. Here’s one from 2023:<p><a href="https:&#x2F;&#x2F;youtu.be&#x2F;lPUBXN2Fd_E" rel="nofollow">https:&#x2F;&#x2F;youtu.be&#x2F;lPUBXN2Fd_E</a>
  • jqpabc1236 hours ago
    AI is a liability issue waiting to happen. And this is just another example.
    • gtowey5 hours ago
      It&#x27;s the opposite, it&#x27;s absolution from liability. &quot;The AI did it&quot; is the ultimate excuse to avoid accepting responsibility and consequences.
      • jqpabc1235 hours ago
        Courts are already refusing to accept this excuse.<p><a href="https:&#x2F;&#x2F;pub.towardsai.net&#x2F;the-air-gapped-chronicles-the-court-asked-for-the-llms-reasoning-48471090eada" rel="nofollow">https:&#x2F;&#x2F;pub.towardsai.net&#x2F;the-air-gapped-chronicles-the-cour...</a>
        • gtowey41 minutes ago
          Good to know it&#x27;s not a fait accompli yet, but I won&#x27;t be surprised to see corporations pushing hard for this.
      • Hizonner5 hours ago
        ... which is why the institutions that assign responsibility and consequences need to make it really clear that excuse won&#x27;t fly. With illustrative examples.
    • garyfirestorm6 hours ago
      It’s a tool. Used incorrectly, it will lead to errors. Just like a hammer: used incorrectly, it could hit the user’s finger.
      • happytoexplain6 hours ago
        There is enormous variability in how hard a tool is to use correctly, how likely it is to go wrong, and how severe the consequences are. AI has a wide range on all those variables because its use cases vary so widely compared to a hammer.<p>The use case here is police facial recognition. Not hitting nails. The parent wasn&#x27;t saying &quot;AI is a liability&quot; with no context.
        • mikkupikku5 hours ago
          When somebody uses a tool to hurt somebody, they need to be held accountable. If I smack you with a hammer, that needs to be prosecuted. Using AI is no different.<p>The problem here is incidental to the tool; it was done by the cops and therefore nobody will be held accountable.
          • tovej5 hours ago
            Systems are also a tool. Whoever institutes and helps build the system that systematically results in harm is also responsible.<p>That would be the vendors, the system planners, and the institutions that greenlit this. It would also include the larger financial tech circle that is trying to drive large scale AI adoption. Like Peter Thiel, who sees technology as an &quot;alternative to politics&quot;. I.e. a way to circumvent democracy [1]<p>[1] <a href="https:&#x2F;&#x2F;stavroulapabst.substack.com&#x2F;p&#x2F;techxgeopolitics-18-technology-as" rel="nofollow">https:&#x2F;&#x2F;stavroulapabst.substack.com&#x2F;p&#x2F;techxgeopolitics-18-te...</a>
      • tgv5 hours ago
        This tool, however, is specifically built for mass surveillance. It serves no other purpose. The tool is broken, and everybody knows it. The tool makers are at least as guilty as those who use it.
        • cyanydeez5 hours ago
          The tool, like Google search, is likely biased towards returning results regardless of confidence.
      • jqpabc1235 hours ago
        <i>Used incorrectly will lead to errors.</i><p>Only one small problem --- there is no way to tell if you are using it &quot;correctly&quot;.<p>The only way to be sure is not to use it.<p>Using it basically boils down to &quot;Do you feel lucky?&quot;.<p>The Fargo police didn&#x27;t get lucky in this case. And now the liability kicks in.
        • nkrisc5 hours ago
          Some basic investigatory police work (the kind they did before AI) would have revealed the mistake before an innocent woman’s life was destroyed.
          • jqpabc1235 hours ago
            Yes. But doing the investigation negates much of the incentive for using AI.<p>Look for similar to play out elsewhere --- using unreliable tools for decision making is not a good, responsible business plan. And lawyers are just waiting to press the point.
            • nkrisc4 hours ago
              In this case it sounds as though AI could have been used to generate preliminary leads. When someone calls a tip line with information, police don’t just take their word for it, they investigate it. They know that tips they receive may be incorrect. They should have done the exact same here, but they didn’t.<p>I’m very opposed to AI in general, but this one is clearly human failure.<p>The noteworthy AI angle is the undeserved credence police gave to AI information. But that is ultimately their failure; they should be investigating all information they receive.
              • jqpabc1234 hours ago
                <i>...but this one is clearly human failure.</i><p>Absolutely.<p>The failure starts with tool vendors who market these statistical&#x2F;probabilistic pattern searchers as &quot;intelligent&quot;. The Fargo police failed to fully evaluate these marketing claims before applying them to their work.<p>So in the same way that the failure rolled down hill, liability needs to roll back up.
            • bornfreddy5 hours ago
              AI can provide leads. Someone still needs to verify them and decide.
              • jqpabc1234 hours ago
                Generating and verifying bad leads costs money. Not verifying bad leads can cost much more.<p>At some point, you have to decide if wasting good money on bad intel makes sense.
          • SpicyLemonZest1 hour ago
            The article says that the Fargo police claimed to have done &quot;additional investigative steps independent of AI&quot;. (Perhaps they&#x27;re lying, or did a poor job because they thought the extra steps were a formality.)
        • jfengel5 hours ago
          Now the &quot;qualified&quot; immunity kicks in.
          • jqpabc1235 hours ago
            We will find out. But relying on AI is likely to cost the city of Fargo in one way or another. They say they have already stopped using AI and returned to good old fashioned human investigation.<p><a href="https:&#x2F;&#x2F;www.lawlegalhub.com&#x2F;how-much-is-a-wrongful-arrest-lawsuit-worth&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.lawlegalhub.com&#x2F;how-much-is-a-wrongful-arrest-la...</a>
        • zephen2 hours ago
          Look, I&#x27;m generally considered AI&#x27;s most vociferous detractor.<p>But...<p>&gt; there is no way to tell if you are using it &quot;correctly&quot;.<p>This simply isn&#x27;t true, at least in cases like this.<p>I know common sense isn&#x27;t really all that common, but why would you give more credence to an untested tool than an untested crack-addled human informant?<p>The entire point of the informant, or the AI in this instance, is to generate <i>leads.</i> Which subsequently need to be checked.
          • jqpabc12350 minutes ago
            There is no &quot;correct&quot; way to use AI in order to avoid bad results. The only prudent approach is to assume all results are bad until proven otherwise.<p>But this approach negates much of the incentive to pay for questionable results.
      • MattDaEskimo5 hours ago
        What kind of outcome results from misuse? Clearly a hammer&#x27;s misuse has very little in common with a global, hivemind network used in high-stakes campaigns.<p>Now, if I misused a hammer and it hurt everyone&#x27;s thumb in my country, then maybe what you said would have some merit.<p>Otherwise, I&#x27;d say it&#x27;s an extremely lazy argument.
      • hrimfaxi3 hours ago
        Unlike with hammers, people preface things with &quot;claude says&quot;, etc. I never see that kind of distancing with tools that aren&#x27;t AI.
      • suzzer995 hours ago
        Dynamite is a tool. But we don&#x27;t hand it out to anyone who wants to play with it.
        • mikkupikku5 hours ago
          We used to until quite recently. Anybody could buy dynamite at the hardware store. We had to end this because of criminals using it to hurt people.
          • suzzer994 hours ago
            I admit I was surprised to see you could buy dynamite in a hardware store until 1970.
          • jqpabc1235 hours ago
            Look for AI to follow a similar trajectory over time.
            • GaryBluto5 hours ago
              Impossible at this point. You cannot download dynamite.
            • mikkupikku5 hours ago
              Yes, regulation is inevitable.
              • jfengel5 hours ago
                Regulation is impossible. The AI barons literally control the federal government, so not even state regulations get tried.
          • jfengel5 hours ago
            Except this time the criminals are police.
            • AngryData4 hours ago
              They are far more often than anyone wants to admit. That&#x27;s how we got 25% of the world&#x27;s prison population.
              • rootusrootus3 hours ago
                AFAIK the actual cause of our high incarceration rate is that we have longer sentences. The conviction rate, for example, is similar to the UK’s.
      • skeeter20205 hours ago
        AI feels closer to a firearm than a hammer when assessing law enforcement&#x27;s ability to quickly do massive, unrecoverable harm.
  • jeremie_strand58 minutes ago
    [dead]
  • ValveFan69691 hour ago
    [dead]
  • casey26 hours ago
    [flagged]
    • suzzer994 hours ago
      I would say much more likely that it was because she was poor and couldn&#x27;t afford a good lawyer.
      • AngryData4 hours ago
        This. She likely had a shitty public defender who did the bare minimum because they were catering to paying clients. The state was playing hardball because they wanted to make a profit off a poor person with a shitty defense, and the public defender was sitting on the bench at a teeball tournament because they weren&#x27;t getting paid enough and didn&#x27;t want to try.
    • IncreasePosts4 hours ago
      What? Women are much more sympathetic figures when it comes to crime and punishment. And there are 10x more men in prison in america than women. If you were trying to &quot;introduce&quot; some nefarious law enforcement system to the US you would use it on undesirable men first (drug addicts and gang members)
    • jstanley5 hours ago
      You think they deliberately chose to do this to a woman? Why?
      • cyanydeez5 hours ago
        Probably just reading the room, with states like Texas making abortions illegal and allowing random citizens to enforce that.<p>Famously, abortions are a woman thing.<p>Anyway, looking through the facts, it&#x27;s just some random woman. There&#x27;s better evidence that these facial recognition systems are much worse with minorities than across genders.<p>Interesting biases are own-gender: <a href="https:&#x2F;&#x2F;pmc.ncbi.nlm.nih.gov&#x2F;articles&#x2F;PMC11841357&#x2F;" rel="nofollow">https:&#x2F;&#x2F;pmc.ncbi.nlm.nih.gov&#x2F;articles&#x2F;PMC11841357&#x2F;</a><p>Racial bias:<p><a href="https:&#x2F;&#x2F;mitsloan.mit.edu&#x2F;ideas-made-to-matter&#x2F;unmasking-bias-facial-recognition-algorithms" rel="nofollow">https:&#x2F;&#x2F;mitsloan.mit.edu&#x2F;ideas-made-to-matter&#x2F;unmasking-bias...</a><p>Miss rates:<p><a href="https:&#x2F;&#x2F;par.nsf.gov&#x2F;servlets&#x2F;purl&#x2F;10358566" rel="nofollow">https:&#x2F;&#x2F;par.nsf.gov&#x2F;servlets&#x2F;purl&#x2F;10358566</a><p>Although you can probably interpret the facts differently, we&#x27;ve seen how any search function gets enshittified: once people get used to searching for things, they tend to select something that returns results over something that fails to return results.<p>Rather than users blaming themselves, they blame the search tool. As such, any search system over time will bias toward returning results (e.g., Outlook) rather than accuracy.<p>So if these systems more easily miss certain classes of people - women, minorities - those people will more likely be surfaced as inaccurate matches, whereas men will have a higher confidence of being screened out.<p>That&#x27;s how I interpret this two-second comment.
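[Editor's note] The "biased toward returning results" dynamic above can be made concrete with a toy sketch. The 2-D vectors and the 0.9 threshold are made-up assumptions (real face-recognition systems use high-dimensional embeddings, and nothing here reflects any vendor's actual pipeline): a top-k search "succeeds" on every probe, while a thresholded search can honestly report that nobody matches.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def search(gallery, probe, k=1, threshold=None):
    """Return the top-k gallery entries by similarity; optionally drop weak matches."""
    scored = sorted(((cosine(vec, probe), name) for name, vec in gallery.items()), reverse=True)
    top = scored[:k]
    if threshold is None:
        return top  # always "finds" someone, however poor the match
    return [(score, name) for score, name in top if score >= threshold]

# Toy 2-D "embeddings"; the probe resembles nobody enrolled.
gallery = {"person_a": [1.0, 0.0], "person_b": [0.8, 0.6]}
probe = [0.0, 1.0]

print(search(gallery, probe))                 # unthresholded: returns a weak match anyway
print(search(gallery, probe, threshold=0.9))  # thresholded: returns [] (honest "no match")
```

The unthresholded variant never fails from the user's point of view, which is exactly the property that makes it feel reliable while quietly surfacing spurious matches.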
  • renewiltord4 hours ago
    [flagged]
    • rootusrootus3 hours ago
      Has it not been fairly common to require police officers to have a bachelor’s degree? Or an associate’s? I think recently that has been relaxed but I’ve lived in places where it was absolutely a requirement.<p>I don’t think they’re as stupid as you suggest.
      • voakbasda2 hours ago
        Police departments are known to avoid hiring people that get high marks in school, under the principle that such individuals will become bored with the job and quit. They literally look for average people with average intelligence: C students.<p>Now factor in the slow decline of our educational institutions, where grade inflation has systematically diminished the credibility of a degree. I would wager that many C students today would have failed out completely 30 years ago.<p>In that light, it is not surprising that people are seeing ICE agents behave like brown shirts. No one in power wants those people asking any kind of hard questions about what they are being ordered to do.
      • lovich40 minutes ago
        Police departments won the right to discriminate _against_ intelligence in 1997 on the Jordan vs The City of New London case[1].<p>They literally aim to be dumber than average.<p>[1] <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Wonderlic_test" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Wonderlic_test</a>
      • llbbdd3 hours ago
        Having a degree is a very low bar for intelligence.
      • zephen2 hours ago
        &gt; I don’t think they’re as stupid as you suggest.<p>I&#x27;ll just leave this here:<p><a href="https:&#x2F;&#x2F;abcnews.com&#x2F;US&#x2F;court-oks-barring-high-iqs-cops&#x2F;story?id=95836" rel="nofollow">https:&#x2F;&#x2F;abcnews.com&#x2F;US&#x2F;court-oks-barring-high-iqs-cops&#x2F;story...</a>
        • rootusrootus1 hour ago
          It says the national average is just slightly above average IQ. Sounds fine.
          • zephen1 hour ago
            &gt; Sounds fine.<p>Really? Maybe your perception of the &quot;average&quot; person is colored by where you live and who you interact with.<p>In any case, the dumber they are, the more lethal they are.<p><a href="https:&#x2F;&#x2F;www.sciencedirect.com&#x2F;science&#x2F;article&#x2F;abs&#x2F;pii&#x2F;S0160289618300734" rel="nofollow">https:&#x2F;&#x2F;www.sciencedirect.com&#x2F;science&#x2F;article&#x2F;abs&#x2F;pii&#x2F;S01602...</a>
  • giardini2 hours ago
    This has been posted at least twice before on HN.