14 comments

  • tass 55 minutes ago
    I'm not usually an apologist, and I'd agree with this judgement if the car had been left to its own devices, but the driver held his foot on the accelerator, which is why it blew through those stop signs and lights.

    Regarding the Autopilot branding: would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?
    • Gud 1 minute ago
      A "reasonable person" in a cockpit is not the same as a "reasonable person" behind the steering wheel.

      Pilots undergo rigorous training, with exam after exam they must pass.

      No one is handed the keys to a Boeing 747 after some weekly evening course and an hour's driving test.
    • jrjeksjd8d 48 minutes ago
      The average person does not know how to fly a plane or what a plane's autopilot does. It's a ridiculous, superficial comparison. Planes have professional pilots who understand the capabilities and limits of aviation autopilot technology.

      Tesla has had it both ways for ages: their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".
      • nitinreddy88 36 minutes ago
        According to your analogy, certified pilot = certified driving-license holder. It's not like Tesla is advertising that someone without a license, or an ineligible person, can drive using Autopilot. I wonder how you can justify your statement.
        • tapoxi 23 minutes ago
          Autopilot is part of a private pilot's license, and the systems are approved by the FAA. Tesla's Autopilot isn't part of a driving license, nor did it undergo review by the NHTSA prior to launch, because Elon considered it "legal by default".
      • nickff 42 minutes ago
        If the average person does not know what an autopilot does, why would they expect Tesla's 'autopilot' to take such good care of them? I am reminded of a case many years ago when a man turned on the cruise control in his RV and went to the back to make himself lunch, after which the RV went off some sort of hill or cliff.

        Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.
  • xiphias2 1 hour ago
    This case will make settlement amounts higher, which is the main thing car companies care about when making decisions about driving features/marketing.

    With Robotaxi it will get even higher, as it will clearly be 100% the company's fault.
    • 1970-01-01 1 hour ago
      Fight Club 2.0: You pay to retrain it only if the AI will kill more people than our settlement fund can pay out.
      • coredog64 58 minutes ago
        You're already downvoted, but this quote from Fight Club always annoyed me, as it misunderstands how recalls work.

        1. Insurance companies price in the risk, and insurance pricing absolutely influences manufacturers (see the absolute crap that the Big 3 sold in the 70s).
        2. The government can force a recall based on a flaw whether or not the manufacturer agrees.
        • 1970-01-01 55 minutes ago
          v2.0: Tesla drivers insure with Tesla, and the recalls are all OTA software fixes.
  • jqpabc123 2 hours ago
    Tesla Apologists: The judge/jury agreed that Tesla was "Full Self Driving" all the way to the scene of the crash.
    • dekhn 1 hour ago
      If I read the article it says autopilot, not FSD.
      • palmotea 1 hour ago
        > If I read the article it says autopilot, not FSD.

        What's the difference? And does it matter?

        Both are misleadingly named, per the OP:

        > In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”

        > Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.

        > This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.
        • dekhn 45 minutes ago
          Autopilot is similar to cruise control that is aware of other cars, plus lane keeping. I would fully expect the sort of accident that happened to happen (drop phone, stop controlling vehicle, it continues through an intersection).

          FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect this sort of accident to happen with FSD.

          The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.
          • jqpabc123 30 minutes ago
            *Autopilot is similar to cruise control that is aware of other cars, and lane keeping.*

            Thanks for explaining why labeling it "Autopilot" is deceptive.
        • atonse 51 minutes ago
          I remember having this argument with a friend.

          My argument was that the idea that the name "Autopilot" is misleading comes not from Tesla naming it wrong; it comes from what most people think autopilots on an aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)

          Autopilot on a Tesla historically did two things: traffic-aware cruise control (keeps a gap from the car in front of you) and staying in its lane. If you tell it to, it can suggest and change lanes. In some cases, it'll also take an exit ramp (which was called Navigate on Autopilot).

          Autopilots on planes do roughly the same. They keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane. (Just as Tesla drivers still get you on the highway and off.)

          Full Self Driving (to which they've now added the word "Supervised", probably from court cases, though it was always quite obvious that it was supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot, btw) is a different AI model that even stops at traffic lights, navigates parking lots, everything. That's the true "summon my car from LA to NY" dream, at least.

          So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.

          But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections)? I don't know the details of each recent crash.
          • Retric 44 minutes ago
            Autopilots do a lot more than that, because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking.

            Tesla's Autopilot being unable to swap from one road to another makes it far less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.
            • beering 34 minutes ago
              Doesn’t basic airplane autopilot just maintain flight level, speed, and heading? What are some other things it can do?
              • Retric 12 minutes ago
                Recovering from upsets is the big thing. Maintaining flight level, speed, and heading while upside down isn't acceptable.

                Levels of safety are another consideration: car autopilots don't use multiple levels of redundancy on everything, because cars can stop without falling out of the sky.
          • roywiggins 41 minutes ago
            Airplane "autoland" goes back a ways:

            https://en.wikipedia.org/wiki/Autoland
    • keeganpoppen 49 minutes ago
      well the other person in the comments said the guy literally held his accelerator to the floor the entire time. is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? ironic, given that you go out of your way to accuse others of this. methinks you doth protest too much?
      • selridge 46 minutes ago
        Hope he sees this, bro
  • maxdo 42 minutes ago
    I'm so lost. The guy decided to pick up the phone from the floor while driving the car at high speed.

    1. It could have been ANY car with similar auto-steer capabilities at that time.
    2. Why the hate? Because of some false promise? As of today, the same car would save the guy in the exact same situation, because FSD now handles red lights perfectly. Far better and safer than ANY other tech included in the average car price of the same segment ($40-50k).
    • madsmith 37 minutes ago
      Not sure if it's using the same FSD decision matrix, but my Model S chimed at me to drive into the intersection while sitting at a red light last night, with absolutely zero possibility it saw a green light anywhere in the intersection.

      "Perfectly" isn't a descriptor I would use. But this is just anecdotal.
    • SpicyLemonZest 2 minutes ago
      As the source article says, the jury did agree that the driver was mostly liable. They found Tesla partially liable because they felt that Tesla's false promise led to the driver picking up his phone. If they'd been more honest about the limitations of their Autopilot system, as other companies are about their assisted-driving functionality, the driver might have realized that he needed to stop the car before picking up his phone.
    • BugsJustFindMe 22 minutes ago
      > *Why the hate, because of some false promise?*

      Another name for "false promise" when made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it "proximate cause to manslaughter".
  • robotnikman 42 minutes ago
    Will this have any effect on other companies developing self driving tech? It sets a very high precedent for fines, and may discourage companies from further working on such tech.
    • janalsncm 26 minutes ago
      Developing, no. But once companies start releasing vehicles onto our shared public streets, I have a lot less tolerance for launching science experiments that end up killing bystanders.

      I can understand the argument that, in the abstract, over-regulation kills innovation, but in the US the pendulum has swung so far in the other direction that it's time for a correction.
    • mmooss 38 minutes ago
      That's an old argument by corporations against liability. Should they not be fully liable?

      It should discourage them from making unsafe products. If it's not economical for them to make safe products, it's good that they go bankrupt and the economic resources – talent, money – go to someone else. Bankruptcy and business failure are just as fundamental to capitalism as profit.
      • nickff 26 minutes ago
        These product-liability lawsuits are out of control; perhaps this judgement is directionally correct, but the punitive damages seem insane. This reminds me of the lawsuits which drove Instant Pot bankrupt, where the users were clearly doing very stupid things, and suffered injuries because they were able to physically overpower the safety mechanisms on their pressure-cookers.
        • mmooss 6 minutes ago
          > These product-liability lawsuits are out of control

          Businesses claim that all the time. We need some evidence.

          I remember doctors claiming that malpractice lawsuits were out of control; research I read said that it wasn't an economic issue for doctors, and that malpractice itself was out of control.
  • bsimpson 50 minutes ago
    Good.

    It seems clear that "Autopilot" was a boisterous overclaim of its capabilities that led to people dying.

    It may be minorly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller numbers don't act as an effective deterrent to people like Elon Musk.
    • eYrKEC2 22 minutes ago
      Right! We demand engineering perfection! No autopilot until we guarantee it will *NEVER* kill a soul. Don't worry that human drivers kill humans all the time. The rubric is not "better than a human driver", it is "an Angelic Driver". Perfection is what we demand.
  • josefritzishere 1 hour ago
    Tesla would benefit from the board replacing the CEO. It's increasingly clear that there is a problem, and it's not talent, it's decision-making.
    • dolphinscorpion 1 hour ago
      Their stock would crash to $10 without the hype machine
      • mmooss 1 hour ago
        Après moi, le déluge.
      • breakyerself 1 hour ago
        It will be zero if they keep doing the same shit
        • pm90 43 minutes ago
          Ultimately, I believe it will take something catastrophic to oust Musk / change leadership. And by that point, it's questionable whether anything worthwhile will be left to salvage.

          My current bet is that Optimus will fail spectacularly and Tesla gets left far behind as Rivian's R2 replaces it.

          One thing I will note: I know folks who work at TSLA. Musk is more of a distraction. If he goes and competent leadership is brought in, there's still enough people and momentum to make something happen...
        • madeofpalk 35 minutes ago
          Will it actually? Has the market sent any signal that they won’t tolerate Musk?<p>You’re a lot more optimistic about this than I am.
        • maxdo 46 minutes ago
          This is literally one of the 1-3 companies that have a decent strategy in the age of AI; the rest are pretending the changes will not affect them. Even this judgement: the guy decided to pick up the phone while driving a car not capable of red-light detection. It could have been any other car with similar auto-steer capabilities. Right now the same car, with OTA updates, would keep him alive. Sure, they are doing something wrong.
  • standardUser 1 hour ago
    I'm not clear on what Tesla is doing these days. They've been left in the dust on autonomous driving, they've failed to update their successful car models, and their only new model was a spectacular failure.
    • mey 1 hour ago
      Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?
      • palmotea 1 hour ago
        >> I'm not clear on what Tesla is doing these days.

        > Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?

        I believe Musk wants to hype humanoid robots, because he can't get away with irrationally hyping electric cars or self-driving technology like he used to.

        Tesla was never a car company; their real product is sci-fi dreams.
        • blackjack_ 53 minutes ago
          Agreed, and he’s already behind in humanoid robots, so the hype there won’t last long. The problem is that China is obliterating him at every turn because they actually build things that work instead of just hyping things and saying fake numbers of how much money it could be if every human on the planet bought 20.
    • Almondsetat 56 minutes ago
      By which metrics has Tesla been left in the dust with respect to autonomous driving? Right now they are the only brand where you can buy a car and have it do basically 90% (or sometimes 100%) of your daily driving. Sure, it's supervised, but the alternatives are literally extremely geofenced taxis.
      • WarmWash 30 minutes ago
        Tesla has a level 3 system that it's willing to gamble won't need intervention for a handful of miles, for a handful of Tesla fanboys. It's very telling that their "level 4" robotaxis are basically unicorns and only exist (existed? it's not clear they are even available anymore) in a single neighborhood subsection of the level 3 robotaxis' full area in Austin.

        Waymo, on the other hand, has a level 4 system, and has for many years, in many cities, with large service areas.

        Tesla is unquestionably in the dust here, and the delusional, er, faithful are holding out for this mythical switch flip where Elon snaps his fingers and every Tesla turns into a level 4 robotaxi (despite the compute power in these cars being on the level of a GTX 5090, and the robotaxis having custom hardware loadouts).
        • Almondsetat 13 minutes ago
          I don't understand the point of your reply. Waymo is geofenced taxis. You cannot buy a Waymo. It cannot drive basically wherever you want. Teslas mostly can. So, again, how is Tesla the one left in the dust?
      • bigyabai 48 minutes ago
        It's not free, is it? You buy the car, subscribe to their arbitrarily-priced subscription service, and *then* it does 90% of your driving.

        That's like paying for a "self-juicing juicer" that only works with proprietary juice packages sold through an overpriced subscription.

        Edit: Mostly a criticism. I have no bone to pick with Elon, but subscription slopware is the reason why Chinese EVs are more desirable to average Joes like me.
    • bryanlarsen 1 hour ago
      And as the lunar new year demo dance shows, China is leaving them in the dust building humanoid robots.
    • m463 1 hour ago
      My question too.

      Though they did update the Model Y (looks like a duck), they just cancelled the Model S and X.
    • WarmWash 38 minutes ago
      Optimus robots!

      In 2 years Tesla will be replacing most factory workers with fully autonomous robots that will do most of the work. This will generate trillions in revenue and is totally definitely trust-me-bro possible.

      Expect huge updates on this coming in the near future, soon. Tesla will be the most valuable company on Earth. Get in the stock now.

      (Cars, solar panels, energy storage, and robotaxis are no longer part of the roadmap because Optimus bots will bring in so much money in 2 years, definitely, that these things won't matter, so don't ask about them or think about them, thanks.)
    • mr_00ff00 1 hour ago
      Yet Tesla is trading near its all time high.
  • blinding-streak 1 hour ago
    Unsafe at any speed.
    • 1970-01-01 1 hour ago
      Too true. Just a few days ago it was determined it was unsafe at 0 speed. I'm not joking:

      https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-crashes-austin-month-4x-worse-than-humans/#:~:text=Jul%202025-,0%20mph,-SUV
  • tehjoker 1 hour ago
    It's crazy that they weren't reeled in by a regulator and it had to make it all the way through the court system. People are dead. A court judgement can't change that. Preemptive action would have.
  • palmotea 1 hour ago
    > Tesla also claimed that references to CEO Elon Musk’s statements about Autopilot during the trial misled the jury....

    > The company essentially argued that references to Elon Musk’s own public claims about Autopilot, claims that Tesla actively used to sell the feature for years, were somehow unfair to present to a jury. Judge Bloom was right to reject that argument.

    Of course, since Elon Musk has lied and over-promised a lot about Tesla's self-driving technology. It's an interesting defense to admit your CEO is a liar and can't be trusted.
  • wget02 1 hour ago
    [flagged]
  • WalterBright 1 hour ago
    [flagged]
    • palmotea 1 hour ago
      > It's an absurd judgement.

      > Consider also how many lives have been saved by the autopilot.

      > Be careful what you wish for.

      How many? Tell me.
      • breakyerself 1 hour ago
        It saved my life. I was standing on the Golden Gate Bridge looking over the edge. A Tesla Model 3 pulled over and started playing "Baby I Need Your Lovin'" at full volume. I cried and climbed into the car and it turned on the seat heater.
      • 1970-01-01 1 hour ago
        There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here's a few to get your algorithm working:

        https://www.youtube.com/watch?v=A3K410O_9Nc

        https://www.youtube.com/watch?v=Qy6SplEn4hQ

        https://www.youtube.com/watch?v=GcfgIltPyOA

        https://www.youtube.com/watch?v=Tu2N8f3nEYc
        • palmotea 53 minutes ago
          > There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here's a few to get your algorithm working:

          So? I just watched those. They don't prove anything about "lives [that] have been saved by the autopilot." They all look like scenarios a human driver could handle (and I, personally, have handled situations similar to some of those). If autopilot is saving lives, you have to show, statistically, that it's better than human drivers in comparable conditions.

          Also, the last one appears to be of a Tesla fanboy who had just left a Tesla shareholder meeting, and seems pretty biased. I'd say his Cybertruck actually reacted *pretty late* to the danger. It was pretty obvious from the dashcam that something was wrong several seconds before the car reacted to it *at the last second*.
          • ethmarks 41 minutes ago
            I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile. Compared to a statistical human driver, Waymo prevented around 411 collisions that would have resulted in any injury, and 27 collisions that would have resulted in serious injury or death. It seems like for Waymo specifically, self-driving cars are demonstrably safer than human drivers. Not sure if that generalizes to Tesla FSD, though.
            • palmotea 5 minutes ago
              > I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile.

              I think that's true. Though I recall that Waymo limits their cars to safer and more easily handled conditions, which is totally the right thing to do, but it probably means that statistic needs an asterisk.

              > Not sure if that generalizes to Tesla FSD, though.

              I don't think it does. They're two totally different systems.
          • 1970-01-01 49 minutes ago
            The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.
            • palmotea 43 minutes ago
              > The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.

              And I have doubts about that man's reliability.
        • WalterBright 1 hour ago
          Statistics aren't collected on that. But I've read anecdotes where individuals recounted the autopilot saving them from a severe accident.

          You can also google "how many lives has tesla autopilot saved?" and the results suggest that the autopilot is safer than human drivers.
          • triceratops 40 minutes ago
            That doesn't make any sense. If I, a human, hit the brakes in time and avoid an accident today, then hit someone tomorrow, should I not be held responsible for the second incident?
            • WalterBright 28 minutes ago
              The point is to compare the deaths caused by the autopilot against the additional deaths that would occur without it.

              If you accidentally kill someone with your car, do you think you should have to pay $243m?
              • palmotea 4 minutes ago
                > If you accidentally kill someone with your car, do you think you should have to pay $243m?

                If you have billions of dollars, and somehow can't go to prison, yes you should. If not in compensation to the victim, then in some kind of fine scaled to wealth. The amount paid needs to be high enough to be a deterrent to the individual who did wrong.
  • DoesntMatter22 1 hour ago
    This will continually be appealed until it’s reduced.
    • DannyBee 1 hour ago
      They claim to have a pretrial agreement to reduce it to 3x compensatory damages (which would make the total judgement 160 million instead of 243 million).

      Appealing is expensive because they have to post a bond with 100% collateral, and you pay for it yearly – in this case, probably around 8 million a year.

      So in *general* it's not worth appealing for 5 years unless they think they will knock off 25-30% of the judgement.

      Here it's the first case of its kind, so I'm sure they will appeal. But if they lose those appeals, most companies that aren't insane would cut their losses instead of trying to fight everything.
    • LeoPanthera 1 hour ago
      This was the appeal.
      • DannyBee 1 hour ago
        No it wasn't; it was a motion to set aside the verdict, made before the trial judge.

        The appeal will go to the 11th circuit.
      • tiahura 1 hour ago
        No it wasn't. This was the trial judge deciding not to reduce it. $43 million in compensatory damages is unusually high for a wrongful death.
        • jeffbee 1 hour ago
          $43 million does not seem spectacularly high compensation for killing someone at the age of 22.
          • dekhn 1 hour ago
            When my spouse worked in the area of determining "the value of an individual" (economically, not morally), it was computed as present-value lifetime earnings: the cumulative income of the individual, converted back to its current value (using some sort of inflation model). IIRC, the PVLE averaged out to about $1-10M.
            • nradov 1 hour ago
              You shouldn't be downvoted. Regardless of the moral or technical issues involved, there are established formulas used to calculate damages in wrongful-death civil suits. Your range is generally correct, although certain factors can push it higher. (Punitive damages are a separate issue.)
              • dekhn 55 minutes ago
                I generally don't complain about being downvoted, but it is always puzzling when I post a neutral fact without any judgement.
              • jeffbee 40 minutes ago
                There are not "established formulas" – or, to the extent that there are, the coefficients and exponents are not determined. The parties always argue about the discount rates and whatnot.
                • dekhn 29 minutes ago
                  Sure, no argument there. I was just referring to research like this: https://escholarship.org/uc/item/82d0550k

                  "Results. At a discount rate of 3 percent, males and females aged 20-24 have the highest PVLE – $1,517,045 and $1,085,188 respectively. Lifetime earnings for men are higher than for women. Higher discount rates yield lower values at all ages."
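The PVLE figure described above is just a discounted sum of annual earnings. A minimal sketch, where the flat $60k income and the 22-65 working span are illustrative assumptions and only the 3% discount rate comes from the cited study:

```python
def pvle(annual_income: float, start_age: int, retire_age: int, rate: float) -> float:
    """Present value of lifetime earnings: each year's income
    discounted back to the start age at the given rate."""
    return sum(
        annual_income / (1 + rate) ** (age - start_age)
        for age in range(start_age, retire_age)
    )

# Hypothetical flat $60k/year earned from age 22 through 64, discounted at 3%:
total = pvle(60_000, start_age=22, retire_age=65, rate=0.03)
print(f"${total:,.0f}")  # roughly $1.48M, in the same ballpark as the study's figure
```

Real valuations vary income by age and argue over the discount rate, which is why the parties fight over the coefficients; the shape of the calculation stays the same.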
    • Hamuko 46 minutes ago
      Does the American legal system have infinite appeals?