TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the spirit of wit, and I am just not that witty. This is a long article, here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • Buffalox@lemmy.world · 12 days ago

    Hey guys relax! It’s all part of the learning experience of Tesla FSD.
    Some of you may die, but that’s a sacrifice I’m willing to make.

    Regards
    Elon Musk
    CEO of Tesla

      • JasonDJ@lemmy.zip · 11 days ago

        News on the first mission: Meteoroid crashes into a full SpaceX rocket in flight, killing all aboard.

    • Gammelfisch@lemmy.world · 11 days ago

      +1 for you. However, replace “Regards” with the more appropriate words from the German language, the first starting with an S and the second with an H. I will not type that shit. Fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside of Berlin closes.

      • Buffalox@lemmy.world · 11 days ago

        Yes, I’m not writing that shit, even in a sarcastic post. But I get your drift.
        On the other hand, since you are from Germany: the VW group is absolutely killing it on EVs lately, IMO.
        They totally dominate the top 10 EVs here in Denmark, with 7 of the 10 top-selling models!
        They are competitively priced, and they are the best combination of quality and range in their price ranges.

    • WanderingThoughts@europe.pub · 11 days ago

      That’s why Tesla’s Full Self-Driving is officially still Level 2 cruise control. But of course they promise to jump directly to Level 4 soon™.

  • NotMyOldRedditName@lemmy.world · 11 days ago

    For what it’s worth, it really isn’t clear whether this is FSD or AP, given the constant mention of “self-driving” even for the older collisions that would definitely have been on AP, and which are even listed as AP if you click the links to the crash reports.

    So these may all be AP, or one or two might be FSD; it’s unclear.

    Every Tesla has AP as well, so the likelihood of that being the case is higher.

    • AA5B@lemmy.world · 11 days ago

      In this case, does it matter? Both are supposed to follow a vehicle at a safe distance.

      I’d be more interested in how it changes over time, as new software is pushed. While it’s important to know that it had problems judging distance to a motorcycle, it’s more important to know whether it still does.

      • NotMyOldRedditName@lemmy.world · 11 days ago

        In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

        I think it does matter. While both are supposed to follow at safe distances, the FSD stack does it in a completely different way. Tesla hasn’t really been making major updates to AP for years now; all the focus has been on FSD. I think the only real changes AP has had for quite a while have been around making sure people are paying attention.

        AP looks at the world frame by frame, each camera on its own, while FSD takes the input of all cameras, turns it into a 3D vector space, and drives based on that. Doing it that way on city streets and highways is a fairly recent development; the updates bringing it to highways only went out to all cars with FSD in the past few months. For a long time it was city streets only.
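
        Roughly, the architectural difference being described looks like the sketch below. This is purely an illustration, not Tesla’s actual code; every name in it is made up.

        ```python
        # Illustrative sketch of the difference described above: the older AP-style
        # pipeline reasons about each camera's detections on its own, while an
        # FSD-style pipeline fuses all cameras into one 3D scene ("vector space")
        # before planning. NOT Tesla's code; all names here are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class Detection:
            camera: str          # which camera reported it
            kind: str            # "car", "motorcycle", ...
            bearing_deg: float   # direction relative to the vehicle
            range_m: float       # estimated distance

        def ap_style_plan(per_camera_detections):
            # Frame by frame, camera by camera: react to whatever each camera
            # reports, with no shared 3D model of the scene.
            closest = min((d for frame in per_camera_detections for d in frame),
                          key=lambda d: d.range_m, default=None)
            return "brake" if closest and closest.range_m < 30 else "cruise"

        def fsd_style_plan(per_camera_detections):
            # Fuse every camera's detections into one scene first (here just a
            # crude de-duplication by bearing), then plan against that single model.
            world = {(round(d.bearing_deg), d.kind): d
                     for frame in per_camera_detections for d in frame}
            closest = min(world.values(), key=lambda d: d.range_m, default=None)
            return "brake" if closest and closest.range_m < 30 else "cruise"

        frames = [[Detection("front_main", "motorcycle", 0.0, 25.0)],
                  [Detection("front_narrow", "motorcycle", 0.4, 26.0)]]
        print(ap_style_plan(frames), fsd_style_plan(frames))
        ```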

        I’d be more interested in how it changes over time, as new software is pushed.

        I think that’s why it’s important to make a real distinction between AP and FSD today (and specifically which FSD versions).

        They’re wholly different systems: one gets older every day, and one keeps getting better every few months. Making an article like this that groups them together over a span of years muddies the water on what progress, if any, has been made.

        • KayLeadfoot@fedia.io (OP) · 11 days ago

          Fair enough!

          At least one of the fatalities involved Full Self-Driving (it was cited by name in the police report). The remainder were on Autopilot. So, both systems kill motorcyclists. Tesla requests that this data be redacted from its NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.

          You’re placing a lot of faith in the incremental updates being improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn’t be redactions.

          I didn’t publish the software version data point because I agree with AA5B: it doesn’t matter. I honestly don’t care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission cromagnon self.

          I’m not a “Tesla reporter,” and I’m not trying to cover the incremental changes in their software versions; plenty of Tesla fans are doing that already. It only has my attention at all because it’s killing vulnerable road users, and for that analysis we don’t actually need to know which version of the self-driving system is killing people, just the make of car it is installed on.

          • NotMyOldRedditName@lemmy.world · 11 days ago

            I’d say it’s a pretty important distinction to know whether one or both systems have a problem, and how bad that problem is.

            Also are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.

            And especially back then, there’s also an important distinction in how they worked.

            FSD on highways wasn’t released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.

            Edit: Also, if it really was FSD (that 2024 crash would have had to happen on city streets, not a highway), then that’s 1 motorcycle fatality in 3.6 billion miles. The other 4 happened over 10 billion miles. Is that not an improvement? (Edit again: I should say we can’t tell it’s an improvement yet, as we’d have to pass 5 billion miles, so the jury is still out, I guess, IF that crash was really on FSD.)
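
            A quick back-of-the-envelope version of that comparison, using the mileage figures quoted above (they are assumptions from this thread, not verified data):

            ```python
            # Rough rate comparison using the figures quoted in this thread.
            # These mileage numbers are assumptions, not verified data.
            ap_fatalities, ap_miles_bn = 4, 10.0    # claimed: ~4 fatalities over ~10 billion AP miles
            fsd_fatalities, fsd_miles_bn = 1, 3.6   # claimed: 1 fatality over ~3.6 billion FSD miles

            print(f"AP:  {ap_fatalities / ap_miles_bn:.2f} fatalities per billion miles")   # 0.40
            print(f"FSD: {fsd_fatalities / fsd_miles_bn:.2f} fatalities per billion miles") # ~0.28

            # Caveat: with only a single FSD event, that second estimate is extremely
            # noisy, which is why the comparison above is treated as "jury still out".
            ```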

            Edit: I will cede though that as a motorcyclist, you can’t know what the Tesla is using, so you’d have to assume the worst.

            Edit: Just correcting myself: I was wrong about FSD in 2024. The changeover to neural nets happened in November, but FSD was still FSD on highways when this accident happened. The era when FSD handed over to AP as you transitioned to highways ended even earlier than that.

            • KayLeadfoot@fedia.io (OP) · 11 days ago

              Police report for the 2024 case attached; it is also linked in the original article: https://www.opb.org/article/2025/01/15/tesla-may-face-less-accountability-for-crashes-under-trump/

              It was Full Self Driving, according to the police. They know because they downloaded the data off the vehicle’s computer. The motorcyclist was killed on a freeway merge ramp.

              All the rest is beyond my brief. Thought you might like the data to chew on, though.

              • NotMyOldRedditName@lemmy.world · 11 days ago

                The motorcyclist was killed on a freeway merge ramp.

                I’d say that means there’s a very good chance that yes, while FSD was enabled, the crash happened under the older AP mode of driving, as it wasn’t until November 2024 that highways moved over to the new FSD neural-net driving code. I was wrong here: it actually was FSD then, it just wasn’t end-to-end neural nets like it is now.

                Also yikes… the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!

    • psivchaz@reddthat.com · 11 days ago

      That’s not good though, right? “We have the technology to save lives, it works on all of our cars, and we have the ability to push it to every car in the fleet. But these people haven’t paid extra for it, so…”

      • NotMyOldRedditName@lemmy.world · 11 days ago

        Well, only 1 or 2 of those were in a time frame where I’d consider FSD superior to AP; it’s only recently that that’s likely been the case.

        But to your point, at some point I expect Tesla to use the FSD software for AP, for exactly the reasons you mentioned. My guess is they’d just do something like disable left/right turns, so you wouldn’t be able to use it outside of straight stretches, like AP today.

    • kameecoding@lemmy.world · 11 days ago

      Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

      (I fucking love living in the EU)

    • Bytemeister@lemmy.world · 11 days ago

      Because the march of technological advancement is inevitable?

      In light of recent (and, let’s face it, long-ago) cases, Tesla’s “Full Self Driving” needs to be downgraded to Level 2 at best.

      Level 2: Partial Automation

      The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.

      Pretty much the same level as other brands’ self-driving features.
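
      For reference, the SAE J3016 levels being invoked here break down roughly like this (a plain-language summary, not the formal definitions):

      ```python
      # Rough plain-language summary of the SAE J3016 driving-automation levels.
      SAE_LEVELS = {
          0: "No automation: warnings or momentary assistance only",
          1: "Driver assistance: steering OR speed control, driver does everything else",
          2: "Partial automation: steering AND speed control, driver must supervise at all times",
          3: "Conditional automation: system drives in limited conditions, driver must take over on request",
          4: "High automation: no human fallback needed within its operating domain (e.g. a geofenced robotaxi)",
          5: "Full automation: can drive anywhere a human could",
      }
      print(SAE_LEVELS[2])  # where Tesla's FSD (Supervised) officially sits today
      ```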

      • AngryCommieKender@lemmy.world · 11 days ago

        The other brands, such as Audi and VW, work much better than Tesla’s system. Their LIDAR systems aren’t blinded by fog and rain the way the Tesla is. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests; the Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn’t rely on cameras alone is clearly superior.

        Edit: it was Mark Rober.

        https://youtu.be/IQJL3htsDyQ

        • Bytemeister@lemmy.world · 11 days ago

          It’s hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some that include a time-of-flight sensor, which is like LIDAR but static and lacks the resolution/fidelity. My Mach-E, which has Level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.

          The LIDAR system in Mark’s video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.

          Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.

          Please do not mistake this comment as “AI/computer vision” evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else’s to that system.

          • KayLeadfoot@fedia.io (OP) · 11 days ago

            Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.

            Tesla alleges they’ll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We’ll see.

            • Bytemeister@lemmy.world · 11 days ago

              Yeah, keep in mind that Elon couldn’t get level 3 working in a closed, pre-mapped circuit. The robotaxis were just remotely operated.

          • AngryCommieKender@lemmy.world · 11 days ago

            The way I understand it is that Audi, Volvo, and VW have had the hardware in place for a few years. They are collecting real-world data about how we drive before they allow the systems to be used at all. There are also legal issues with liability.

    • bluGill@fedia.io · 11 days ago

      Humans are terrible drivers. The open question is whether self-driving cars are overall safer than human-driven cars. So far the only people talking either don’t have data, or have reason to cherry-pick only the parts of the data that make self-driving look good. This is the one exception where someone seemingly independent has done analysis; the question is whether they are unbiased, or cherry-picking data to make self-driving look bad (I’m not familiar with the source, so I can’t answer that).

      Either way, more study is needed.

      • KayLeadfoot@fedia.io (OP) · 11 days ago

        I am absolutely biased. It’s me, I’m the source :)

        I’m a motorcyclist, and I don’t want to die. Also just generally, motorcyclists deserve to get where they are going safely.

        I agree with you. Self-driving cars will overall greatly improve highway safety.

        I disagree with you when you suggest that pointing out flaws in the technology is evidence of bias, or “cherry picking to make self driving look bad.” I think we can improve on the technology by pointing out its systemic defects. If it hits motorcyclists, take it off the road, fix it, and then save lives by putting it back on the road.

        That’s the intention of the coverage, at least: I am hoping to apply pressure to improve rather than remove. Read my Waymo coverage; I’m actually a big automation enthusiast, because fewer crashes is a good thing.

        • bluGill@fedia.io · 11 days ago

          I wasn’t trying to suggest that you are biased, only that I have no clue and so it is possible you are somehow unfairly doing something.

      • Rhaedas@fedia.io · 11 days ago

        Humans are terrible. The human eyes and brain are good at detecting certain things, though, that allow a reaction where computer vision, especially when it relies on only one method of detection, often fails. There are times when an automated system will prevent a problem before a human could even see it. So far neither is the clear winner; human driving just has a legacy that automation has to beat by a wide margin, not merely match as “good enough.”

        On the topic of human drivers, I think most on the road drive reactively rather than on prediction and anticipation. Given the speeds involved and the detection methods available, a well-designed automated system should excel at this. It costs more and is more complex to design such a thing, so we’re getting the bare minimum the tech can give us right now, which again is not a replacement for all cases.
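
        As a concrete example of “predictive rather than reactive,” a minimal time-to-collision check looks something like this (a toy sketch; not any manufacturer’s actual control logic, and the threshold is made up):

        ```python
        # Toy sketch of anticipatory following-distance logic based on time-to-collision.
        def time_to_collision_s(gap_m: float, own_speed_mps: float, lead_speed_mps: float) -> float:
            closing = own_speed_mps - lead_speed_mps
            return float("inf") if closing <= 0 else gap_m / closing

        def decide(gap_m: float, own_speed_mps: float, lead_speed_mps: float, brake_below_s: float = 3.0) -> str:
            return "brake" if time_to_collision_s(gap_m, own_speed_mps, lead_speed_mps) < brake_below_s else "hold speed"

        # Car at 30 m/s closing on a motorcycle doing 20 m/s with a 25 m gap:
        # TTC = 25 / (30 - 20) = 2.5 s, so brake well before the gap closes.
        print(decide(25.0, 30.0, 20.0))
        ```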

    • Not_mikey@lemmy.dbzer0.com · 11 days ago

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers. Waymo seems to be the best one out so far, though; I don’t know about the other services.

        • dogslayeggs@lemmy.world · 11 days ago

          They have remote drivers that CAN take control in rare corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • NotMyOldRedditName@lemmy.world · 11 days ago

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.
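
            In other words, the flow is roughly “operator proposes, onboard planner disposes.” A hypothetical sketch of that split (none of these names are Waymo’s actual API; it only illustrates the distinction):

            ```python
            # Hypothetical sketch of "remote assistance as suggestion, not control".
            # Not Waymo's API; it only illustrates suggestion vs. direct operation.
            from dataclasses import dataclass
            from typing import Optional

            @dataclass
            class Suggestion:
                action: str   # e.g. "reroute_around_obstruction"
                detail: str

            ALLOWED_ACTIONS = {"reroute_around_obstruction", "proceed_when_clear"}

            def onboard_planner(default_action: str, suggestion: Optional[Suggestion]) -> str:
                # The vehicle validates any remote suggestion against its own perception
                # and safety checks; a suggestion is never a direct steering/brake command.
                if suggestion and suggestion.action in ALLOWED_ACTIONS:
                    return suggestion.action
                return default_action

            print(onboard_planner("wait", Suggestion("reroute_around_obstruction", "lane blocked")))
            ```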

            • KayLeadfoot@fedia.io (OP) · 11 days ago

              Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)

  • lnxtx (xe/xem/xyr)@feddit.nl · 11 days ago

    Stop dehumanizing the drivers who killed people.
    The feature, wrongly called Full Self-Driving, must be supervised at all times.

    • Ulrich@feddit.org · 11 days ago

      I think it’s important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.

      If these systems were marketed as “driver assistance systems” instead of “full self driving”, certainly more people would pay attention. The fact that they’ve been allowed to get away with this blatant false advertising is astonishing.

      They’re also obviously not adequately monitoring for driver attentiveness.

    • SouthEndSunset@lemm.ee · 11 days ago

      If you’re going to say your car has “full self driving,” it should have that, not “full self driving (but needs monitoring)” or “full self driving (but it disconnects two seconds before impact).”

  • 0x0@programming.dev · 11 days ago

    This is news? Fortnine talked about it two years ago.
    TL;DR: Tesla removed LIDAR to save a buck, and at night the cameras see two red dots that the ’puter thinks are a far-away car when it’s actually a close motorcycle.
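
    The underlying ambiguity is easy to show with a toy pinhole-camera calculation (made-up numbers): a wide pair of car tail lights far away and a narrow pair of motorcycle lights up close can project to the same separation in the image.

    ```python
    # Toy pinhole-camera illustration of the night-time ambiguity described above.
    FOCAL_LENGTH_PX = 1000.0  # made-up focal length in pixels

    def pixel_separation(real_separation_m: float, distance_m: float) -> float:
        return FOCAL_LENGTH_PX * real_separation_m / distance_m

    car_far   = pixel_separation(1.50, 60.0)  # car tail lights ~1.5 m apart, 60 m away
    moto_near = pixel_separation(0.25, 10.0)  # motorcycle lights ~0.25 m apart, 10 m away
    print(car_far, moto_near)  # both 25.0 px: appearance alone can't separate the two cases
    # A ranging sensor (radar or lidar) returns an actual distance, which removes the ambiguity.
    ```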

    • LesserAbe@lemmy.world · 11 days ago

      It’s helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.

      Going beyond that, wouldn’t the new information here be the statistics?

      • JordanZ@lemmy.world · 11 days ago

        My state allowed motorcycle filtering in 2019 (not the same as California’s lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed by getting rear-ended while sitting at stop lights. Filtering lets them move to the front while the light is red and traffic is stationary. Many people are super aggravated about it, even though most of the world has been doing it basically forever.

      • bluGill@fedia.io · 11 days ago

        like regulators not allowing dangerous products,

        I include human drivers in the list of dangerous products I don’t want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don’t want regulators to pick favorites. I want them to find the truth.

      • AA5B@lemmy.world · 11 days ago

      Why not? It’s got multiple cameras, so it could judge distances the same way humans do.

      However, there have been both hardware and software updates since most of those crashes, so the critical question is how much of a problem it still is. The article had no info or speculation on that.
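
      For what it’s worth, judging distance from multiple cameras comes down to disparity, and the usual simplification is depth = focal_length x baseline / disparity (a sketch with illustrative numbers; real multi-camera depth estimation is far more involved):

      ```python
      # Simplified stereo-depth sketch: depth = focal_length_px * baseline_m / disparity_px.
      def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
          return focal_px * baseline_m / disparity_px

      # With a narrow baseline (cameras mounted close together behind one windshield),
      # a one-pixel disparity error moves the distance estimate a long way at range.
      print(stereo_depth_m(focal_px=1000.0, baseline_m=0.2, disparity_px=4.0))  # 50.0 m
      print(stereo_depth_m(focal_px=1000.0, baseline_m=0.2, disparity_px=3.0))  # ~66.7 m
      ```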

    • TexasDrunk@lemmy.world · 11 days ago

      I’m on mine far more often than I’m in a car. I think Tesla found out that I point and laugh at any cyber trucks I see at red lights while I’m out and is trying to kill me.

    • Psythik@lemm.ee · 11 days ago

      As someone who likes the open sky feeling, this is why I drive a convertible instead.

        • Excrubulent@slrpnk.net · 11 days ago

          I remember finding a motorcycle community on reddit that called themselves “squids” or “squiddies” or something like that.

          Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.

          They loved to talk about how dumb & short-lived they were. I couldn’t ever find that group again, so maybe I misremembered the “squid” name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.

          • real_squids@sopuli.xyz · 11 days ago

            Calamari Racing Team. It’s mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. It’s their whole thing, not just a specific way to ride; they also have a legendary commenter who pays money for pics in full leather.

            • Excrubulent@slrpnk.net · 10 days ago

              That’s the one! Thanks, that was un-googleable for me.

              I guess the road-tyres-on-dirt-bikes thing was maybe a trend when I saw the sub.

        • KayLeadfoot@fedia.io (OP) · 11 days ago

          Bahaha, that one is new to me.

          Back when I worked on an ambulance, we called the no helmet guys organ donors.

          This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.

          • mutual_ayed@sh.itjust.works · 11 days ago

            I also rammed 10cc spikes at the back of the bus. The world needs organ donors, and motorcycles provide a great service for that. Hope your EMT career was short-lived but rewarding.

  • spacesatan@leminal.space · 11 days ago

    Unless it’s a higher rate than human drivers per mile or hours driven, I do not care. The article doesn’t have those stats, so it’s clickbait as far as I’m concerned.

    • chetradley@lemm.ee · 11 days ago

      The fact that the other self driving brands logged zero motorcyclist fatalities means the technology exists to prevent more deaths. Tesla has chosen to allow more people to die in order to reduce cost. The families of those five dead motorcyclists certainly care.

    • AA5B@lemmy.world · 11 days ago

      Same goes for the other vehicles. They didn’t even try to cover miles driven, and it’s quite likely Tesla has far more self-driving miles than anyone else.

      I’d even go so far as to speculate that the zero accidents for other self-driving vehicles could just be zero information: we don’t have enough data to call it zero.

      • KayLeadfoot@fedia.io (OP) · 11 days ago

        No, the zero accidents for other self-driving vehicles is actually zero :) You may have heard of this little boutique automotive manufacturer, Ford Motor Company. They’re one of the primary competitors, and they are far above the mileage where you would expect a fatal accident if they were as safe as a human.

        Ford has reported self-driving crashes (many of them!). Just no fatal crashes involving motorcycles, because I guess they don’t fucking suck at making self-driving software.

        I linked the data, it’s all public governmental data, and only the Tesla crashes are heavily redacted. You could… IDK… read it, and then share your opinion about it?

        • AA5B@lemmy.world · 10 days ago

          And how did it compare on self-driving time or miles? Because on the surface, if Tesla is responsible for 5 such accidents and Ford zero, but Tesla has significantly more than five times the self-driving time or miles, then we just don’t have the data yet… and I see an announcement that Ford expects full self-driving in 2026, so it can’t have been used much yet.

    • KayLeadfoot@fedia.io (OP) · 11 days ago

      Thanks, 'Satan.

      Do you know the number of miles driven by Tesla’s self-driving tech? Because I don’t. Tesla won’t say; they’re a remarkably non-transparent company where their tech is concerned. Near as I can tell, nobody knows (other than folks locked up tight with NDAs). If the ratio of accidents per mile driven looked good, you know as a flat fact that Elon would be Tweeting all about it.

      Sorry you didn’t find the death of 5 Americans newsworthy. I’ll try harder for the next one.

  • Redex@lemmy.world · 10 days ago

    Cuz other self driving cars use LIDAR so it’s basically impossible for them to not realise that a bike is there.