A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said.

The officer was on traffic control duty, blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that had left a motorcyclist dead around 9 p.m. Wednesday, when his vehicle was struck.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • Flying Squid@lemmy.world · 2 months ago

    It really doesn’t help that the media isn’t putting “Self-Driving” Mode in quotes since it isn’t fucking self-driving.

      • Flying Squid@lemmy.world · 2 months ago

        Tesla calls it “Full Self Driving” and it’s a lie. So capitalize it and put it in quotes, rather than call it self-drive mode like that’s an actual thing.

        • icy_mal@lemmy.world · 2 months ago

          The actual name, “Full Self Driving (Supervised),” is so shady. “Supervised” is just a less crappy-sounding way to indicate that you will have to take over and drive sometimes. So sometimes the car drives itself and sometimes you drive: partial self driving, partial human driving. I’m surprised they didn’t call it “Partial Full Self Driving”. That would certainly amp up the trolling factor and really separate the true believers, who would come out defending it with Olympic-level mental gymnastics.

  • BigMacHole@lemm.ee · 2 months ago

    That must have been SO scary for the cop! He wouldn’t know whether to shoot the car or the passenger!

  • NameTaken@lemmy.world · 2 months ago

    Ugh, I know people feel strongly about FSD and Tesla. As someone who uses it (and still pays attention, hands on wheel, when it’s activated): when FSD is active, as soon as it sees anything resembling emergency lights it will beep and clearly disengage. I’m not sure, but it’s possible this person is just using Tesla as a scapegoat for their own poor driving. In my experience, though, it forces the driver to take control when emergency lights are recognized, specifically to avoid instances like this.

    • Joelk111@lemmy.world · 2 months ago

      Doesn’t Tesla usually look at the logs for a situation like this, so we’ll know shortly?

    • vxx@lemmy.world · 2 months ago

      Thanks for the tip, going to flash my blue flashlight at teslas from now on.

      • NameTaken@lemmy.world · 2 months ago

        Yeah sure if that’s what makes you happy… 👍. Nothing like blinding random people in cars in your spare time.

        • vxx@lemmy.world · 2 months ago

          No, not the driver, the faulty sensors and programming that should’ve never been approved for the road.

          • NameTaken@lemmy.world · 2 months ago

            Wait, so how is it faulty sensors and bad programming if it disengages when emergency vehicles are present? You’d prefer it to stay on in emergency situations?

    • FireRetardant@lemmy.world · 2 months ago

      The victims involved in crashes aren’t always rich. People in other cars, pedestrians, and cyclists can be injured by these mistakes.

  • IsThisAnAI@lemmy.world · 2 months ago

    Jesus you Elon haters can’t help yourself.

    This 👏 isn’t 👏 news. It’s an obsession with a billionaire.

      • IsThisAnAI@lemmy.world · 2 months ago

        Yes. It’s a single crash, no details, and this happens every day with non-assisted driving, LKA, ACC, etc. But since it’s a Tesla, anti-Elon zealots have to post every single rage-bait article they can get their hands on.

        It’s an obsession and has nothing to do with technology.

        • Zorg@lemmy.blahaj.zone · 2 months ago

          In 59 crashes examined by NHTSA, the agency found that Tesla drivers had enough time, “five or more seconds,” prior to crashing into another object in which to react. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that drivers failed to brake or steer to avoid the hazard in a majority of the crashes analyzed.

          NHTSA also compared Tesla’s Level 2 (L2) automation features to products available in other companies’ vehicles. Unlike other systems, Autopilot would disengage rather than allow drivers to adjust their steering. This “discourages” drivers from staying involved in the task of driving, NHTSA said.
          “A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities,” the agency said.
          Even the brand name “Autopilot” is misleading, NHTSA said, conjuring up the idea that drivers are not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking they are more capable than they are.
          https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death

          It is not a single crash. There are assisted driving systems out there using pupil tracking to make sure drivers are still paying attention.
          Tesla’s solution is something along the lines of “you need to be resting at least one hand on the steering wheel.” And don’t get me started on how they are diluting the concept of “full self driving”…

          But yeah, you’re right, the only reason I’m sceptical of Tesla’s semi-self-driving tech is because I think Elon is an egomaniac little bitch who is incapable of ever admitting he was wrong in even the smallest way.

          • IsThisAnAI@lemmy.world · 2 months ago

            It doesn’t increase the total volume of crashes per mile driven. Humans are shitty drivers; the bar is low. We’ve heard ad nauseam about the name of FSD. It’s a truth-in-advertising issue and an idiotic-driver issue, not a safety one.