Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla barreling toward the train crossing with no apparent deceleration. He insists FSD was engaged the entire time.

  • ElPenguin@lemmynsfw.com · +10/−13 · 4 months ago

    As someone with more than a basic understanding of technology and how self-driving works, I would think the end user would take special care driving in fog, since the car relies on cameras to identify the roads and objects. This is clearly user error.

    • tb_@lemmy.world · +21/−1 · 4 months ago

      This is clearly user error.

      When it’s been advertised to the user as “full self driving”, is it?

      Furthermore, can’t the car recognize that visibility is low, alert the user, and/or refuse to go into self-driving?

      • Maddier1993@programming.dev · +4/−1 · 4 months ago

        When it’s been advertised to the user as “full self driving”, is it?

        I wouldn’t believe an advertisement.

        • tb_@lemmy.world · +4 · 4 months ago

          I wouldn’t trust Musk with my life either.

          But, presumably, we have moved beyond the age of advertising snake oil and miracle cures; advertisements have to be somewhat factual.

          If a user does as is advertised and something goes wrong I do believe it’s the advertiser who is liable.

          • 0x0@programming.dev · +1 · 4 months ago

            But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

            Keyword presumably.

            • tb_@lemmy.world · +2 · 4 months ago

              Right. But can you blame the user for trusting the advertisement?

              • 0x0@programming.dev · +1 · 4 months ago

                At the dealership? Kinda, yeah. It’s a dealership, and news like this pops up every week.

                On the road? I wouldn’t trust my life to any self-driving in this day and age.

            • jaybone@lemmy.world · +2 · 4 months ago

              If the product doesn’t do what it says it does, that’s the product/manufacturer’s fault, not the user’s fault. Wtf lol, how is this even a debate?

      • darganon@lemmy.world · +2/−1 · 4 months ago

        There are many quite loud alerts when FSD is active in subpar circumstances about how it is degraded, and the car will slow down. That video was pretty foggy; I’d say the dude wasn’t paying attention.

        I came up on a train Sunday evening in the dark, which I hadn’t experienced in FSD before, so I decided to just hit the brakes. It saw the crossing arms as blinking stoplights; I’m not sure it would have stopped on its own.

        Either way that dude was definitely not paying attention.

    • Noxy@yiffit.net · +1 · 4 months ago

      Leaving room for user error in this sort of situation is unacceptable at Tesla’s scale and with their engineering talent, hamstrung as it is by their deranged leadership.
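
One commenter asks why the car can’t recognize low visibility and refuse to engage self-driving. As a minimal illustration of what such a gate could look like, here is a hypothetical sketch; every name, number, and heuristic below is invented for illustration and has nothing to do with Tesla’s actual software:

```python
# Hypothetical sketch of visibility-gated engagement for a camera-based
# driver-assist system. All names and thresholds are illustrative only.

MIN_VISIBILITY_M = 150.0  # assumed minimum safe sight distance for camera-only perception


def estimate_visibility_m(frame_contrast: float) -> float:
    """Crude proxy: map normalized image contrast (0.0-1.0) to an estimated
    sight distance in meters. Fog and darkness lower contrast, hence visibility."""
    clamped = max(0.0, min(1.0, frame_contrast))
    return clamped * 300.0


def may_engage_self_driving(frame_contrast: float) -> bool:
    """Refuse engagement (or trigger a disengagement alert) when the
    estimated visibility falls below the safety threshold."""
    return estimate_visibility_m(frame_contrast) >= MIN_VISIBILITY_M


# Foggy, low-contrast frame: the gate refuses to engage.
assert may_engage_self_driving(0.2) is False
# Clear frame: engagement is allowed.
assert may_engage_self_driving(0.9) is True
```

A real system would fuse many signals (rain sensors, headlight state, per-camera confidence) rather than a single contrast number, but the shape of the check, estimate the operating conditions and hard-refuse outside the envelope, is the point the commenter is making.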