• @Aurenkin@sh.itjust.works
      3 points · 2 years ago

      Source: trust me bro.

      On a serious note though, why would they do that? I'm pretty sure they're legally covered by all the warnings that you are responsible for your vehicle.

        • @Aurenkin@sh.itjust.works
          5 points · 2 years ago

          Source: trust me bro again.

          You are meant to keep your hands on the wheel at all times and pay attention when using the system. There are multiple layers of warnings if you don't, and if you keep ignoring them the system will eventually refuse to let you activate it (roughly the escalation sketched below). If you sit there and watch as Autopilot drives you off a cliff, it's your fault.
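
          To be clear, Tesla's actual monitoring logic is proprietary; the sketch below only illustrates the "warn, escalate, lock out" pattern described above, with made-up thresholds and alert names:

              MAX_IGNORED_WARNINGS = 3  # hypothetical threshold, not Tesla's real value

              class DriverMonitor:
                  """Illustrative 'warn, escalate, lock out' sketch; not Tesla's code."""

                  def __init__(self):
                      self.ignored_warnings = 0
                      self.locked_out = False

                  def on_hands_off_wheel(self):
                      # Each ignored prompt escalates the response.
                      self.ignored_warnings += 1
                      if self.ignored_warnings >= MAX_IGNORED_WARNINGS:
                          self.locked_out = True  # refuses to re-engage this drive
                          return "disengage"
                      return "visual alert" if self.ignored_warnings == 1 else "audible alarm"

                  def on_hands_on_wheel(self):
                      self.ignored_warnings = 0  # driver responded; reset escalation

                  def can_engage(self):
                      return not self.locked_out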

          Yes, Elon has been dodgy as fuck with his timelines, taking people's money and making grand claims about the future capabilities of the system, and he's an all-around asshole, but can we try to ground our criticisms in facts?

          There are plenty of things we can and should be critical of when it comes to Tesla, and making things up just makes it easier for genuine criticisms to be dismissed.

          Apologies to you if you actually are making well-backed claims; it's just frustrating to see so much noise around Tesla, with people often just throwing random BS out there.

            • @Aurenkin@sh.itjust.works
              2 points · 2 years ago

              On that we can absolutely agree, and I think scrutiny is definitely warranted with any new technology, especially one with such a huge profit motive. My issue in this case was with the original claim that the system intentionally disengages at the last minute to avoid liability for any crash. Big call.

              Anyway, I was probably overly sarcastic and flippant, which doesn't help my point, so sorry for venting my frustrations like that. Hopefully these technologies get the scrutiny they deserve without hysteria every time there's a crash that 'possibly' involved Autopilot.

    • @Moonrise2473@feddit.it
      20 points · 2 years ago

      They know, they just can’t tell the media while the investigation is ongoing.

      But if a vehicle crashed into the side of a turning tractor-trailer (they slow down and signal hundreds of meters before turning) with no sign of braking, it means the driver was asleep while the cruise control was engaged. (I won't use their misleading marketing term for what is actually just an advanced cruise control.)
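
      For scale, "hundreds of meters" at highway speed is on the order of ten seconds of warning. A quick back-of-the-envelope check (the 100 km/h speed is my assumption, nothing from the article):

          # How much warning does "hundreds of meters" buy at highway speed?
          speed_ms = 100 / 3.6                 # assumed 100 km/h ≈ 27.8 m/s
          for distance_m in (200, 400):
              seconds = distance_m / speed_ms
              print(f"{distance_m} m of warning ≈ {seconds:.0f} s to react")
          # 200 m ≈ 7 s, 400 m ≈ 14 s: plenty of time for an attentive driver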

    • Sentrovasi
      25 points · 2 years ago

      I guess the difference is we expect humans to fuck up, but autonomous driving is meant to eventually be the thing that replaces that and stops us fucking up.

      • BoscoBear
        13 points · 2 years ago

        It just needs to be better than humans, not perfect. That would save lives.

        • Sentrovasi
          3 points · 2 years ago

          The scary thing to me is that humans are predictable, or at least predictable in their unpredictability.

          With AI, it’s a black box I don’t understand. When it suddenly crashes, I literally will have no idea why.

          • BoscoBear
            2 points · 2 years ago

            You deal with a different sort of people than I do.

  • andyburke
    5 points · 2 years ago

    This same crash scenario was behind one of the first Autopilot crashes.

    When your autopilot will drive right into the side of a truck, people are right to question its safety.

  • @maporita@unilem.org
    3 points · 2 years ago

    Quoting individual instances is not helpful. The more important question is whether their Autopilot makes driving safer on average. According to Tesla themselves…

    "the accident rate with Autopilot engaged is just a fraction of the industry average in the US.

    From their report:

    " Tesla vehicles with Autopilot engaged (mostly highway miles) had just 0.18 accidents per million miles driven, compared to the US vehicle average of 1.53 accidents per million miles.

    Teslas on non-highways with Full Self Driving (FSD) engaged had just 0.31 accidents per million miles representing an 80% reduction in accidents compared with the average vehicle. Tesla vehicles with no active safety triggered – neither Autopilot of FSD – had an accident rate of 0.68, less than half the total US vehicle fleet".
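
    The quoted "80% reduction" does check out arithmetically against those rates. A quick sanity check using only the numbers in the quote (the caveat at the end is my own observation):

        # Sanity-check Tesla's quoted rates (accidents per million miles).
        rates = {
            "Autopilot engaged (mostly highway)": 0.18,
            "FSD engaged (non-highway)": 0.31,
            "Tesla, no Autopilot/FSD": 0.68,
            "US vehicle average": 1.53,
        }
        baseline = rates["US vehicle average"]
        for name, rate in rates.items():
            cut = (1 - rate / baseline) * 100
            print(f"{name}: {rate:.2f}/M mi ({cut:.0f}% below US average)")
        # FSD: 1 - 0.31/1.53 ≈ 0.80, so the quoted "80% reduction" is consistent.
        # Caveat: Autopilot miles skew toward highways, which already have fewer
        # accidents per mile, so these are not like-for-like comparisons.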

    I would really like to see independent verification of this.