• @RobotToaster@mander.xyz
    link
    fedilink
    English
    206
    7 days ago

    That’s what happens when you have a reasonable sensor suite with LIDAR, instead of trying to rely entirely on cameras like Tesla does.

  • @Viri4thus@feddit.org
    link
    fedilink
    English
    189
    7 days ago

    Why are we still doing this? Just fucking invest in mass transit like metro, buses and metrobuses. Jesus

    Also, note that this is based on Waymo’s own numbers; that’s like believing a 5070 gives you 4090 performance…

    • @RobotToaster@mander.xyz
      link
      fedilink
      English
      75
      6 days ago

      That doesn’t solve the last mile problem, or transport for all the people who live outside of a few dense cities.

      • @Whooping_Seal@sh.itjust.works
        link
        fedilink
        English
        3
        edit-2
        6 days ago

        Frankly, the best solution I have seen is always a combination of things. At least in the city I live in, people can take bikes on buses and trains, many people walk, and for trips that require trunk space (e.g. furniture, DIY supplies, etc.) there is a car-sharing service that is cheaper than owning a car or using ride-share/taxi.

        I don’t think Waymo is a better option than a combination of what’s above. I think it can perhaps complement it, but it should not be the sole last-kilometre solution.

        I would like to see Waymo-like tech provide better public transit for the disabled. As of now, people in my city with disabilities can book special routes serviced by specialized buses/taxis, and existing lines are all wheelchair accessible as well.

        Self-driving cars give those people the opportunity to have even more freedom in booking, since as of now they can’t do last-minute booking for the custom routes. It wouldn’t really create a traffic problem, and it would massively increase quality of life for people who are sadly disadvantaged in society.

    • @pc486@sh.itjust.works
      link
      fedilink
      English
      21
      6 days ago

      Why are we still doing this?

      Because there’s a lot of money in it. 10.3% of the US workforce works in transportation and warehousing. Trucking alone is the #4 spot in that sector (1.2 million jobs in heavy trucks and trailers). Couriers and delivery also ranks highly.

      Self-driving vehicles are targeting whole markets, and the value of the industry is hard to overstate. And yes, even transit is being targeted (and being implemented; see South Korea’s A21 line). There’s a lot of crossover between trucking and buses, not to mention that 42% of transit drivers are 55 or older. Hiring metro drivers is insanely hard right now.

      • @Viri4thus@feddit.org
        link
        fedilink
        English
        10
        6 days ago

        Taking Waymo’s numbers at face value, they are almost 20x more dangerous than a professional truck driver in the EU. This is a personal convenience thing for wealthy people, that’s it. Fucking over Jarvis and Mahmood so we can have fleets of automated Ubers…

        • @pc486@sh.itjust.works
          link
          fedilink
          English
          1
          6 days ago

          Uber had a net income of 9.86 billion dollars and spent 7.14 billion in operations in 2024. That’s a single transportation company. Do you really think Uber or anyone else is going to ignore researching the technology that could significantly reduce their billions in operations costs?

          I’m also not so sure that Europe is 20x safer than the US. A quick search pulled up the International Transport Forum’s Road Safety Annual Report 2023, and its data disagrees. The US, even with its really poor showing in the general numbers, is safer than Poland and Czechia (road fatalities per billion vehicle‑kilometres, 2021). I could see an argument for a 2x gap in Europe’s favor, but 20x? Citation needed.
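          For the general-safety comparison the relevant unit is fatalities per distance driven. A quick sketch of the arithmetic (the rates below are rough, illustrative values in the ballpark of ITF-style data, not exact report figures):

```python
# Illustrative road-fatality rates per billion vehicle-kilometres.
# These numbers are assumptions for the arithmetic, not exact ITF data.
rates = {
    "United States": 8.3,
    "Germany": 4.3,
    "Poland": 10.0,
    "Czechia": 9.5,
}

us = rates["United States"]
# Ratio of the US rate to each other country's rate.
ratios = {c: us / r for c, r in rates.items() if c != "United States"}
for country, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"US fatality rate is {ratio:.2f}x {country}'s")
# Even against the safest entry in this sketch, the gap is around 2x,
# nowhere near 20x.
```

          Whatever the exact numbers, the shape of the argument is the same: a 20x claim needs rates per distance driven, on comparable roads.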

          • @dogslayeggs@lemmy.world
            link
            fedilink
            English
            2
            6 days ago

            They’re not saying general road safety is 20x better. They’re comparing an automated car ONLY on surface streets with lights, intersections, pedestrians, dogs, left turns, etc… to a professional truck driver mostly on highway miles.

            • @pc486@sh.itjust.works
              link
              fedilink
              English
              1
              6 days ago

              That’s fair. Comparing regular drivers doing typical city trips to commercial big rigs is a bit apples-and-oranges. I wonder how CDL data would compare when the self-driving semi-trucks start putting on miles. Aurora is about to launch in that exact space.

              • @dogslayeggs@lemmy.world
                link
                fedilink
                English
                1
                6 days ago

                I’m honestly more scared of that. Professional CDL drivers are WAY better at driving than other people. But their trucks are way more dangerous and harder to handle. So putting driverless tech in that is going to be harder and more dangerous.

    • Waryle
      link
      fedilink
      English
      15
      6 days ago

      So we can have autonomous metros, buses, and taxis that let people go anywhere when they need to, so they don’t rely on owning a car?

        • Lv_InSaNe_vL
          link
          fedilink
          English
          3
          6 days ago

          Where? I haven’t heard of any rail lines that don’t have a human operator onboard or somewhere in the loop?

          • ℍ𝕂-𝟞𝟝
            link
            fedilink
            English
            5
            edit-2
            6 days ago

            Budapest’s M4 line is fully automated. Stations have some personnel, but otherwise you can get on a train and look straight out the front window; there is no cab.

            Trains drive themselves, but I imagine there must be some switchboard type of thing somewhere.

        • Waryle
          link
          fedilink
          English
          1
          6 days ago

          Now let’s do intercity trains and tramways then

    • @gandalf_der_12te@discuss.tchncs.de
      link
      fedilink
      English
      5
      5 days ago

      People in America don’t want to ride public transport because they’re incredibly isolationist and fear other human beings, so they prefer to drive within “their own 4 walls”, in their own chassis. It’s really about psychology much more than practical feasibility.

  • Curious Canid
    link
    fedilink
    English
    104
    6 days ago

    This would be more impressive if Waymos were fully self-driving. They aren’t. They depend on remote “navigators” to make many of their most critical decisions. Those “navigators” may or may not be directly controlling the car, but things do not work without them.

    When we have automated cars that do not actually rely on human beings, we will have something to talk about.

    It’s also worth noting that the human “navigators” are almost always poorly paid workers in third-world countries. The system will only scale if there are enough desperate poor people. Otherwise it quickly becomes too expensive.

    • Flic
      link
      fedilink
      27
      6 days ago

      @Curious_Canid @vegeta this is the case for the Amazon “just walk out” shops as well. Like Waymo they frame it as the humans “just doing the hard part” but who knows what “annotating” means in this context? And notably it’s clearly more expensive to run than they thought as they’ve decided to do Dash Carts instead which looks like it’s basically a portable self-service checkout. The customer does the checking. https://www.theverge.com/2024/4/17/24133029/amazon-just-walk-out-cashierless-ai-india

      • @SippyCup@feddit.nl
        link
        fedilink
        English
        11
        6 days ago

        Back when I was a fabricator I made some of the critical components used in Amazon stores. Amazon was incredibly particular about every little detail, even on parts that didn’t call for tight tolerancing in any conceivable way. On several occasions they sent us one bad set of prints after another, which we could only discover after completing a run of parts. We’re talking 20-30 thousand units that ended up being scrapped because of their shitty prints. Millions of dollars set on fire, basically.

        They became such a huge pain in the ass to work with we eliminated every single SKU they ordered from us.

        • Flic
          link
          fedilink
          7
          6 days ago

          @SippyCup I have never heard a single good thing from anyone who works with or for them.

        • @lud@lemm.ee
          link
          fedilink
          English
          2
          6 days ago

          Ordering components with unnecessarily small tolerances is stupid and a waste of money but of course they will complain if you can’t make the parts to the specifications.

          Why did you even take the order in the first place if you can’t manage to produce them to spec?

          • @Revan343@lemmy.ca
            link
            fedilink
            English
            13
            6 days ago

            of course they will complain if you can’t make the parts to the specifications.

            Why did you even take the order in the first place if you can’t manage to produce them to spec?

            Where did they say anything about not being able to make the parts to spec?

          • @ubergeek@lemmy.today
            link
            fedilink
            English
            11
            6 days ago

            Why did you even take the order in the first place if you can’t manage to produce them to spec?

            They were made to spec, but the specs were wrong.

    • @Krauerking@lemy.lol
      link
      fedilink
      English
      22
      6 days ago

      Yeah we managed to just put the slave workers behind a further layer of obfuscation. Not just relegated to their own quarters or part of town but to a different city altogether or even continent.

      Tech dreams have become about a complete lack of humanity.

      • Curious Canid
        link
        fedilink
        English
        22
        6 days ago

        I saw an article recently, I should remember where, about how modern “tech” seems to be focused on how to insert a profit-taking element between two existing components of a system that already works just fine without it.

    • @Yoga@lemmy.ca
      link
      fedilink
      English
      16
      6 days ago

      The system will only scale if there are enough desperate poor people. Otherwise it quickly become too expensive.

      You can also get MMORPG players to do it for pennies per hour for in-game currency or membership. RuneScape players would gladly control 5 ‘autonomous’ cars if it meant that they could level up their farming level for free.

      The game is basically designed to be an incredibly time-consuming Skinner box that takes minimal skill and effort, in order to maximize membership fees.

        • @Yoga@lemmy.ca
          link
          fedilink
          English
          6
          6 days ago

          The human operators are there for when the AI gets softlocked in a situation where it doesn’t know what to do and just sits there, not for regular driving.

      • @Usernameblankface@lemmy.world
        link
        fedilink
        English
        3
        6 days ago

        Packaging the job as a video game side quest is genius. Make it so the gamer has to do several simulated runs before they connect to an actual car, and impose expensive in-game consequences for messing it up.

        • @Yoga@lemmy.ca
          link
          fedilink
          English
          2
          6 days ago

          It doesn’t even need to be a side quest, just a second screen activity lol

          They’ll do it for pennies an hour for 12 hours a day.

    • Domi
      link
      fedilink
      English
      9
      edit-2
      6 days ago

      I thought the human operators only step in when the emergency button is pressed or when the car gets stuck?

      Do they actually get driven by people in normal operation?

      • Curious Canid
        link
        fedilink
        English
        8
        6 days ago

        The claim is that the remote operators do not actually drive the cars. However, they do routinely “assist” the system, not just step in when there’s an emergency.

        • @xthexder@l.sw0.com
          link
          fedilink
          English
          5
          6 days ago

          I think they’ve got one person watching dozens of cars, though; it’s not one per car the way it would be with human drivers.

    • @Usernameblankface@lemmy.world
      link
      fedilink
      English
      5
      6 days ago

      Has anyone found the places where the navigators work to see how it goes? Has a navigator shared their experience on the web somewhere?

      I am very curious what they are asked to do, for how many cars, and for how much money.

    • Dropper-Post
      link
      fedilink
      English
      5
      6 days ago

      I knew it: AI is just some guy in India responding to my queries.

  • Chaotic Entropy
    link
    fedilink
    English
    42
    6 days ago

    Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is more lax humans than it is amazing robots.

      • @frezik@midwest.social
        link
        fedilink
        English
        5
        5 days ago

        Raising the standards would result in 20-50% of the worst drivers being forced to do something else. If our infrastructure wasn’t so car-centric, that would be perfectly fine.

    • @littlebrother@lemm.ee
      link
      fedilink
      English
      4
      6 days ago

      :Looks at entire midwest and southern usa:

      The bar is so low in these regions you need diamond drilling bits to go lower.

        • @_synack@sh.itjust.works
          link
          fedilink
          English
          3
          5 days ago

          I have spent many years in both the midwest and the south.

          In some areas of the south, people drive extremely aggressively and there are lots of issues with compliance to various traffic laws but it is usually not difficult to get over if you need to. People will let you in. The zipper merge is a well-honed machine and almost everyone uses it and obeys it.

          In the midwest, drivers tend to be more docile, cautious, and lawful overall, but have an extreme sense of entitlement over their place in line. “How dare that person use that completely empty lane to get ahead of me! Can they not see there is a line!” They will absolutely not let you in. It does not matter if the zipper merge would improve traffic flow. It just is not going to happen.

    • Terrasque
      link
      fedilink
      English
      1
      6 days ago

      “You don’t have to be faster than the bear, you just have to be faster than the other guy”

  • @theluddite@lemmy.ml
    link
    fedilink
    English
    55
    edit-2
    7 days ago

    I am once again begging journalists to be more critical of tech companies.

    But as this happens, it’s crucial to keep the denominator in mind. Since 2020, Waymo has reported roughly 60 crashes serious enough to trigger an airbag or cause an injury. But those crashes occurred over more than 50 million miles of driverless operations. If you randomly selected 50 million miles of human driving—that’s roughly 70 lifetimes behind the wheel—you would likely see far more serious crashes than Waymo has experienced to date.

    […] Waymo knows exactly how many times its vehicles have crashed. What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash. Waymo has tried to address this by estimating human crash rates in its two biggest markets—Phoenix and San Francisco. Waymo’s analysis focused on the 44 million miles Waymo had driven in these cities through December, ignoring its smaller operations in Los Angeles and Austin.

    This is the wrong comparison. These are taxis, which means they’re driving taxi miles. They should be compared to taxis, not normal people who drive almost exclusively during their commutes (which is probably the most dangerous time to drive since it’s precisely when they’re all driving).
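    To see why the baseline matters, here’s a toy rate calculation. The Waymo figures are the ones quoted above (~60 airbag/injury crashes over ~50 million miles); both human baselines are invented for illustration, purely to show how the choice of comparison group swings the headline ratio:

```python
# Toy sketch: the same Waymo crash rate looks very different depending
# on which human baseline it is compared against.

def rate_per_million_miles(crashes: float, miles: float) -> float:
    """Normalize a crash count by miles of exposure."""
    return crashes / (miles / 1e6)

# From the quoted article: ~60 serious crashes over ~50M driverless miles.
waymo = rate_per_million_miles(60, 50_000_000)

# Hypothetical baselines (assumed numbers, not real data): commuters on
# dangerous commute miles vs. professional taxi drivers on similar routes.
commuters = rate_per_million_miles(250, 50_000_000)
taxi_drivers = rate_per_million_miles(90, 50_000_000)

print(f"vs commuters:    {commuters / waymo:.2f}x safer")
print(f"vs taxi drivers: {taxi_drivers / waymo:.2f}x safer")
```

    The ratio only means something if both rates cover the same kind of miles: dense city taxi miles vs. highway commute miles, and reported vs. unreported crashes.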

    We also need to know how often Waymo intervenes in the supposedly autonomous operations. The latest we have from this, which was leaked a while back, is that Cruise (different company) cars are actually less autonomous than taxis, and require >1 employee per car.

    edit: The leaked data on human interventions was from Cruise, not Waymo. I’m open to self-driving cars being safer than humans, but I don’t believe a fucking word from tech companies until there’s been an independent audit with full access to their facilities and data. So long as we rely on Waymo’s own publishing without knowing how the sausage is made, they can spin their data however they want.

    edit2: Updated to say that journalists should be more critical in general, not just about tech companies.

    • @nondescripthandle@lemmy.dbzer0.com
      link
      fedilink
      English
      27
      7 days ago

      Journalists aren’t even critical of police press releases anymore; most simply print whatever they’re told verbatim. It may as well just be advertisement.

      • @theluddite@lemmy.ml
        link
        fedilink
        English
        17
        7 days ago

        I agree with you so strongly that I went ahead and updated my comment. The problem is general and out of control. Orwell said it best: “Journalism is printing something that someone does not want printed. Everything else is public relations.”

      • Komodo Rodeo
        link
        fedilink
        English
        7
        6 days ago

        The meat of the true issue right here. Journalism and investigative journalism aren’t just dead; their corpses have been feeding a palm tree, like a pod of beached whales, for decades. It’s a bizarre state of affairs to read news coverage and come out the other side less informed, without reading literal disinformation. It somehow seems so much worse that they’re not just off-target, but that they don’t even understand why or how they’re fucking it up.

    • William
      link
      fedilink
      English
      12
      7 days ago

      I was going to say they should only be comparing them under the same driving areas, since I know they aren’t allowed in many areas.

      But you’re right, it’s even tighter than that.

      • @theluddite@lemmy.ml
        link
        fedilink
        English
        9
        7 days ago

        These articles frustrate the shit out of me. They accept both the company’s own framing and its selectively-released data at face value. If you get to pick your own framing and selectively release the data that suits you, you can justify anything.

    • Anthony
      link
      fedilink
      7
      6 days ago

      @theluddite@lemmy.ml @vegeta@lemmy.world
      to amplify the previous point, taps the sign as Joseph Weizenbaum turns over in his grave

      A computer can never be held accountable

      Therefore a computer must never make a management decision

      tl;dr A driverless car cannot possibly be “better” at driving than a human driver. The comparison is a category error and therefore nonsensical; it’s also a distraction from important questions of morality and justice. More below.

      Numerically, it may some day be the case that driverless cars have fewer wrecks than cars driven by people.(1) Even so, it will never be the case that when a driverless car hits and kills a child the moral situation will be the same as when a human driver hits and kills a child. In the former case the liability for the death would be absorbed into a vast system of amoral actors with no individuals standing out as responsible. In effect we’d amortize and therefore minimize death with such a structure, making it sociopathic by nature and thereby adding another dimension of injustice to every community where it’s deployed.(2) Obviously we’ve continually done exactly this kind of thing since the rise of modern technological life, but it’s been sociopathic every time and we all suffer for it despite rampant narratives about “progress” etc.

      It will also never be the case that a driverless car can exercise the judgment humans have to decide whether one risk is more acceptable than another, and then be held to account for the consequences of their choice. This matters.

      Please (re-re-)read Weizenbaum’s book if you don’t understand why I can state these things with such unqualified confidence.

      Basically, we all know damn well that whenever driverless cars show some kind of numerical superiority to human drivers (3) and become widespread, every time one kills, let alone injures, a person no one will be held to account for it. Companies are angling to indemnify themselves from such liability, and even if they accept some of it no one is going to prison on a manslaughter charge if a driverless car kills a person. At that point it’s much more likely to be treated as an unavoidable act of nature no matter how hard the victim’s loved ones reject that framing. How high a body count do our capitalist systems need to register before we all internalize this basic fact of how they operate and stop apologizing for it?

      (1) Pop quiz! Which seedy robber baron has been loudly claiming for decades now that full self driving is only a few years away, and depends on people believing in that fantasy for at least part of his fortune? We should all read Wrong Way by Joanne McNeil to see the more likely trajectory of “driverless” or “self-driving” cars.
      (2) Knowing this, it is irresponsible to put these vehicles on the road, or for people with decision-making power to allow them on the road, until this new form of risk is understood and accepted by the community. Otherwise you’re forcing a community to suffer a new form of risk without consent and without even a mitigation plan, let alone a plan to compensate or otherwise make them whole for their new form of loss.
      (3) Incidentally, quantifying aspects of life and then using the numbers, instead of human judgement, to make decisions was a favorite mission of eugenicists, who stridently pushed statistics as the “right” way to reason to further their eugenic causes. Long before Zuckerberg’s hot or not experiment turned into Facebook, eugenicist Francis Galton was creeping around the neighborhoods of London with a clicker hidden in his pocket counting the “attractive” women in each, to identify “good” and “bad” breeding and inform decisions about who was “deserving” of a good life and who was not. Old habits die hard.

      • @dogslayeggs@lemmy.world
        link
        fedilink
        English
        1
        6 days ago

        So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?

        As far as I know, Waymo has only been involved in one fatality. The Waymo was sitting still at a red light in traffic when a speeding SUV (going at an extreme rate of speed, according to reports) rammed it from behind into other cars. The SUV then continued into traffic where it struck more cars, eventually killing someone. That’s the only fatal accident Waymo has been involved in after 50 million miles of driving. But instead of making it safer for children, you would prefer more kids die just so you have someone to blame?

        • Anthony
          link
          fedilink
          2
          6 days ago

          @dogslayeggs@lemmy.world

          So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?

          No, this strawman is obviously not my argument. It’s curious you’re asking whether you understand, and then opining afterwards, rather than waiting for the clarification you suggest you’re seeking. When someone responds to a no-brainer suggestion, grounded in skepticism but perfectly sensible nevertheless, with a strawman seemingly crafted to discredit it, one has to wonder if that someone is writing in good faith. Are you?

          For anyone who is reading in good faith: we’re clearly not talking about one hypothetical death, since more than one real death involving driverless car technology has already occurred, and there is no doubt there will be more in the future given the nature of conducting a several-ton hunk of metal across public roads at speed.

          It should go without saying that hypothetical auto wreck fatalities occurring prior to the deployment of technology are not the fault of everyone who delayed the deployment of that technology, meaning in particular that these hypothetical deaths do not justify hastening deployment. This is a false conflation regardless of how many times Marc Andreesen and his apostles preach variations of it.

          Finally “ban”, or any other policy prescription for that matter, appeared nowhere in my post. That’s the invention of this strawman’s author (you can judge for yourself what the purpose of such an invention might be). What I urge is honestly attending to the serious and deadly important moral and justice questions surrounding the deployment of this class of technology before it is fully unleashed on the world, not after. Unless one is so full up with the holy fervor of technoutopianism that one’s rationality has taken leave, this should read as an anodyne and reasonable suggestion.

          • @dogslayeggs@lemmy.world
            link
            fedilink
            English
            1
            6 days ago

            I was asking in good faith because the way you talk is not easily comprehensible. I can barely follow whatever argument you are trying to make. I think you are trying to say that we shouldn’t allow them on the road until we have fully decided who is at fault in an accident?

            Also, only one death has occurred so far involving driverless cars: a speeding SUV rammed into a stopped driverless car and then continued on to hit 5 other cars, killing someone in one of them. That’s it. The only death involved a driverless car sitting still, not moving, not doing anything… and it wasn’t even the car that hit the car in which the person died. So I would say deaths that are actually the fault of a driverless car are still hypothetical.

      • @theluddite@lemmy.ml
        link
        fedilink
        English
        1
        6 days ago

        Honestly I should just get that slide tattooed to my forehead next to a QR code to Weizenbaum’s book. It’d save me a lot of talking!

  • sunzu2
    link
    fedilink
    29
    6 days ago

    But when it does crash, will Google accept the liability?

    • @Critical_Thinker@lemm.ee
      link
      fedilink
      English
      14
      6 days ago

      I hate felon musk but I honestly believe their self driving tech is safer than humans.

      Have you seen the average human? They’re beyond dumb. When they’re in cars, it’s like the majority of them are just staring at their cell phones.

      I don’t think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.

      • kingthrillgore
        link
        fedilink
        English
        4
        6 days ago

        Bro I saw a video of their car drive through a wall and hand the controls back to the driver. No, it absolutely is not.

        • @Critical_Thinker@lemm.ee
          link
          fedilink
          English
          1
          edit-2
          5 days ago

          When was the last time you saw a “wall” erected on a freeway, perfectly painted to mimic the current time of day, road, weather, etc.? I’m not talking about that staged example; I’m talking about the real world.

          The answer is never.

          Yes, the optical sensors are fooled by an elaborate ruse that doesn’t exist in real world operating conditions on a highway.

          I still argue that for most normal driving circumstances, it is massively safer than humans who malfunction constantly.

          I will never, ever buy a tesla so long as felon musk has any ownership in it whatsoever. The guy is irredeemable. Still have way more faith in self driving tech overall (industry wide) than human drivers though. That’s the work of engineers, not an asshole.

      • socsa
        link
        fedilink
        English
        3
        6 days ago

        Human drivers have an extremely long tail of idiocy. Most people are good (or at least appropriately cautious) drivers, but there is a very small percentage of people who are extremely aggressive and reckless. The fact that self driving tech is never emotional, reckless or impaired pretty much guarantees that it will always statistically beat humans, even in somewhat basic forms.

      • @cley_faye@lemmy.world
        link
        fedilink
        English
        2
        6 days ago

        I honestly believe their self driving tech is safer than humans.

        That’s how it should be. Unfortunately, one of the main decision makers on Tesla’s self-driving software is doing their best to make it perform worse and worse with every update.

  • @kerrigan778@lemmy.world
    link
    fedilink
    English
    24
    edit-2
    6 days ago

    Unprofessional human drivers (yes, even you) are unbelievably bad at driving; it’s only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.

      • @Takumidesh@lemmy.world
        link
        fedilink
        English
        3
        edit-2
        6 days ago

        I find the scariest people on the road to be the arrogant ones that think they make no mistakes.

        I wouldn’t consider anyone who hasn’t done at least a dozen track days and experienced several different extreme scenarios (over/understeer, looping, wet grass at speed, airtime or at least one or more wheels off the ground, high-speed swerving, snap oversteer, losing systems like the brakes, the engine, or the steering-wheel lock engaging, etc.) to be remotely prepared to handle a car going more than 25 or so mph. An extreme minority of drivers are actually prepared to handle an incoming collision in order to fully mitigate a situation. And that only covers the mechanical skill of piloting the car; it doesn’t even touch on the theoretical and practical knowledge (rules of the road, including obscure and unenforced ones), and it definitely doesn’t broach the discipline required to actually put it all together.

        If a driver has never been trained, and has no understanding of what will happen in an extreme scenario in a car, how can we consider them trained or sufficiently skilled?

        We don’t let pilots fly without spending time in a simulator, going over emergency scenarios and being prepared for when things go sideways. You can’t become an airline pilot if you don’t know what happens when you lose power.

        We let sub par people drive because restricting it too much would be seen as discrimination, but the overwhelming majority of people are ill equipped to actually drive.

        • @kameecoding@lemmy.world
          link
          fedilink
          English
          1
          edit-2
          6 days ago

          I hope this is a copypasta lmao. If you actually go to a training course where you learn to handle oversteer and understeer and they spin you out, they tell you that you have about fuck-all chance of recovering. Even when you have warning, you know it’s coming, and you’re at a fairly low speed, you have very little chance of countersteering correctly.

          Here is what you actually have to do to drive safely:

          1, don't be a dumbass who thinks you need 12 years of Formula 1 training to drive on the road; if anything, believing that training can prepare you for extreme situations and that you can handle them is what's arrogant and dangerous.

          2, don't be a dumbass: adjust your speed to driving conditions

          3, don't be a dumbass: don't push the limits of your car on public roads

          4, drive defensively: assume people on the road are idiots who will fuck up, and drive accordingly.

          5, learn how your car works, e.g. even if you have an e-handbrake, you can still pull it and it will stop the car

          6, and most importantly, because people don't know how to do it: learn to emergency brake, hard enough that your hazard lights come on.

          • @Takumidesh@lemmy.world
            link
            fedilink
            English
            1
            edit-2
            6 days ago

            I completely disagree.

            You are using the handbrake as an example. 95 percent of people (including you, evidently) don't even understand that the handbrake is not an emergency brake, don't get how its behavior works, or the fact that it's meant to be used as a parking brake. I consistently see people slam their parking pawls every time they get out of their car. (Not to mention that on most modern cars it doesn't even work while you are driving and has no modulation, as it's just a button.)

            If not being an idiot was good enough to drive a car, then it wouldn’t be so deadly. It’s also possible to fly a plane with common sense, but you wouldn’t be happy if your pilot told you they don’t have training.

            Driving isn’t easy, it’s just that we accept an absolutely catastrophic amount of accidents as a cost of doing business.

            • @kameecoding@lemmy.world
              link
              fedilink
              English
              1
              edit-2
              6 days ago

              It is an emergency brake when your brakes fail, you donut. Again, it's part of safe-driving courses, which you clearly didn't take.

              I am also from Europe, where drivers are much better than in the US. It's not my fault your country absolutely sucks at training its drivers despite being entirely reliant on them.

  • @Imgonnatrythis@sh.itjust.works
    link
    fedilink
    English
    29
    6 days ago

    No shit. The bar is low. Humans suck at driving. People love to throw FUD at automated driving, and it's far from perfect, but the more we delay adoption, the more lives are lost. Anti-automation on the roads is up there with anti-vaccine sentiment in my mind. Fear, plus the incorrect "I'm not the problem, I'm a really good driver" mentality, will inevitably delay automation unnecessarily for years.

    • @Eczpurt@lemmy.world
      link
      fedilink
      English
      16
      6 days ago

      It’d probably be better to put a lot of the R&D money into improving and reinforcing public transport systems. Taking cars off the road and separating cars from pedestrians makes a bigger difference than automating driving.

        • @Imgonnatrythis@sh.itjust.works
          link
          fedilink
          English
          2
          6 days ago

          Sure, that's great, but read the room. It's like advocating for gun legislation in the US: it can only go so far realistically. The vast majority of US cities are built around automotive infrastructure, and the culture is very much anti-public-transport. Fixing that requires heavy government-level buy-in, while car automation can be driven primarily by industry. One can happen in a major way in a few years; the other will take decades, if it happens at all. Personally I'm all for it, but it's such a different discussion that it just comes across as a distraction when talking about very real delays in car automation, and it's not a valid criticism of moving forward and lowering the barriers to fully automated vehicle infrastructure.

    • @Obi@sopuli.xyz
      link
      fedilink
      English
      6
      6 days ago

      That, and the inevitable bureaucratic nightmare that awaits for standardising across makes and updating the infrastructure.

  • @blazeknave@lemmy.world
    link
    fedilink
    English
    10
    6 days ago

    I used to hate them for being slow and annoying. Now they drive like us, and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area do: a massive left over a solid yellow, with no stop sign, with me coming right at it before it even began accelerating into the intersection.

  • @AA5B@lemmy.world
    link
    fedilink
    English
    10
    6 days ago

    As a techno-optimist, I always expected self-driving to quickly become safer than human, at least in relatively controlled situations. However I’m at least as much a pessimist of human nature and the legal system.

    Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how do we get past humans taking advantage of them, and the massive liability for the remaining accidents?