• Diplomjodler@lemmy.world · 8 days ago

    Interesting video, but he played a bit loose with the facts. He talked at great length about the case where a woman was killed by a self-driving car, but left out a crucial detail: there was a safety driver on board who didn’t do her job. Doesn’t exactly help with credibility.

    • FireRetardant@lemmy.world · 8 days ago

      I don’t think leaving out that fact is all that relevant to the overall point he was trying to make here. These companies are claiming this technology will be safe, and it clearly isn’t. A safety driver’s reaction time is also likely slower than a regular driver’s, since they may not be as connected to the vehicle through the steering wheel and pedals while it’s in self-driving mode. The tech in the car may also have influenced the safety driver’s judgement of the situation.

      • pc486@sh.itjust.works · 8 days ago

        Well, formerly operating companies. The Uber and Cruise incidents stopped both of them dead: Uber left the business entirely, and Cruise had its license to operate revoked.

        That’s just omitting info. There’s also straight-up wrong stuff, like the claim that residents don’t want it. As crazy as it sounds, at least in SF, the residents’ reps wrote the regulation law, and there hasn’t been a ballot measure rejecting self-driving cars (at least K passed). The majority want to see these cars. Also, Facebook dumped their “move fast” motto a decade ago because of how badly it went (the self-harm problems).

        It’s unfortunate, too. I like Jason’s rants, but it’s too distracting when he gets facts wrong that a quick Google search would have caught.

        • FireRetardant@lemmy.world · 8 days ago

          Did the residents really have informed consent? Were they told there was a risk of being dragged under the cars when they agreed to these vehicles? And what about the people with apartments near the honking parking lots? Do they want that happening every night?

          • pc486@sh.itjust.works · 7 days ago

            The honking thing specifically is another skewed fact. The neighbors want the Waymos; they just had a hard time getting hold of the right folks at Waymo. That includes Sophia Tung, the neighbor who set up the honking video stream that Jason used.

            As a local, I can say for certain that the majority of SF wants the cars there. There’s more resistance further down the peninsula, but it’s intermixed with anti-taxi messaging, so it’s hard to tell whether it’s about the cars or about “those kinds of people” having access to their city.

            San Francisco neighbors say repeated Waymo honking is keeping them up at night

            Christopher Cherry, who lives in the building next door, said he was “really excited” to have Waymo in the neighborhood, thinking it would bring more security and quiet to the area.

            The residents who spoke with NBC Bay Area said they are not opposed to having the Waymo cars nearby. But they say they want to see a more neighborly response from the new autonomous vehicle company on the block.

            “We love having them there, we just would like for them to stop honking their horn at four in the morning repeatedly,” Cherry said.

            San Francisco neighbors say Waymo honking continues, global audience follows along live

            The incidents were captured on resident Sophia Tung’s YouTube live stream

            Tung and many of her neighbors said that they are Waymo customers and actually like the Waymo technology. But what they don’t like is the repeated, overnight noise.

    • cmhe@lemmy.world · 8 days ago

      A lot of this is a game of probabilities, and I don’t think we actually have the numbers.

      For instance, suppose a normal human driver without any automation can prevent 80% of dangerous situations, but the automation can only prevent 50%, and the safety driver, because of inattention, catches only 50% of the situations the automation misses. That works out to just 75% of dangerous situations prevented, so the automation-plus-safety-driver setup is worse (see the sketch below).

      Maybe someone knows the real probabilities; I don’t.
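
      A minimal sketch of that arithmetic, using the made-up numbers from the example above (not real-world data; every rate here is an assumption for illustration):

      ```python
      # Hypothetical rates, mirroring the example above -- nobody in this thread
      # claims these are the real-world numbers.
      p_human_alone = 0.80          # unaided human driver prevents 80% of dangerous situations
      p_automation = 0.50           # the automation alone prevents 50%
      p_driver_catches_rest = 0.50  # inattentive safety driver catches 50% of what the automation misses

      # The safety driver only matters in the cases the automation fails to prevent.
      p_combined = p_automation + (1 - p_automation) * p_driver_catches_rest

      print(f"automation + safety driver: {p_combined:.0%}")     # 75%
      print(f"unaided human driver:       {p_human_alone:.0%}")  # 80%
      ```

      Under those assumed numbers, the automated car with a safety driver prevents fewer dangerous situations than the unaided human, which is the point: without the real probabilities, you can’t say which setup is safer.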

    • WalrusDragonOnABike [they/them]@lemmy.today · 7 days ago

      That’s a real-world example of what happens when you have self-driving (even in beta mode, with some of the safety features that could have prevented it turned off). Obviously the driver didn’t manually override and stop the car before it hit the person. It doesn’t even need any searching: the story makes that clear just by mentioning that the car hit someone.