• lud@lemm.ee · 10 months ago

      It’s consensual if you buy it, though.

      Calling it a war crime is slightly extreme.

      • spudwart@spudwart.com · 10 months ago

        Except the other drivers on the road aren’t all in Teslas, yet they are non-consensually, and possibly even unknowingly, part of this experiment.

      • there1snospoon@ttrpg.network · 10 months ago

        If you hit another motorist or pedestrian, it’s no longer consensual.

        War crime is a tad much, sure. Let’s just make it a felony.

  • Poggervania@kbin.social · 10 months ago

    It’s funny how some of Elongated Muskrat’s testing and experiments involve the subjects dying.

    Monkeys died in the Neuralink experiments, and now humans are dying in these autopilot tests!

  • rsuri@lemmy.world · edited · 10 months ago

    Random question I’ve always wondered about, in case anyone here is more familiar with these systems than I am. My understanding is that Autopilot relies exclusively on optical sensors, and image recognition tends to rely on large amounts of training data to recognize particular objects. But what if there’s an object that isn’t in the training data, like a boulder in a weird shape? Can Autopilot tell anything is there at all?
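
    For anyone curious about the underlying issue, here is a minimal sketch in plain Python with NumPy. It is not anything like Tesla’s actual perception stack; the class list, prototype vectors, and classify function are invented for illustration. It just shows why a closed-set classifier is a worry here: it can only ever answer with one of the classes it was trained on, so an unfamiliar object still gets mapped to the nearest known class, sometimes with misleading confidence.

        # Toy closed-set classifier: every input is forced into one of the known
        # classes, even if it resembles none of them. Purely hypothetical; not
        # how any real driver-assistance system is implemented.
        import numpy as np

        # Invented feature "prototypes" for a fixed list of known object classes.
        PROTOTYPES = {
            "car":          np.array([1.0, 0.1, 0.0]),
            "pedestrian":   np.array([0.1, 1.0, 0.0]),
            "traffic_cone": np.array([0.0, 0.2, 1.0]),
        }

        def classify(features: np.ndarray) -> tuple[str, float]:
            """Return the closest known class and a softmax-style 'confidence'."""
            names = list(PROTOTYPES)
            dists = np.array([np.linalg.norm(features - PROTOTYPES[n]) for n in names])
            scores = np.exp(-dists)          # closer prototype -> higher score
            probs = scores / scores.sum()
            best = int(np.argmax(probs))
            return names[best], float(probs[best])

        # A weirdly shaped boulder isn't in the class list, yet the classifier
        # still reports a known label instead of "unknown obstacle".
        weird_boulder = np.array([0.6, 0.5, 0.4])
        print(classify(weird_boulder))       # -> ('car', ...) despite being no car

    In practice, vision-only systems typically also estimate depth and drivable free space rather than relying on per-class detection alone, which is the gap the question is pointing at.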