Tesla driver who killed 2 people while using autopilot must pay $23,000 in restitution without having to serve any jail time

The case is believed to be the first time that U.S. prosecutors have brought felony charges against a motorist who was using a partially automated driving system.

    • jonne@infosec.pub · 11 months ago

      Yeah, judging by the article, Tesla should take some responsibility here. Not that the driver should get off; if your car is blowing through a red light at 120 km/h, you're just not paying proper attention.

      • HerrBeter@lemmy.world · 11 months ago

        Sure, I'd prefer to know exactly how much time passed in between. Was it 2 seconds or 25? But my premise is that this shouldn't happen in the software at all. I recall reading some time ago that Teslas shut off the software moments before a collision, too late for the driver to recover, but I'd have to double-check that. All the better to blame the customer.

        Automakers should not be allowed to use the unsuspecting public as test subjects for their experimental software; a car quickly becomes a 1-4 ton death machine. But I think we agree on that.

        • jonne@infosec.pub · 11 months ago

          Oh yeah, I work in software development myself. No way I'd trust my life to something like Tesla's autopilot, which is perpetually in beta, relies on just the camera feed, and is basically run by a manager who has clear issues with over-promising and under-delivering (among other things). You can get away with shit like that on a website or mobile app, but these are people's lives.

        • HerrBeter@lemmy.world · 11 months ago

          Auto manufacturers must be held liable for faulty software. If it's not safe, it doesn't go on the road.

          • RushingSquirrel@lemm.ee · 11 months ago

            Only if the software caused the accident or prevented the driver from avoiding one. Here the software's only fault was not slowing down after exiting the highway (which, from experience, must be a very specific situation, because it most certainly does slow down normally); the driver could have disengaged autopilot or applied the brakes to stop at the red light. The software specifically states that it can't stop at red lights and alerts the driver when it's about to run one. The fault here is 100% the driver's.

          • AnneBonny@lemmy.dbzer0.com · 11 months ago

            Are manufacturers solely responsible for safety, or lack thereof, on the public roads of the USA?

            I don’t believe they are.

            • FrostyTheDoo@lemmy.world · edited 11 months ago

              Solely? No. But if the airbag, seatbelt, or self-driving autopilot feature that they created contributed to someone’s death, they are partially responsible and should face consequences or punishments. Especially if they market it as a safe feature.