Tesla under investigation by California attorney general over Autopilot safety, marketing

The California attorney general is investigating Tesla over the electric car company’s driver assistance technology, CNBC has learned.

    • XTornado@lemmy.ml

      About the pileup case, don’t get me wrong: if the Tesla hadn’t stopped, it wouldn’t have caused the accident. I agree with that, it’s terrible, and the feature should be disabled until these issues are solved.

      But in the end, most of the pileup was caused by people being people. It could have been a normal car, or even a Tesla stopping for a perfectly good reason, and the pileup would have happened anyway. The first cars stopped fine, but then other cars that didn’t keep a safe distance, were going too fast, or were distracted started crashing.

      • ghariksforge@lemmy.world

        You can see the pileup in the video. The Tesla phantom-brakes for no reason, and the cars driving behind slam into it.

        • shinjiikarus@mylem.eu

          They wouldn’t have slammed into it if they’d kept a safe distance, as @[email protected] wrote. I’m in no way defending Tesla’s “Autopilot”; it should be banned until it passes a very difficult test proving true self-driving capability and multiple layers of fail-safes (which it can’t right now). But examples where an Autopilot Tesla did something stupid and other people made human errors are disingenuous: if somebody drops their cigarette and brakes unexpectedly, and the cars behind don’t keep their distance and slam into them, the cause of the accident is not the cigarette but the unsafe following distance.

          • jtk@lemmy.sdf.org

            It’s not just Teslas whose automatic braking can fail. Even basic cars have that “feature” now. I fucking hate it. My entire family was in the car and we almost got rear-ended super hard because ours kicked in for the right reasons, but it braked super far back from the incident and fast as hell. The person behind us had less than a second to react; luckily there was no one in the next lane and they were able to swerve around us. I’ve been terrified to drive it ever since.

            I agree no one should be tailgating, but the algorithm needs to factor that in when it’s happening. I knew exactly how I wanted to handle the situation, but the stupid car prevented me from doing it the safer way.

    • Imgonnatrythis@lemmy.world

      Human drivers aren’t safe, and they’re killing people. The FUD around this hyper-focuses scrutiny on Tesla. The real shame is leaving it all up to Tesla: governments should be throwing huge piles of money at R&D, best-practice standards, and breaking down barriers to testing and deployment. Thousands of lives are being lost because we aren’t moving fast enough on safe self-driving tech. We have the ability to do this, and it’s a shame it’s going so slowly at the cost of driver and pedestrian lives.

  • realbaconator@lemmy.world

    The technology definitely still needs work, but as it is, I trust Autopilot more than I trust the average commuter I have to deal with. And I deal with a lot.

    • Thorny_Thicket@sopuli.xyz

      Agreed. Anyone who claims FSD is a worse driver than a human is just objectively wrong and doesn’t know what they’re talking about. Is it flawless? No. Is it better than the best human driver? Probably not. Is it miles ahead of the average driver? Without question.

      • stepintomydojo@sh.itjust.works

        Autopilot on freeways? Definitely better than the average driver. FSD on freeways? Same thing. I rely on those constantly, and I also get frustrated when people complain about AP being unsafe.

        FSD on streets? Definitely still worse than the average driver, at least in places that don’t have perfectly laid-out street grids and properly painted lane lines, which is what I deal with. I can’t make it through a drive on streets without disengaging multiple times.

        The problem is these three different things get lumped together in conversations/articles all the time.

        • Thorny_Thicket@sopuli.xyz

          Yeah, the article seems to use Autopilot and FSD interchangeably, but I was talking about FSD, and I expect everyone else talking about “autopilot” to mean FSD as well, because the Autopilot feature in itself is nothing special.

          I’m basing my opinion on the videos I’ve seen from AI DRIVR on YouTube, who really seems to put it in challenging spots. But I don’t have any first-hand experience myself, so I can’t argue against that.

  • Haha@lemmy.world

    Is this place just about Elon? There are other things happening too.