The Unclear Impact

https://fortune.com/2022/06/10/elon-musk-tesla-nhtsa-investigation-traffic-safety-autonomous-fsd-fatal-probe/

> On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle.

> CEO Elon Musk has often claimed that accidents cannot be the fault of the company, as data it extracted invariably showed Autopilot was not active in the moment of the collision.

Sigh. 🤦‍♀️

@rysiek I think that, more interesting than the implications for the Autopilot crash statistics themselves, this is an excellent case against self-driving cars. Drivers must be liable for their faults in accidents, but when companies that operate self-driving cars refuse and avoid liability, they can't be allowed to continue operating.

Of course the Apartheid Richie Rich goes with:

WeLl AcKsHuLlY tHe AuToPiLoT wAs TeChNiCaLlY oFf!!1!

@rysiek marketing vs what Tesla vehicles really offer

@rysiek that's a prison sentence there

@cassidymcgurk I mean... I can see how "disengage in case data cannot be trusted" is a reasonable fail-safe. And so in these cases it might have been that this is what happened, for whatever reason.

In the end, it is reasonable to hold the driver accountable. It's their car, it's their responsibility on the road.

It's still a horribly bad look for Tesla, even if the autopilot disengage thing was not implemented *on purpose* to avoid legal responsibility.

@rysiek Public road? Public code.

@rysiek seems to me that there's an overwhelming need (and public interest in the legal sense) to be able to inspect models, training data, and control software for autonomous vehicles running on public roads. what blind spots does the model have? are builds reproducible? verifiable? how secure is the supply chain (terrifying)?

@vt52 yes please, all of the above is spot-on.

@dentangle @rysiek Very sharp.

@rysiek Like the autopilot shouts "Jesus, take the wheel!" but there's only the driver.

@rysiek @cassidymcgurk even with that, the vehicle should always stop safely when the guidance/control/safety system is disabled (or, on handover to another system, e.g. a human, it should have made it possible for that system to stop safely, like by keeping enough clear distance ahead), even when new data is obviously wrong, since old data already needs to allow for a safe stop.

or at least that's what I’d expect from knowing other (non-road) safety systems …

@rysiek@mastodon.technology @cassidymcgurk@mastodon.ie

I would somewhat disagree with this: "it is reasonable to hold the driver accountable."

you want to blame someone for something they could not have known about.

the systems should be designed to fail safely.
i don't know precisely what that would mean, but i would guess it would be something like...

put on the 4 ways (hazard flashers) and
bring the vehicle to a stop.

or something like that.
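A minimal sketch of that "fail safely" idea, under stated assumptions: the vehicle interface below is entirely hypothetical (it is not any real automotive API), and the point is only the shape of the behavior, turn on the hazards and decelerate to a standstill rather than silently dropping control.

```python
from dataclasses import dataclass

# Hypothetical stand-in for a vehicle's actuation layer; not a real API.
@dataclass
class VehicleInterface:
    speed_mps: float = 0.0
    hazards_on: bool = False

    def set_hazard_lights(self, on: bool) -> None:
        self.hazards_on = on

    def apply_gentle_braking(self, decel_mps2: float, dt: float) -> None:
        # Integrate a constant deceleration over one time step.
        self.speed_mps = max(0.0, self.speed_mps - decel_mps2 * dt)

def minimal_risk_stop(vehicle: VehicleInterface,
                      decel_mps2: float = 2.0,   # assumed gentle braking rate
                      dt: float = 0.1) -> None:
    """On losing trust in sensor data, do not just hand back control:
    warn surrounding traffic and decelerate to a standstill."""
    vehicle.set_hazard_lights(True)              # "put on the 4 ways"
    while vehicle.speed_mps > 0.0:
        vehicle.apply_gentle_braking(decel_mps2, dt)

car = VehicleInterface(speed_mps=25.0)           # roughly 90 km/h
minimal_risk_stop(car)
assert car.speed_mps == 0.0 and car.hazards_on
```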

@logan @rysiek @cassidymcgurk IIRC a few years ago a few of the major German car manufacturers announced that they would assume full responsibility for driving errors of their (prospective) autonomously self-driving cars. In my eyes, that is what is necessary to build confidence, even if it takes much longer to get those cars on the road. >>

@logan @rysiek @cassidymcgurk Tesla's "move fast and break things" approach is reckless and irresponsible. It takes a weird legal environment like the USA's to have them *not* fined into oblivion for this.

@trisschen @rysiek @cassidymcgurk I think one of the tricks here is that Autopilot is not classified as a safety(-critical) system, since it is "only driver assistance"

@szakib @rysiek @cassidymcgurk I would still expect such deactivation safety from such a system, including more basic lane assistants, distance assistants, or stop-and-go assistants

heck, thinking about it, I’d even expect of a speed-limiting system that disengaging it means no traction until the controls are set to 0
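To make that expectation concrete, here is a toy sketch of such a disengagement interlock, with made-up names (nothing here reflects any real speed-limiter implementation): after the limiter disengages, throttle output stays latched at zero until the driver's controls have returned to 0 once.

```python
# Toy interlock: disengaging the limiter latches traction at zero
# until the driver's control has been fully released once.
class ThrottleInterlock:
    def __init__(self) -> None:
        self.blocked = False

    def on_limiter_disengaged(self) -> None:
        self.blocked = True              # disengaging means: no traction yet

    def allowed_throttle(self, pedal_position: float) -> float:
        """pedal_position in [0.0, 1.0]; returns the throttle actually applied."""
        if self.blocked:
            if pedal_position == 0.0:
                self.blocked = False     # controls set to 0: release the latch
            return 0.0
        return pedal_position

interlock = ThrottleInterlock()
interlock.on_limiter_disengaged()
assert interlock.allowed_throttle(0.4) == 0.0   # still blocked
assert interlock.allowed_throttle(0.0) == 0.0   # releasing the pedal unlatches
assert interlock.allowed_throttle(0.4) == 0.4   # normal operation resumes
```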

@szakib @rysiek @cassidymcgurk I guess that's another symptom of motorcars being inherently unsafe by design

i.e. movement to brake meaning accelerate in a nearby control, brake releasing on its own when losing contact with its control, likelihood of accelerating by falling on the control, likelihood of something blocking the brake controls, lack of emergency brake, common maneuvers in theory requiring a second person due to lack of sight, …

@trisschen @szakib @cassidymcgurk yeah, cars are not ready for regular people and everyday use, the technology needs to be developed way further. 😉

That said, Tesla is particularly bad even at learning from the car industry's past mistakes. They just waltz into the space and decide to ignore decades of accrued knowledge.

Re: self-driving cars, accidents

@rysiek @cassidymcgurk what is weird is: how come disengaging is not an admission that decisions made by the self-driving component up to that point have led to an unsafe situation? like we’d expect a human driver to drive defensively and not intentionally get into a dangerous situation

(test cases, and even whole methods to generate such test cases, have been proposed in academic literature as well as by some industry players, e.g., to make sure the system can reason about potential objects obscured from its vision. so there’s at least an expectation for self-driving components to be programmed this way. nevertheless, tesla – and i presume other implementers – chose the easy way out, blaming humans for the shortcomings of their systems)
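One concrete form of that occlusion-aware reasoning, as a back-of-the-envelope sketch: cap speed so the vehicle can always stop before reaching a point where a hidden road user could appear. The braking and reaction-time figures below are assumptions for illustration, not values from any standard or paper.

```python
import math

def max_safe_speed(occlusion_distance_m: float,
                   decel_mps2: float = 6.0,   # assumed hard-braking capability
                   reaction_s: float = 0.5) -> float:
    """Largest v (m/s) with v*t_react + v**2 / (2*a) <= d, i.e. the car can
    stop within the d metres of road it can actually see past an occlusion."""
    a, t, d = decel_mps2, reaction_s, occlusion_distance_m
    # Positive root of v**2 + 2*a*t*v - 2*a*d = 0.
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# e.g. a parked truck hides the road 20 m ahead:
print(max_safe_speed(20.0))   # ~12.8 m/s, about 46 km/h
```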


Re: self-driving cars, accidents

@szakib @trisschen @rysiek @cassidymcgurk given the stringent standards for certifying safety-critical systems (even in the automotive domain, where cost savings are otherwise foremost), this is not surprising: it’s highly unlikely one could devise a way to demonstrate the safety of an autopilot-like system with current system architectures

which, in a sane world, would mean deploying no such systems in production at all
