It's frustrating to see that car companies see themselves as "partners" to the NTSB or other investigators.
Self-driving car companies should have only one obligation: provide all the relevant crash data to the investigators, then step out of the way and keep their mouths shut until the investigation concludes.
I've been warning for a couple of years now that this would become a problem if not addressed early on, and that self-driving car companies need to be liable for crashes in which investigators find their systems at fault. But we haven't even established that much.
I also think it's inevitable that any "self-driving" system not certified by a government body as Level 4 or higher will have to be banned from public roads. Sub-Level 4 self-driving is just not safe, and the very situations in which carmakers argue it would reduce accidents are incompatible with those companies' own ToS.
For instance, let's say someone is drunk and has a Level 2 car (like a Tesla). By Tesla's own logic, such a person should enable self-driving, since on average that reduces the likelihood of a crash compared to the drunk person driving themselves home.
But Tesla's own ToS requires that person to remain constantly vigilant, keep their hands on the wheel at all times, not fall asleep, and so on. How likely is that for a drunk person? My guess is: not very.
If Autopilot malfunctioned the way it did in the latest crash, that drunk driver would die, because there's no way a drunk person could take over the wheel and drive safely when even sober people can't seem to manage it once their attention has lapsed for a while.
> It's frustrating to see that car companies see themselves as "partners" to the NTSB or other investigators.
That is how the NTSB "party" system is run [0]. Contrary to what is shown in movies such as Flight (2012) and Sully (2016), an NTSB investigation is not an adversarial process, and it is not a criminal investigation. Additionally, the NTSB's determination of probable cause (an opinion of the board) cannot be used as evidence in civil litigation.
Asiana 214 [2] is a good example of the NTSB being somewhat divided over whether a design flaw in the 777's autopilot or "pilot human performance" was the cause of the crash.
IMHO, both the autopilot design and the pilot's performance were causal factors in that crash. It is probable the NTSB will determine that both the Tesla driver and Autopilot were causal factors in this one.
It's already illegal to drive a car drunk, including a Tesla with Autopilot engaged.
This doesn't stop drunk drivers from ending lives every day, including the lives of others who couldn't reasonably have done anything to avoid it. We're just used to that. Self-driving cars promise a future with less death, though not its complete elimination. It's a good future.