I think your statement and the fear of self driving can both be true at the same time.
Self driving is safer than humans most of the time… but not all the time. Nothing is perfect.
Self driving currently assumes that a human can intervene when it fails. It assumes that a human is present and not eating a bowl of cereal and applying mascara. It assumes that the human is actually paying attention, in a situation where they usually don’t have to because self driving is usually safer.
Yes, self driving is statistically safer. Yes, self driving will one day be perfect.
But I don’t think we can fault anyone for being worried about self driving, especially with companies like Tesla, who sell the promise that you don’t really have to pay attention… even though you kinda have to right now.
If we want to get really technical, NHTSA is requiring all new cars to have automatic emergency braking, so in this situation the car should slam on the brakes. Even if it can’t slow down fast enough to prevent a crash, it should slow down enough to minimize it.
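To put rough numbers on “slow down enough to minimize it”: crash energy scales with the square of speed, so shedding even part of the speed before impact helps a lot. Here’s a back-of-the-envelope sketch; the initial speed, deceleration, and braking distance are just assumed for illustration, not figures from any real AEB system.

```python
# Rough illustration: partial braking still matters because crash energy ~ v^2.
# All numbers below are assumptions for the sake of the example.

v0 = 100 / 3.6          # initial speed: 100 km/h converted to m/s
decel = 8.0             # assumed hard-braking deceleration on dry road, m/s^2
braking_distance = 30.0 # assumed distance available before impact, m

# Basic kinematics: v^2 = v0^2 - 2*a*d; clamp at 0 if the car stops in time
v_impact_sq = max(v0**2 - 2 * decel * braking_distance, 0)
v_impact = v_impact_sq ** 0.5

energy_reduction = 1 - v_impact_sq / v0**2  # kinetic energy scales with v^2

print(f"Impact speed: {v_impact * 3.6:.0f} km/h (down from 100 km/h)")
print(f"Crash energy reduced by roughly {energy_reduction:.0%}")
```

With those assumed numbers, the car still hits at around 60 km/h, but the crash carries roughly 60% less energy than it would have at full speed. That’s the whole point of mandating automatic braking even when it can’t fully prevent the collision.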
Is this particular Tesla covered by that rule? Probably not. But I think we can see why this approach is infinitely safer and more ethical than saying “good luck, control this car on your own or enjoy this 100 km/h crash otherwise.”