I’m curious: what’s the ratio of autopilot deaths to miles driven? Because as much as we rightfully talk shit about self-driving cars, especially Teslas, humans are notoriously fucking awful at driving, and even if the autopilot fucks up and kills people sometimes, it’s still a net good as long as it kills fewer people per mile driven.
But also train good car bad. :train-shining:
Not really, because at least the mistakes stay in the hands of people. Moving them over to machines opens the door to actuarial calculations, with insurance companies controlling the ethical decisions an automated driver makes.
I’ll be honest, I don’t super care about the ultimate reasons people are being killed by cars; I just care that fewer people get killed by cars.