The article notes that self-driving cars will have to be programmed with some sort of "ethics". The big question is what precise code of ethics to adopt:
How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?
The answers to these ethical questions matter because they could have a big impact on how self-driving cars are accepted in society. After all, who would buy a car programmed to sacrifice its owner?

Money quote: "People are in favor of cars that sacrifice the occupant to save other lives -- as long as they don't have to drive one themselves."
(Related: "How to Help Self-Driving Cars Make Ethical Decisions.")