29 Oct 18

When autonomous cars must decide between life and death

If a self-driving car must choose, which living being should it sacrifice in an unavoidable accident? That was the question behind an online study conducted by MIT (the renowned technical university in Cambridge, Massachusetts) through its Moral Machine platform. It gathered 40 million decisions in ten languages from millions of people in 233 countries and territories.

Respondents were presented with impossible ethical dilemmas in a number of situations involving a self-driving car with faulty brakes. Regardless of the choice made, there would always be fatalities, either in the self-driving car or on the zebra crossing. Participants had to decide whom to “kill”, taking into account age, social profile, health, sex, and whether the pedestrians were crossing the road legally or against the “Do not walk” signal.

In some cases the car had to change its trajectory abruptly; in others it simply went straight on. A few situations forced respondents to choose between dogs and human beings.
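The structure of such a dilemma can be sketched as data: each scenario pits one group of victims against another, and every respondent's forced choice is tallied. The following is a purely illustrative model; the class names, fields, and tallying logic are assumptions for the sake of the sketch, not the Moral Machine's actual schema or method.

```python
from dataclasses import dataclass
from collections import Counter


@dataclass(frozen=True)
class Character:
    """One potential victim in a scenario (hypothetical representation)."""
    kind: str                       # e.g. "child", "adult", "doctor", "dog"
    crossing_legally: bool = True   # only meaningful for pedestrians


@dataclass(frozen=True)
class Dilemma:
    """Two mutually exclusive outcomes: the car stays on course or swerves."""
    stay_course: tuple  # characters killed if the car goes straight on
    swerve: tuple       # characters killed if the car changes trajectory


def tally(choices):
    """Aggregate forced choices ('stay' or 'swerve') for one dilemma."""
    return Counter(choices)


# One toy scenario: pedestrians on the crossing vs. occupants of the swerve path.
dilemma = Dilemma(
    stay_course=(Character("child"), Character("adult")),
    swerve=(Character("dog"), Character("doctor", crossing_legally=False)),
)

# Four hypothetical respondents answering the same dilemma.
votes = tally(["swerve", "swerve", "stay", "swerve"])
```

Aggregated over millions of such votes, per country, is essentially what let the researchers compare moral preferences across cultures.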

Cultural and economic influence

Whether children should be saved at all costs, or even a doctor because of his “social value”, largely depends on demographics. Some group-oriented cultures tend to save older people rather than children, for instance. Others abide by the rules and sacrifice pedestrians who were crossing the street when they shouldn’t.

That makes it very difficult for OEMs and policymakers to distil universal rules. Still, MIT believes it is possible to draw up global moral preferences.
The Moral Machine is still online and allows you to take the judging test yourself.



Authored by: Dieter Quartier