Awad and his MIT colleagues gave millions of people the self-driving-car trolley problem. In a recent visit to KSJ, he talked about what he learned.
A self-driving car approaches a crosswalk and suddenly its brakes fail. It can avoid a person who’s crossing legally by swerving into the opposite lane, but then it will hit two people who are jaywalking. Anyone hit by the car will certainly die. What should it do?
This was just one of the moral dilemmas that Edmond Awad posed to KSJ fellows and guests at a seminar last week. Awad is a co-creator of the Moral Machine project, a website that poses the same questions to participants around the globe. It was developed by the Scalable Cooperation group, led by Iyad Rahwan at MIT’s Media Lab, where Awad has been a postdoc since 2015.
Launched in 2016, the Moral Machine presents website visitors with various versions of the so-called trolley problem. In some cases, the choice is between saving the car’s passengers and saving pedestrians. In others, participants must weigh personal characteristics: Should they save the male or the female? The old or the young? The person or the pet?
The Moral Machine has been translated into 10 languages. More than 4 million users from 233 countries and territories have made 40 million driving decisions on the site. In his seminar, Awad told the audience about the group’s findings, which were published in Nature last month.
The findings ranged from the unsurprising to the unsettling. In general, respondents preferred to spare as many lives as possible, to spare humans over pets, and to spare the lawful over the unlawful. But they also preferred to spare the young over the elderly, and to spare people of “higher status,” such as an executive, over people of “lower status,” such as a homeless person. Participants were more likely to spare dogs than criminals, and cats ranked lower still.
Based on a survey of the users, Awad and his colleagues were able to see how people of different ages and nationalities make these decisions. They found differences, for instance, between largely individualist societies and largely collectivist ones. “All countries preferred to spare younger [people] over older, but this difference was less pronounced in Eastern countries,” he said. He added that “the stronger the rule of law is, the more likely countries are to spare the lawful.”
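For the curious, here is one way a cross-country comparison like that could be computed. The sketch below, in Python, estimates a per-country preference for sparing the young as the gap between how often young characters and old characters are spared. It is an illustration only: the data and column names are invented, and the published study relies on a more formal conjoint-analysis technique rather than this simple tally.

```python
# Illustrative sketch only (not the study's actual pipeline): estimate a
# per-country "spare the young" preference as the difference between the
# share of young characters spared and the share of old characters spared.
# The data and column names below are hypothetical.
import pandas as pd

# Each row is one character shown in a dilemma: the respondent's country,
# the character's age group, and whether the respondent chose to spare them.
responses = pd.DataFrame({
    "country":   ["US", "US", "US", "US", "JP", "JP", "JP", "JP"],
    "age_group": ["young", "old"] * 4,
    "spared":    [1, 0, 1, 1, 1, 0, 0, 1],
})

# P(spared | young) - P(spared | old), computed separately for each country.
rates = (responses
         .groupby(["country", "age_group"])["spared"]
         .mean()
         .unstack("age_group"))
preference_for_young = rates["young"] - rates["old"]
print(preference_for_young)
```

A positive score means young characters were spared more often than old ones in that country; comparing scores across countries gives a rough, survey-style version of the regional contrasts Awad described.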
So why do all this? Awad said public opinions like those reflected in the Moral Machine project could influence how self-driving-car regulations are received in different societies. Algorithms that favor lawful pedestrians over jaywalkers might be more readily accepted in some countries than others, for instance.
But Awad doesn’t see the project as a roadmap for encoding morality into our cars. “We don’t believe [the users’ decisions] should be implemented,” he said. Instead, his team seeks to make people more aware of the moral dilemmas involved in programming self-driving cars. “We believe public engagement is a vital part of this issue,” he said.