You see a runaway trolley speeding down the tracks, about to hit and kill five people. You have access to a lever that could switch the trolley to a different track, where a different person would meet an untimely demise - ending one life to spare five. Would you pull the lever?
The Good Old Days
Not too long ago, tech enthusiasts were optimistic that by 2020 self-driving cars would revolutionise the way we travel and become the vehicle of choice, with some 10 million on the roads. That was a huge overestimate: the actual number of vehicles in testing was roughly a thousand times smaller, and most were being tested under very controlled conditions. Many companies also now believe it may be better to build these cars as aids to human drivers rather than give them complete autonomy.
But this slower development is not necessarily a bad thing: it gives engineers the time they need to improve vehicle safety and to prepare for other threats, such as car hacking (through which these supposedly useful tools could very well turn into very destructive weapons).
The Universal Ethics Dictionary
It also gives us a real chance to form some sort of social consensus on the ethics of autonomous vehicles, which will inevitably face decisions with moral implications. The programmers behind the cars will need to encode the logic for making such decisions when certain conditions arise, and they will need some justifiable basis for doing so, if only to keep bias from creeping in. We have a lot to learn, and we will make many mistakes along the way before we find acceptable solutions.
So far only one nation has laid out actual guidelines for how autonomous vehicles should make decisions - Germany. These rules have a strongly egalitarian basis:
“In the event of unavoidable accident situations, any distinction based on personal features (age, gender, physical or mental constitution) is strictly prohibited. It is also prohibited to offset victims against one another. General programming to reduce the number of personal injuries may be justifiable.”
The rule tries to forbid any hierarchy of worth - male vs. female, old vs. young, a skilled surgeon vs. an infamous criminal. All people, in this view, count equally. Although this may seem the most logical approach, such a notion can run up against cultural and political morals.
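To make the guideline concrete, here is a minimal sketch, in Python, of what such an egalitarian decision routine might look like. The `Outcome` type and `choose_outcome` function are hypothetical illustrations, not anyone's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One possible result of an unavoidable accident.

    Deliberately stores no personal features (age, gender, occupation),
    so the decision cannot discriminate on them even by accident.
    """
    label: str
    people_harmed: int

def choose_outcome(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome that harms the fewest people.

    This mirrors the German guideline: distinctions between victims are
    prohibited, but reducing the number of personal injuries
    'may be justifiable'.
    """
    return min(outcomes, key=lambda o: o.people_harmed)

# The trolley dilemma from the opening paragraph:
options = [Outcome("stay the course", people_harmed=5),
           Outcome("pull the lever", people_harmed=1)]
print(choose_outcome(options).label)  # -> pull the lever
```

The key design choice is that `Outcome` carries no personal attributes at all, so the routine minimises the number of people harmed while having no way to weigh one victim against another.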
Well, What Do the People Think?
Researchers built a website to collect about 40 million choices on hypothetical self-driving dilemmas from people in 233 countries and territories, spanning many different cultures. They found that while people generally prioritize human lives over animal lives and would like to save more rather than fewer lives, they also tend to prefer saving the young over the old. People from countries in Central and South America tended to prioritize the lives of females and the physically fit. In many regions, people also expressed a preference for high-status individuals, valuing an executive over a homeless person.
Studies of this kind offer a rough guide to real moral preferences and how they vary from place to place, and trying to align with them might be a good starting point for engineers. Even so, surveys can't be the only guide, as prevailing moral attitudes change with time. Historically, explicitly racist or sexist values held sway in many places, even though most people today view them as unethical.
A better way to identify reliable rules, some experts argue, would be to combine the survey-based approach with analysis based on prevailing ethical theories developed by moral philosophers. One might start with public views but then put these through the filter of ethical theory to see if a rule is, on closer scrutiny, truly defensible. Ethicists refer to views that survive this test as “laundered preferences.” For example, all ethical theories would reject preferences for one gender over another, even though the survey found such preferences in some regions. In contrast, preferences to save the largest number of people would survive, as would a preference for the very young over the very old.
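As a toy illustration of this laundering step, the sketch below filters raw survey preferences against the judgements described above: the gender preference is rejected, while saving more lives and favouring the very young survive. All names and weights are hypothetical:

```python
# A toy "preference laundering" filter: survey-derived preferences are
# kept only if they survive scrutiny under prevailing ethical theories.
# The preference names and weights below are invented for illustration.

# Per the article, all ethical theories reject gender-based preferences.
PROHIBITED = {"prefer_one_gender"}

# Raw survey preferences (positive weight = respondents favoured this):
raw_preferences = {
    "save_more_lives": 0.9,    # spare the larger group
    "prefer_very_young": 0.5,  # the very young over the very old
    "prefer_one_gender": 0.2,  # found in some regions' surveys
}

def launder(prefs: dict[str, float]) -> dict[str, float]:
    """Drop any preference that fails the ethical-theory filter."""
    return {name: w for name, w in prefs.items() if name not in PROHIBITED}

print(launder(raw_preferences))
# -> {'save_more_lives': 0.9, 'prefer_very_young': 0.5}
```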
The infographic below breaks down the specific preferences of different communities:
Now it’s time to ask yourself:
Would You Pull The Lever?
Editor's Note
MIT's Moral Machine
“A platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars. We generate moral dilemmas, where a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians. As an outside observer, people judge which outcome they think is more acceptable. They can then see how their responses compare with other people. If they are feeling creative, people can also design their own scenarios, for others to view, share, and discuss.”
Check it out! → https://www.media.mit.edu/projects/moral-machine/overview/