MIT asks humans who driverless cars should hit
How will self-driving cars handle moral dilemmas? The MIT Media Lab takes an unusual approach to find out.
Self-driving cars may be a proven success on the testing ground – yet one crucial question about their readiness for the real world remains unanswered:
How will autonomous cars handle moral dilemmas? What do they do if they have to choose between killing a passenger and harming a pedestrian? Slam into an elderly woman or run over a child, when seemingly there’s no other choice?
Unanswerable as these questions may seem, the MIT (Massachusetts Institute of Technology) Media Lab has launched a unique project to address them: in a survey on its Moral Machine platform, users can suggest how a driverless car should act in a (literally) no-brakes, life-or-death traffic scene. Also referred to as the trolley problem, the situation that respondents face is always the same: A driverless car races toward a crosswalk and has to “choose” whether to swerve and crash into a concrete block – sacrificing its passengers – or hit whoever’s crossing the road.
And this is where it gets tricky: Not only does the test tell users how many lives in the car or on the road are at stake – it also specifies whether potential casualties are young or old, executives or homeless, an escaping robber, a law-abiding citizen, or day-dreaming jaywalkers. At the end of the 13-question binary mind game, Moral Machine provides participants with a statistical snapshot of their moral compass: Is it better to save more lives rather than fewer? Must passengers be protected at any cost? It’s in your hands.
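The survey mechanic described above – a series of binary dilemmas tallied into a preference profile – can be illustrated with a short sketch. This is a hypothetical toy model for illustration only, not MIT’s actual implementation; the dilemma attributes and option labels are invented:

```python
# Hypothetical sketch of how a Moral Machine-style survey might
# summarize a respondent's binary choices. All names and attributes
# here are illustrative assumptions, not MIT's real code or data.
from collections import Counter

# Each dilemma offers two options, "A" and "B"; each option is tagged
# with the attributes of the group that would be SPARED by picking it.
dilemmas = [
    {"A": ["passengers", "adults"], "B": ["pedestrians", "children"]},
    {"A": ["passengers", "elderly"], "B": ["pedestrians", "jaywalkers"]},
    {"A": ["passengers", "adults"], "B": ["pedestrians", "adults"]},
]

# One respondent's picks, in order (the real survey poses 13 dilemmas).
choices = ["B", "A", "B"]

def moral_profile(dilemmas, choices):
    """Count how often each attribute appears among the spared groups."""
    spared = Counter()
    for dilemma, choice in zip(dilemmas, choices):
        spared.update(dilemma[choice])
    return spared

profile = moral_profile(dilemmas, choices)
print(profile.most_common())
```

With these choices the respondent spares pedestrians in two of three dilemmas, so “pedestrians” tops the tally – the kind of aggregate preference the platform reports back at the end of the test.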
MIT’s way of gamifying complex moral quandaries may not solve the dilemma – it won’t imbue self-driving tech with a sense of human judgment. And according to Quartz, neither cars nor people will ever be able to make split-second moral value decisions in such complex situations. But even if the test does not settle the trolley problem for good, it certainly raises awareness of one of automated driving’s most sensitive challenges.
“The hardest part of the work we have to do is to develop technology that will work at that level of expected reliability,” Toyota’s Gill Pratt told Quartz earlier this year, adding: “It may be that what we need to do is to get society to understand that even though the overall statistics are getting tremendously better, that there are still occasions when the machine is going to be to blame, and we somehow need to accept that.”