Driverless moral dilemma

Imagine you're behind the wheel when your brakes fail. As you speed toward a crowded crosswalk, you're confronted with an impossible choice: veer right and mow down a large group of elderly people or veer left into a woman pushing a stroller.

Now imagine you're riding in the back of a self-driving car. How would it decide?

Researchers at the Massachusetts Institute of Technology are asking people worldwide how they think a robot car should handle such life-or-death decisions. Their findings so far show people prefer a self-driving car to act in the greater good, sacrificing its passenger if it can save a crowd of pedestrians. They just don't want to get into that car.

The findings present a dilemma for car makers and governments eager to introduce self-driving vehicles on the promise that they'll be safer than human-controlled cars.

"There is a real risk that if we don't understand those psychological barriers and address them through regulation and public outreach, we may undermine the entire enterprise," said Iyad Rahwan, an associate professor at the MIT Media Lab. "People will say they're not comfortable with this. It would stifle what I think will be a very good thing for humanity."

After publishing research last year surveying U.S. residents, Rahwan and colleagues at the University of Toulouse in France and the University of California, Irvine, are expanding their surveys and looking at how responses vary in different countries.

They also are using a website created by MIT researchers called the Moral Machine, which allows people to play the role of judging who lives or dies. A jaywalking person or several dogs riding in the driverless car? A pregnant woman or a homeless man?

Preliminary, unpublished research based on millions of responses from more than 160 countries shows broad differences between East and West. More prominent in the United States and Europe are judgments that reflect minimizing the total harm, Rahwan said.

But to those focused on how the vehicles will act in ordinary situations, the research scenarios seem too unrealistic to matter.

Just 5 miles from the lab in Cambridge, the first self-driving car to roll out on Massachusetts public roads began testing this month in Boston's Seaport District.

"We approach the problem from a bit more of a practical, engineering perspective," said NuTonomy CEO Karl Iagnemma, whose Cambridge-based company has also piloted self-driving taxis in Singapore.

Iagnemma said the study's moral dilemmas are "vanishingly rare." Designing a safe vehicle, not a "sophisticated ethical creature," is the focus of his engineering team as they tweak the software that guides their electric Renault Zoe past Boston snowbanks.

"When a driverless car looks out on the world, it's not able to distinguish the age of a pedestrian or the number of occupants in a car," Iagnemma said. "Even if we wanted to imbue an autonomous vehicle with an ethical engine, we don't have the technical capability today to do so."
