Journal Article

Citation

Frank DA, Chrysochou P, Mitkidis P, Ariely D. Human decision-making biases in the moral dilemmas of autonomous vehicles. Sci. Rep. 2019; 9(1): e13080.

Affiliation

Center for Advanced Hindsight, Duke University, Durham, United States.

Copyright

(Copyright © 2019, Nature Publishing Group)

DOI

10.1038/s41598-019-49411-7

PMID

31511560

Abstract

The development of artificial intelligence has led researchers to study the ethical principles that should guide machine behavior. The challenge in building machine morality based on people's moral decisions, however, is accounting for the biases in human moral decision-making. Across seven studies, this paper investigates how people's personal perspectives and decision-making modes affect their decisions in the moral dilemmas faced by autonomous vehicles, and it determines how much of the variation in people's moral decisions can be attributed to the situational factors of the dilemmas. The reported studies demonstrate that people's moral decisions, regardless of the presented dilemma, are biased by their decision-making mode and personal perspective. When deciding intuitively, participants shift more towards a deontological doctrine, sacrificing the passenger instead of the pedestrian. In addition, once a personal perspective is made salient, participants preserve the lives associated with that perspective: those taking the passenger's perspective shift towards sacrificing the pedestrian, and vice versa. These biases in people's moral decisions underline the social challenge in designing a universal moral code for autonomous vehicles. We discuss the implications of our findings and provide directions for future research.


Language: en
