SAFETYLIT WEEKLY UPDATE

Journal Article

Citation

Verberne FM, Ham J, Midden CJ. Hum. Factors 2015; 57(5): 895-909.

Affiliation

Eindhoven University of Technology, Eindhoven, Netherlands.

Copyright

(Copyright © 2015, Human Factors and Ergonomics Society, Publisher SAGE Publishing)

DOI

10.1177/0018720815580749

PMID

25921302

Abstract

OBJECTIVE: We examined whether participants would trust a virtual agent that was similar to them more than one that was dissimilar to them.

BACKGROUND: Trust is an important psychological factor determining the acceptance of smart systems. Because smart systems tend to be treated like humans, and similarity has been shown to increase trust in humans, we expected that similarity would increase trust in a virtual agent.

METHODS: In a driving simulator experiment, participants (N = 111) were presented with a virtual agent that was either similar or dissimilar to them. This agent functioned as their virtual driver, and trust in the agent was measured. Furthermore, we measured how trust changed with experience.

RESULTS: Prior to experiencing the agent, the similar agent was trusted more than the dissimilar agent. This effect was mediated by perceived similarity. After experiencing the agent, the similar agent was still trusted more than the dissimilar agent.

CONCLUSION: Just as similarity between humans increases interpersonal trust, similarity also increases trust in a virtual agent. When such an agent is presented as the virtual driver of a self-driving car, it could enhance people's trust in the car.

APPLICATION: Displaying a virtual driver that is similar to the human driver might increase trust in a self-driving car.


Language: en
