SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Wickens CD, Clegg BA, Vieane AZ, Sebok AL. Hum. Factors 2015; 57(5): 728-739.

Affiliation

Alion Science and Technology, Boulder, Colorado.

Copyright

(Copyright © 2015, Human Factors and Ergonomics Society, Publisher SAGE Publishing)

DOI

10.1177/0018720815581940

PMID

25886768

Abstract

OBJECTIVE: We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency.

BACKGROUND: Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy.

METHOD: Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support.

RESULTS: The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was nevertheless manifested for automation gone, as shown by longer latency and a more modest reduction in accuracy.

CONCLUSIONS: Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency.

IMPLICATIONS: Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias.


Language: en
