SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Nees MA, Walker BN. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2008; 52(22): 1820-1824.

Copyright

(Copyright © 2008, Human Factors and Ergonomics Society, Publisher SAGE Publishing)

DOI

10.1177/154193120805202208

PMID

unavailable

Abstract

Interest in the use of sound as a means of information display in human-machine systems has surged in recent years. While researchers have begun to address issues surrounding good auditory display design as well as potential domains of application, little is known about the cognitive processes involved in interpreting auditory displays. In multi-tasking scenarios, dividing concurrent information display across modalities (e.g., vision and audition) may allow the human operator to receive (i.e., to sense and perceive) more information, yet higher-level conflicts in the encoding and representation of information may persist. Surprisingly few studies to date have examined auditory information display in dual-task scenarios. This study examined the flexibility of encoding of information and processing code conflicts in a dual-task paradigm with auditory graphs, a specific class of auditory displays that represent quantitative information with sound. Results showed that 1) patterns of dual-task interference were task-dependent, and 2) a verbal interference task was relatively more disruptive to auditory graph performance than a visuospatial interference task, particularly for point estimation.


Language: en
