Journal Article

Citation

Brungart DS, Kruger SE, Kwiatkowski T, Heil T, Cohen J. Hum. Factors 2019; 61(6): 976-991.

Affiliation

Henry M. Jackson Foundation.

Copyright

(Copyright © 2019, Human Factors and Ergonomics Society, Publisher SAGE Publishing)

DOI

10.1177/0018720819831092

PMID

30870052

Abstract

OBJECTIVE: The present study was designed to examine the impact that walking has on performance in auditory localization, visual discrimination, and aurally aided visual search tasks.

BACKGROUND: Auditory localization and visual search are critical skills that are frequently performed by moving observers, but most laboratory studies of these tasks have been conducted with stationary listeners who were either seated or standing during stimulus presentation.

METHOD: Thirty participants completed three different tasks while either standing still or walking at a comfortable, self-selected pace on a treadmill: (1) an auditory localization task, in which they identified the perceived location of a target sound; (2) a visual discrimination task, in which they identified a visual target presented at a known location directly in front of the listener; and (3) an aurally aided visual search task, in which they identified a visual target presented among multiple visual distracters, either in isolation or in conjunction with a spatially colocated auditory cue.

RESULTS: Participants who were walking performed the auditory localization and aurally aided visual search tasks significantly faster than those who were standing, with no loss in accuracy.

CONCLUSION: The improved aurally aided visual search performance found in this experiment may be related to enhanced overall activation caused by walking. It is also possible that the slight head movements required by walking provided auditory cues that enhanced localization accuracy.

APPLICATION: The results have potential applications in virtual and augmented reality displays where audio cues may be presented to listeners while they are walking.


Language: en

Keywords

balance; cognition; dual task; kinesthesis; multisensory integration; orientation; proprioception; sensory and perceptual processes; simulation; task switching; time sharing; virtual environments; virtual reality
