SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Ahlstrom C, Kircher K. IEEE Trans. Intell. Transp. Syst. 2017; 18(11): 2929-2938.

Copyright

(Copyright © 2017, IEEE (Institute of Electrical and Electronics Engineers))

DOI

10.1109/TITS.2017.2658945

PMID

unavailable

Abstract

Indicators based on visual time-sharing have been used to investigate drivers' visual behavior during additional task execution. However, visual time-sharing analyses have been restricted to additional tasks with well-defined temporal start and end points and a dedicated visual target area. We introduce a method to automatically extract visual time-sharing sequences directly from eye tracking data. This facilitates investigations of systems that provide continuous information without well-defined start and end points. Furthermore, it becomes possible to investigate time-sharing behavior with other types of glance targets, such as the mirrors. Time-sharing sequences are extracted based on between-glance durations: if glances to a particular target are separated by less than a time-based threshold, we assume that they belong to the same information intake event. Our results indicate that a 4-s threshold is appropriate. Examples derived from 12 drivers (about 100 hours of eye tracking data), collected in an on-road investigation of an in-vehicle information system, illustrate sequence-based analyses. These include the possibility to evaluate human-machine interface designs based on the number of glances in the extracted sequences, and to increase the legibility of transition matrices by deriving them from time-sharing sequences instead of single glances. More object-oriented glance behavior analyses, based on additional sensor and information fusion, are identified as the next step. This would enable automated extraction of time-sharing sequences not only for targets fixed in the vehicle's coordinate system, but also for environmental and traffic targets that move independently of the driver's vehicle.
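The grouping rule described in the abstract (glances to the same target merge into one time-sharing sequence when the gap between them is below a threshold, with 4 s found appropriate) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the glance tuple layout and the function name `extract_sequences` are assumptions for the example.

```python
def extract_sequences(glances, threshold=4.0):
    """Group glances into visual time-sharing sequences per target.

    glances: list of (target, start, end) tuples, times in seconds,
             assumed sorted by start time.
    Returns a dict mapping each target to a list of sequences,
    where each sequence is a list of glance tuples.
    """
    sequences = {}  # target -> list of sequences
    last_end = {}   # target -> end time of its most recent glance
    for target, start, end in glances:
        gap = start - last_end.get(target, float("-inf"))
        if target in last_end and gap < threshold:
            # Gap below threshold: same information intake event,
            # so append to the target's current sequence.
            sequences[target][-1].append((target, start, end))
        else:
            # Otherwise start a new time-sharing sequence for this target.
            sequences.setdefault(target, []).append([(target, start, end)])
        last_end[target] = end
    return sequences


# Example: two mirror glances 1.5 s apart merge into one sequence;
# a third glance 7.6 s later starts a new sequence.
glances = [("mirror", 0.0, 0.5), ("mirror", 2.0, 2.4), ("mirror", 10.0, 10.3)]
seqs = extract_sequences(glances)
```

Sequence-level metrics mentioned in the abstract, such as the number of glances per sequence, then fall out directly (here, `len(seq)` for each sequence).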


Language: en

Keywords

Data mining; data visualisation; Driver behaviour; driver information systems; driver visual behaviour; eye tracking data; feature extraction; Gaze tracking; glance analysis; glance behavior analyses; human computer interaction; human-machine interface designs; in-vehicle information system; Mirrors; naturalistic driving data; road traffic; Roads; time-based threshold value; traffic engineering computing; transition matrices; Uncertainty; user interfaces; Vehicles; visual time-sharing; visual time-sharing sequence extraction; Visualization
