SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Schnebelen D, Charron C, Mars F. J. Eye Mov. Res. 2021; 12(3): e10.

Copyright

(Copyright © 2021, Bern Open Publishing)

DOI

10.16910/jemr.12.3.10

PMID

34122744

Abstract

When manually steering a car, the driver's visual perception of the driving scene and his or her motor actions to control the vehicle are closely linked. Since motor behaviour is no longer required in an automated vehicle, the sampling of the visual scene is affected. Autonomous driving typically results in less gaze being directed towards the road centre and a broader exploration of the driving scene, compared to manual driving. To examine the corollary of this situation, this study estimated the state of automation (manual or automated) on the basis of gaze behaviour. To do so, models based on partial least squares regression were computed by considering gaze behaviour in multiple ways, using static indicators (percentage of time spent gazing at 13 areas of interest), dynamic indicators (transition matrices between areas), or both together. Analysis of the quality of predictions for the different models showed that the best result was obtained by considering both static and dynamic indicators. However, gaze dynamics played the most important role in distinguishing between manual and automated driving. This study may be relevant to the issue of driver monitoring in autonomous vehicles.


Language: en

Keywords

automated driving; eye movement; gaze behaviour; gaze dynamics; manual driving; region of interest
