SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.


Journal Article

Citation

Zhang S, Abdel-Aty M. Transp. Res. Rec. 2022; 2676(9): 491-501.

Copyright

(Copyright © 2022, Transportation Research Board, National Research Council, National Academy of Sciences USA, Publisher SAGE Publishing)

DOI

10.1177/03611981221087234

PMID

unavailable

Abstract

Drivers' distraction has been widely studied in the field of naturalistic driving research. However, it is difficult to use traditional variables, such as speed, acceleration, and yaw rate, to detect drivers' distraction in real time. Emerging technologies have extracted features from human faces, such as eye gaze, to detect drivers' visual distraction. However, eye gaze is hard to detect in naturalistic driving situations because of low-resolution cameras, drivers wearing sunglasses, and so forth. Head pose, by contrast, is easier to detect and is correlated with eye gaze direction. In this study, city-wide videos were collected using onboard cameras from over 289 drivers, covering 423 events. Head pose rates (pitch, yaw, and roll) were derived and fed into a convolutional neural network to detect drivers' distraction. The experimental results show that the proposed model achieves a recall of 0.938 and an area under the receiver operating characteristic curve of 0.931 when variables from five time slices (1.25 s) are used as input. The study demonstrates that head pose can be used to detect drivers' distraction. It offers insights for distraction detection and can support the development of advanced driver assistance systems.


Language: en
