SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Du G, Zhang L, Su K, Wang X, Teng S, Liu PX. IEEE Trans. Intel. Transp. Syst. 2022; 23(11): 21810-21820.

Copyright

(Copyright © 2022, IEEE (Institute of Electrical and Electronics Engineers))

DOI

10.1109/TITS.2022.3176973

PMID

unavailable

Abstract

Existing vision-based fatigue detection methods usually monitor drivers' fatigue by capturing their facial features, including eyelid movements, yawn frequency, and head pose. However, these approaches typically do not take drivers' biological signals into consideration. An accurate model for fatigue detection requires combining both facial behavior and biological data. This paper proposes a novel non-intrusive method for multimodal-fusion driver fatigue detection that extracts eyelid features and heart rate signals from RGB video. The multimodal feature fusion method can significantly increase the accuracy of fatigue detection. Specifically, we established two fatigue detection models, based on heart rate and on the PERCLOS value respectively, using a one-dimensional Convolutional Neural Network (1D CNN), where PERCLOS refers to the percentage of eyelid closure over the pupil. Finally, the outputs of the two models are weighted to achieve multimodal-fusion fatigue detection. Simulation results show that our method yields better performance than traditional methods.
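The decision-level fusion the abstract describes can be sketched as follows: each detector outputs a fatigue probability, and a weighted sum of the two gives the fused score. This is an illustrative sketch only; the weight, threshold, and function names are assumptions, not values or code from the paper.

```python
# Sketch of weighted multimodal fusion of two fatigue-detector outputs.
# The weight w and the decision threshold are illustrative assumptions,
# not values reported in the paper.

def perclos(eye_closed_flags):
    """PERCLOS: fraction of frames in a window in which the eyelid
    covers the pupil (eye effectively closed)."""
    if not eye_closed_flags:
        return 0.0
    return sum(eye_closed_flags) / len(eye_closed_flags)

def fuse_fatigue(p_perclos, p_heart_rate, w=0.6, threshold=0.5):
    """Weighted fusion of the two model outputs; w is the weight
    assigned to the PERCLOS branch (assumed value)."""
    score = w * p_perclos + (1.0 - w) * p_heart_rate
    return score, score >= threshold

# Example: eyes closed in 42 of 100 frames; the heart-rate model
# (here a stand-in probability) reports 0.7.
p_eye = perclos([1] * 42 + [0] * 58)      # 0.42
score, fatigued = fuse_fatigue(p_eye, 0.7)
```

In the paper, the two per-modality probabilities would come from the 1D CNN models; here they are plain numbers to keep the fusion step self-contained.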


Language: en

Keywords

Brain modeling; Eyelids; Fatigue; fatigue driving detection; Feature extraction; Heart rate; Iris; Multimodal feature fusion; PERCLOS; Vehicles
