SAFETYLIT WEEKLY UPDATE

Journal Article

Citation

Gao J, Murphey YL, Zhu H. SAE Int. J. Transp. Safety 2018; 6(2): 147-162.

Copyright

(Copyright © 2018, SAE International)

DOI

10.4271/09-06-02-0010

PMID

unavailable

Abstract

Sideswipe accidents occur primarily when a driver attempts an improper lane change or drifts out of the lane, or when the vehicle loses lateral traction. This article introduces a fusion approach that uses data from two sensors of differing modalities, a front-view camera and an onboard diagnostics (OBD) sensor, to detect a driver's lane-change behavior. For lane change detection, both feature-level fusion and decision-level fusion are examined using a collaborative representation classifier (CRC). Computationally efficient detection features are extracted from the distances to the detected lane boundaries and from vehicle dynamics signals. In feature-level fusion, the features generated from the two sensors are merged before classification; in decision-level fusion, Dempster-Shafer (D-S) theory is used to combine the classification outcomes of two classifiers, each corresponding to one sensor. The results indicate that feature-level fusion outperforms decision-level fusion and that the proposed fusion approach with a CRC achieves significantly better detection accuracy than other state-of-the-art classifiers.


Language: English
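
The configuration the abstract reports as strongest is feature-level fusion of the camera and OBD features, classified with a collaborative representation classifier. Below is a minimal sketch of that idea in Python, assuming the standard regularized least-squares (CRC-RLS) form of the classifier and illustrative feature choices (lane-boundary distances from the camera; speed, yaw rate, and steering angle from the OBD port). The function names, feature dimensions, and regularization value are assumptions made for illustration, not the authors' implementation.

import numpy as np

def fuse_features(camera_feats, obd_feats):
    # Feature-level fusion: concatenate per-sample camera features
    # (e.g., distances to detected lane boundaries) with OBD vehicle
    # dynamics features before classification.
    return np.hstack([camera_feats, obd_feats])

def crc_fit(X_train, y_train, lam=1e-3):
    # Precompute the regularized projection used by CRC-RLS:
    # alpha = (D^T D + lam*I)^-1 D^T x, where D stacks the training
    # samples as columns (the "dictionary").
    D = X_train.T                                   # features x samples
    P = np.linalg.inv(D.T @ D + lam * np.eye(D.shape[1])) @ D.T
    return D, P, np.asarray(y_train)

def crc_predict(D, P, labels, x):
    # Classify one fused feature vector by the class whose training
    # samples give the smallest scaled reconstruction residual.
    alpha = P @ x                                   # collaborative coding coefficients
    best_cls, best_res = None, np.inf
    for cls in np.unique(labels):
        idx = labels == cls
        resid = np.linalg.norm(x - D[:, idx] @ alpha[idx])
        resid /= (np.linalg.norm(alpha[idx]) + 1e-12)   # CRC-RLS style scaling
        if resid < best_res:
            best_cls, best_res = cls, resid
    return best_cls

# Toy usage with random stand-in data (two classes: lane keeping vs. lane change).
rng = np.random.default_rng(0)
cam = rng.normal(size=(40, 4))      # e.g., left/right boundary distances over a short window
obd = rng.normal(size=(40, 3))      # e.g., speed, yaw rate, steering angle
X = fuse_features(cam, obd)
y = np.repeat([0, 1], 20)
D, P, labels = crc_fit(X, y)
print(crc_predict(D, P, labels, X[0]))

Decision-level fusion would instead train one classifier per sensor and merge their outputs with Dempster's rule of combination; the abstract reports this as less accurate than the fused-feature CRC.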
