
Journal Article

Citation

Schulte J, Kocherovsky M, Paul N, Pleune M, Chung CJ. Vehicles (Basel) 2022; 4(1): 243-258.

Copyright

(Copyright © 2022, MDPI: Multidisciplinary Digital Publishing Institute)

DOI

10.3390/vehicles4010016

PMID

unavailable

Abstract

Leader-follower autonomy (LFA) systems have so far focused only on vehicles following other vehicles. Although this topic has been researched for several decades, we found no prior work on human-vehicle leader-follower systems in the literature. We present a system in which an autonomous vehicle, our ACTor 1 platform, can follow a human leader who controls the vehicle through hand-and-body gestures. We developed a modular pipeline that uses artificial intelligence/deep learning to recognize hand-and-body gestures from a user in view of the vehicle's camera and translate those gestures into physical action by the vehicle. We demonstrate our work using our ACTor 1 platform, a modified Polaris Gem 2.

Results show that our modular pipeline design reliably recognizes human body language and translates it into LFA commands in real time. This work has numerous applications, such as material transport in industrial settings.
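The abstract does not specify how recognized gestures are mapped to vehicle commands. As a purely illustrative sketch, and not the authors' pipeline, 2D pose keypoints from a PoseNet-style detector might be translated into leader-follower commands along the following lines; all keypoint names, thresholds, and command labels here are assumptions.

# Hypothetical sketch: map 2D pose keypoints (e.g., from a PoseNet-style
# detector) to simple leader-follower commands. Keypoint names, thresholds,
# and the command set are illustrative assumptions, not the paper's design.

def gesture_to_command(keypoints: dict) -> str:
    """keypoints: {"left_wrist": (x, y), "left_shoulder": (x, y), ...}
    in image coordinates, with y increasing downward."""
    lw = keypoints.get("left_wrist")
    ls = keypoints.get("left_shoulder")
    rw = keypoints.get("right_wrist")
    rs = keypoints.get("right_shoulder")
    if None in (lw, ls, rw, rs):
        return "STOP"              # fail safe when the leader is not fully visible
    left_raised = lw[1] < ls[1]    # wrist above shoulder
    right_raised = rw[1] < rs[1]
    if left_raised and right_raised:
        return "STOP"
    if right_raised:
        return "FOLLOW"            # begin/continue following the leader
    if left_raised:
        return "TURN"              # request a turn maneuver
    return "HOLD"                  # no recognized gesture; hold current state

# Example usage with dummy keypoints (right wrist above shoulder -> "FOLLOW")
print(gesture_to_command({
    "left_wrist": (120, 300), "left_shoulder": (130, 220),
    "right_wrist": (420, 180), "right_shoulder": (400, 230),
}))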


Language: en

Keywords

autonomous vehicles; deep learning; gesture recognition; leader-follower; machine learning; neural networks; pose estimation; posenet; self-driving car; YOLO
