TY - JOUR
PY - 2022//
TI - Multi-modal user experience evaluation on in-vehicle HMI systems using eye-tracking, facial expression, and finger-tracking for the smart cockpit
JO - International Journal of Vehicle Performance
A1 - Li, Wenbo
A1 - Wu, Yingzhang
A1 - Zeng, Guanzhong
A1 - Ren, Fan
A1 - Tang, Mingqing
A1 - Xiao, Huafei
A1 - Liu, Yujing
A1 - Guo, Gang
SP - 429
EP - 249
VL - 8
IS - 4
N2 - The trend toward intelligent connected vehicles (ICVs) has led to many novel and more natural human-vehicle relationships, which will bring about tremendous changes in smart cockpit functions and interaction methods. However, most in-vehicle human-machine interaction (HMI) systems focus on adding more functions, while few focus on the user experience (UX) of the system. This study presents a UX evaluation method based on eye-tracking, finger movement tracking, and facial expression, and also proposes a pleasantness prediction approach based on the multi-layer perceptron (MLP) algorithm using multi-modal data. Through a UX experiment on two in-vehicle HMI systems, the study verified that the proposed method can evaluate an in-vehicle HMI system objectively and efficiently. Based on the MLP algorithm, the pleasantness prediction model was trained on the multi-modal data. In addition, new data from a third in-vehicle HMI system were collected to test the trained model, which produced excellent test results.
KW - HMI
KW - human-machine interaction
KW - user experience
KW - driver emotion
KW - behaviour analysis
KW - smart cockpit
LA - en
SN - 1745-3194
UR - http://dx.doi.org/10.1504/IJVP.2022.125931
ID - ref1
ER -