Abstract
In this study, we propose a “hand gesture + facial expression” human–machine interaction technique and apply it to a rehabilitation robot for bedridden patients. The technique combines gesture input with facial expression perception: seven basic facial expressions are used to confirm a target selection, while hand gestures control the cursor’s location. A controlled experiment was designed and conducted to evaluate the effectiveness of the proposed hybrid technique. A series of target selection tasks with different target widths and layouts was used to examine the recognition accuracy of the hybrid control gestures, and an interactive experiment on a rehabilitation robot was designed to verify the feasibility of applying the technique to rehabilitation robots. The experimental results show that the “hand gesture + facial expression” interaction is highly robust; it can provide a novel guideline for designing VR interfaces and can be applied to rehabilitation robots.
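To make the interaction scheme concrete, the sketch below illustrates one plausible hybrid control loop, assuming the hand drives a cursor and a recognized facial expression confirms the hovered target. This is not the authors’ implementation; the function names (`get_hand_cursor`, `classify_expression`), the confirming expression, and the loop rate are all assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): a hybrid
# "hand gesture + facial expression" selection loop in which the hand
# moves a cursor and a recognized facial expression confirms selection.
# The tracker and classifier below are simulated stand-ins.

import random
import time

# The seven basic facial expressions referred to in the abstract.
EXPRESSIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful", "disgusted"]

def get_hand_cursor():
    """Return a simulated (x, y) cursor position driven by the hand."""
    return random.uniform(0, 1920), random.uniform(0, 1080)

def classify_expression():
    """Return a simulated facial-expression label."""
    return random.choice(EXPRESSIONS)

def inside(target, x, y):
    """Check whether the cursor lies within a square target of width w."""
    tx, ty, w = target
    return abs(x - tx) <= w / 2 and abs(y - ty) <= w / 2

def select_target(targets, confirm_expression="happy", timeout_s=10.0):
    """Move the cursor with the hand; confirm the hovered target when the
    confirming expression is detected. Returns the selected target or None."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        x, y = get_hand_cursor()
        hovered = next((t for t in targets if inside(t, x, y)), None)
        if hovered is not None and classify_expression() == confirm_expression:
            return hovered
        time.sleep(0.03)  # ~30 Hz interaction loop
    return None

if __name__ == "__main__":
    # Targets are (center_x, center_y, width), mirroring the varied target
    # widths and layouts used in the selection experiment.
    layout = [(400, 300, 120), (960, 540, 80), (1500, 800, 160)]
    print("Selected:", select_target(layout))
```

In a real system the simulated tracker and classifier would be replaced by a hand-tracking sensor and a facial expression recognition model; the confirm-on-expression step is what distinguishes this hybrid scheme from dwell- or click-based selection.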
Original language | English |
---|---|
Article number | 237 |
Journal | Sensors |
Volume | 23 |
Issue number | 1 |
Early online date | 26 Dec 2022 |
DOIs | |
Publication status | Published - 31 Jan 2023 |
Bibliographical note
Funding Information: This work was supported in part by the National Science Foundation of China awards (U1913601, 52175033 and U21A20120), the National Key R&D Program of China (2020YFC2007800), the Zhejiang Provincial Natural Science Foundation under award LZ20E050002, and the Key Research and Development Program of Zhejiang under awards 2022C03103 and 2021C03051.
Keywords
- facial expression
- hybrid control gestures
- interactive tasks
- rehabilitation robot
- target selection
ASJC Scopus subject areas
- Analytical Chemistry
- Information Systems
- Biochemistry
- Atomic and Molecular Physics, and Optics
- Instrumentation
- Electrical and Electronic Engineering