Hybrid Target Selections by "Hand Gestures + Facial Expression" for a Rehabilitation Robot

Yi Han, Xiangliang Zhang, Ning Zhang, Shuguang Meng, Tao Liu, Shuoyu Wang, Min Pan, Xiufeng Zhang, Jingang Yi

Research output: Contribution to journal › Article › peer-review

Abstract

In this study, we propose a "hand gesture + facial expression" human-machine interaction technique and apply it to a bedridden rehabilitation robot. The "hand gesture + facial expression" interaction combines gesture and facial-expression perception as input modes: seven basic facial expressions are used to confirm a target-selection task, while hand gestures control the cursor's location. A controlled experiment was designed and conducted to evaluate the effectiveness of the proposed hybrid technique. A series of target-selection tasks with different target widths and layouts examined the recognition accuracy of the hybrid control gestures, and an interaction experiment on a rehabilitation robot verified the feasibility of applying the technique to rehabilitation robots. The experimental results show that the "hand gesture + facial expression" interaction is robust; it can provide a novel guideline for designing VR interfaces and can be applied to rehabilitation robots.
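The abstract describes hand gestures steering a cursor while one of seven basic facial expressions acts as the discrete command that commits (or aborts) a selection. As a purely illustrative sketch of that hybrid selection loop, the Python below uses hypothetical placeholders (GestureTracker, ExpressionClassifier, the expression-to-command mapping, and Target geometry are assumptions, not the paper's implementation):

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float
    y: float
    width: float  # target width, varied across the paper's selection tasks

    def contains(self, cx: float, cy: float) -> bool:
        half = self.width / 2
        return abs(cx - self.x) <= half and abs(cy - self.y) <= half

# Illustrative mapping of the seven basic facial expressions to selection
# commands; the actual assignment used in the study is not specified here.
EXPRESSION_COMMANDS = {
    "happiness": "confirm",
    "surprise": "confirm",
    "anger": "cancel",
    "disgust": "cancel",
    "sadness": "noop",
    "fear": "noop",
    "neutral": "noop",
}

def selection_loop(gesture_tracker, expression_classifier, targets):
    """Hand gestures move the cursor; a facial expression commits the selection."""
    while True:
        # Continuous channel: hand-gesture stream drives the cursor position.
        cx, cy = gesture_tracker.cursor_position()

        # Discrete channel: the recognized facial expression acts as a "click".
        expression = expression_classifier.current_expression()
        command = EXPRESSION_COMMANDS.get(expression, "noop")

        if command == "confirm":
            for target in targets:
                if target.contains(cx, cy):
                    return target   # target selected
        elif command == "cancel":
            return None             # abort the selection task
```

Separating the continuous pointing channel (gesture) from the discrete confirmation channel (expression) is what lets the two modalities be evaluated jointly in the target-selection experiments described above.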
Original language: English
Article number: 237
Journal: Sensors
Volume: 23
Issue number: 1
Early online date: 26 Dec 2022
DOIs
Publication status: Published - 31 Jan 2023

Keywords

  • facial expression
  • hybrid control gestures
  • interactive tasks
  • rehabilitation robot
  • target selection

ASJC Scopus subject areas

  • Analytical Chemistry
  • Information Systems
  • Biochemistry
  • Atomic and Molecular Physics, and Optics
  • Instrumentation
  • Electrical and Electronic Engineering
