VGPN: Voice-Guided Pointing Robot Navigation for Humans

Jun Hu, Zhongyu Jiang, Xionghao Ding, Taijiang Mu, Peter Hall

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Pointing gestures are widely used in robot navigation approaches. However, most approaches rely on pointing gestures alone, which has two major limitations. First, they must recognize pointing gestures continuously, which leads to long processing times and significant system overhead. Second, the user's pointing direction may be inaccurate, so the robot may navigate to an undesired place. To address these limitations, we propose a voice-guided pointing robot navigation approach named VGPN, and implement a prototype on a wheeled robot, TurtleBot 2. VGPN recognizes a pointing gesture only if the voice information is insufficient for navigation, and also uses voice information as a supplementary channel to help determine the target position of the user's pointing gesture. In our evaluation, we compare VGPN to a pointing-only navigation approach. The results show that VGPN effectively reduces processing time when a pointing gesture is unnecessary, and improves user satisfaction with navigation accuracy.
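The voice-first decision flow described in the abstract can be sketched as follows. This is an illustrative sketch only: the function names, the place map, and the return values are assumptions for clarity, not the paper's actual API.

```python
# Sketch of VGPN's voice-first decision flow: run pointing-gesture
# recognition only when the voice command does not already name a
# destination. All names here are hypothetical illustrations.

KNOWN_PLACES = {"kitchen": (4.0, 1.0), "door": (0.0, 3.0)}  # assumed map


def parse_voice(command):
    """Return a known destination named in the command, else None."""
    for name, position in KNOWN_PLACES.items():
        if name in command.lower():
            return position
    return None


def navigate(command, pointing_target=None):
    """Prefer the voice-derived target; fall back to the pointing estimate."""
    target = parse_voice(command)
    if target is not None:
        # Voice alone specifies the destination, so gesture recognition
        # is skipped entirely, saving its processing cost.
        return target, "voice-only"
    # Voice is insufficient (e.g. "go over there"): use the target
    # estimated from the recognized pointing gesture.
    return pointing_target, "pointing+voice"
```

In this sketch, a command like "go to the kitchen" resolves without any gesture processing, while "go over there" falls through to the pointing estimate, mirroring the two cases the abstract contrasts.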

Original language: English
Title of host publication: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
Publisher: IEEE
Pages: 1107-1112
Number of pages: 6
ISBN (Electronic): 9781728103761
DOI: 10.1109/ROBIO.2018.8664854
Publication status: Published - 14 Mar 2019
Event: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018 - Kuala Lumpur, Malaysia
Duration: 12 Dec 2018 - 15 Dec 2018

Conference

Conference: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
Country: Malaysia
City: Kuala Lumpur
Period: 12/12/18 - 15/12/18

Keywords

  • navigation
  • pointing gesture
  • robot
  • voice

ASJC Scopus subject areas

  • Biotechnology
  • Artificial Intelligence
  • Human-Computer Interaction

Cite this

Hu, J., Jiang, Z., Ding, X., Mu, T., & Hall, P. (2019). VGPN: Voice-Guided Pointing Robot Navigation for Humans. In 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018 (pp. 1107-1112). [8664854] IEEE. https://doi.org/10.1109/ROBIO.2018.8664854

