EEG-based mobile robot control through an adaptive brain-robot interface

Vaibhav Gandhi, Girijesh Prasad, DH Coyle, Laxmidhar Behera, Thomas Martin McGinnity

Research output: Contribution to journal › Article › peer-review


Abstract

A major challenge in two-class brain-computer interface (BCI) systems is the low bandwidth of the communication channel, especially when communicating with and controlling assistive devices, such as a smart wheelchair or a telepresence mobile robot, which require multiple motion command options in the form of forward, left, right, backward, and start/stop. To address this, an adaptive user-centric graphical user interface, referred to as the intelligent adaptive user interface (iAUI) and based on an adaptive shared control mechanism, is proposed. The iAUI offers multiple degrees-of-freedom control of a robotic device by presenting the BCI user with a continuously updated, prioritized list of all the options available for selection, thereby improving the information transfer rate. Results have been verified with multiple participants controlling both a simulated and a physical Pioneer robot.
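To illustrate the idea of a continuously updated, prioritized option list, the following minimal Python sketch re-ranks motion commands from simplified sensor context. It is a hypothetical illustration only, assuming sonar-style free-distance readings and a simple distance-based score; the names (`rank_commands`, `SonarContext`) and the scoring rule are assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not from the paper): re-ranking motion commands so that
# the option the user most likely wants appears first in an adaptive interface.
from dataclasses import dataclass

COMMANDS = ["forward", "left", "right", "backward", "stop"]

@dataclass
class SonarContext:
    """Simplified robot context: free distance (metres) sensed in each direction."""
    front: float
    left: float
    right: float
    rear: float

def rank_commands(ctx: SonarContext) -> list:
    """Return motion options ordered by estimated usefulness, so the user's
    most likely choice requires the fewest two-class (binary) selections."""
    scores = {
        "forward": ctx.front,
        "left": ctx.left,
        "right": ctx.right,
        "backward": ctx.rear,
        "stop": 0.5,  # always offered, but rarely the first choice
    }
    return sorted(COMMANDS, key=lambda c: scores[c], reverse=True)

if __name__ == "__main__":
    # Obstacle directly ahead: turning options float to the top of the menu.
    print(rank_commands(SonarContext(front=0.3, left=2.5, right=1.2, rear=0.8)))
```

With fewer selections needed to reach the intended command, each two-class decision conveys more useful information, which is the sense in which such a prioritized list can raise the effective information transfer rate.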
Original language: English
Pages (from-to): 1278-1285
Number of pages: 8
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Volume: 44
Issue number: 9
Early online date: 11 Apr 2014
DOIs
Publication status: Published - 30 Sept 2014

Keywords

  • Mobile robots
  • Robot sensing systems
  • Graphical user interfaces
  • Accuracy
  • Wheelchairs
