TY - JOUR
T1 - Visual and haptic feedback in detecting motor imagery within a wearable brain-computer interface
AU - Arpaia, Pasquale
AU - Coyle, Damien
AU - Donnarumma, Francesco
AU - Esposito, Antonio
AU - Natalizio, Angela
AU - Parvis, Marco
N1 - Funding Information: This work was carried out as part of the “ICT for Health” project, which was financially supported by the Italian Ministry of Education, University and Research (MIUR), under the initiative ‘Departments of Excellence’ (Italian Budget Law no. 232/2016), through an excellence grant awarded to the Department of Information Technology and Electrical Engineering of the University of Naples Federico II, Naples, Italy. DC is supported by a UKRI Turing AI Fellowship 2021–2025 funded by the EPSRC (grant number EP/V025724/1). FD is supported by the project “Free energy principle and the brain: Neuronal and phylogenetic mechanisms of Bayesian inference” funded by MIUR PRIN2020 (Grant No. 2020529PCP).
N1 - Data Availability: Data will be made available on request.
PY - 2023/1/31
Y1 - 2023/1/31
AB - This paper presents a wearable brain-computer interface relying on neurofeedback in extended reality to enhance motor imagery training. Visual and vibrotactile feedback modalities were evaluated when presented either individually or simultaneously. Only three acquisition channels and state-of-the-art vibrotactile chest-based feedback were employed. Experimental validation was carried out with eight subjects, each participating in two or three sessions on different days, with 360 trials per subject per session. Neurofeedback led to a statistically significant improvement in performance across sessions, demonstrating for the first time the functionality of a motor imagery-based instrument even when using a highly wearable electroencephalograph and a commercial gaming vibrotactile suit. In the best cases, classification accuracy exceeded 80%, with more than a 20% improvement with respect to initial performance. No feedback modality was generally preferable across the cohort; it is concluded that the best feedback modality may be subject-dependent.
KW - Brain–computer interface
KW - Electroencephalography
KW - Extended reality
KW - Haptic
KW - Motor imagery
KW - Neurofeedback
UR - http://www.scopus.com/inward/record.url?scp=85145776362&partnerID=8YFLogxK
UR - https://pure.ulster.ac.uk/en/publications/visual-and-haptic-feedback-in-detecting-motor-imagery-within-a-we
U2 - 10.1016/j.measurement.2022.112304
DO - 10.1016/j.measurement.2022.112304
M3 - Article
SN - 0263-2241
VL - 206
SP - 1
EP - 9
JO - Measurement
JF - Measurement
M1 - 112304
ER -