Detecting in-car VR Motion Sickness from Lower Face Action Units

Gang Li, Tanaya Guha, Ogechi Onuoha, Zhanyan Qiu, Alana Grant, Zejian Feng, Zirui Zhang, Katharina Pohlmann, Mark McGill, Stephen Brewster, Frank Pollick

Research output: Chapter in a published conference proceeding

Abstract

This paper presents the first in-car VR motion sickness (VRMS) detection model based on lower face action units (LF-AUs). Initially developed in a simulated in-car environment with 78 participants, the model's generalizability was later tested in real-world driving conditions. Motion sickness was induced using visual linear motion in the VR headset and physical horizontal rotation via a rotating chair. We used a convolutional neural network (MobileNetV3) to automatically extract LF-AUs from images of the users' mouth region, captured by the VR headset's built-in camera. These LF-AUs were then used to train a Support Vector Regression (SVR) model to estimate motion sickness scores. We compared the SVR model's performance using LF-AUs, pupil diameters, and physiological features (individually and in combination) from the same VR headset. Results showed that both an individual LF-AU (right dimple) and combined LF-AUs had significant Pearson correlations with self-reported motion sickness scores and achieved lower root mean squared error compared to pupil diameters. The best detection results were obtained by combining LF-AUs and pupil diameters, while physiological features alone did not yield significant results. The LF-AUs-based model demonstrated encouraging generalizability across different settings in the independent studies.
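The regression stage described in the abstract can be sketched as follows. This is an illustrative toy example, not the authors' code: the LF-AU features here are synthetic random values, whereas in the paper they are extracted from mouth-region images by a MobileNetV3 network; the feature count, kernel, and hyperparameters below are assumptions for illustration only.

```python
# Sketch: regress motion-sickness scores from lower-face action-unit
# (LF-AU) intensities with Support Vector Regression, evaluated by
# RMSE and Pearson correlation, as in the abstract's pipeline.
# Features are synthetic stand-ins for MobileNetV3-extracted LF-AUs.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_aus = 200, 8                      # 8 LF-AU intensities per frame (assumed)
X = rng.uniform(0, 5, size=(n_samples, n_aus))
# Toy target: sickness score loosely driven by one AU (e.g. a dimpler-like
# unit) plus noise -- purely for demonstration.
y = 2.0 * X[:, 3] + rng.normal(0, 0.5, n_samples)

# Standardize features, then fit an RBF-kernel SVR on a training split.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
model.fit(X[:150], y[:150])
pred = model.predict(X[150:])

# Evaluate with the same metrics reported in the paper.
rmse = float(np.sqrt(np.mean((pred - y[150:]) ** 2)))
r = float(np.corrcoef(pred, y[150:])[0, 1])
print(f"RMSE={rmse:.2f}  Pearson r={r:.2f}")
```

In the paper, the same SVR setup is compared across feature sets (LF-AUs alone, pupil diameters, physiological features, and their combinations); swapping the columns of `X` for a different feature set reuses the identical pipeline.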

Original language: English
Title of host publication: Proceedings - 2024 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024
Editors: Ulrich Eck, Misha Sra, Jeanine Stefanucci, Maki Sugimoto, Markus Tatzgern, Ian Williams
Publisher: IEEE
Pages: 1019-1028
Number of pages: 10
ISBN (Electronic): 9798331516475
DOIs
Publication status: Published - 28 Nov 2024
Event: 23rd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024 - Seattle, United States
Duration: 21 Oct 2024 - 25 Oct 2024

Publication series

Name: Proceedings - 2024 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024

Conference

Conference: 23rd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024
Country/Territory: United States
City: Seattle
Period: 21/10/24 - 25/10/24

Funding

This work was supported in part by the European Research Council (ERC) through the European Union's Horizon 2020 Research and Innovation Programme under Grant 835197, in part by the Engineering and Physical Sciences Research Council Impact Acceleration Account (EPSRC IAA) under Grant EP/X5257161/1, and in part by the Medical Research Council Impact Acceleration Account (MRC IAA) under Grant MR/X502807/1.

Funders and funder numbers:
Engineering and Physical Sciences Research Council: EP/X5257161/1
Medical Research Council: MR/X502807/1

Keywords

  • Lower Face Action Units
  • Motion Sickness
  • VR

ASJC Scopus subject areas

  • Computer Science Applications
  • Media Technology
  • Modelling and Simulation
