A computational model of motion sickness dynamics during passive self-motion in the dark

Exp Brain Res. 2024 Mar 15. doi: 10.1007/s00221-024-06804-z. Online ahead of print.

Abstract

Predicting the time course of motion sickness symptoms enables the evaluation of provocative stimuli and the development of countermeasures for reducing symptom severity. In pursuit of this goal, we present an Observer-driven model of motion sickness for passive motions in the dark. Constructed in two stages, the model predicts motion sickness symptoms by bridging sensory conflict (i.e., differences between actual and expected sensory signals) arising from the Observer model of spatial orientation perception (stage 1) to Oman's model of motion sickness symptom dynamics (stage 2; presented in 1982 and 1990) through a proposed "Normalized Innovation Squared" statistic. The model outputs the expected temporal development of human motion sickness symptom magnitudes (mapped to the Misery Scale) at a population level in response to arbitrary, 6-degree-of-freedom self-motion stimuli. We trained model parameters using individual subject responses collected during fore-aft translations and off-vertical-axis rotation motions. Improving on prior efforts, we informed the model only with datasets whose experimental conditions were congruent with the perceptual stage (i.e., those that provided passive motions without visual cues). We assessed model performance by predicting an unseen validation dataset, producing a Q² value of 0.91. Demonstrating the model's broad applicability, we formulate predictions for a host of stimuli, including translations, earth-vertical rotations, and altered gravity, and we provide our implementation for other users. Finally, to guide future research efforts, we suggest how to rigorously advance this model (e.g., by incorporating visual cues, active motion, and responses to motions of different frequencies).
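The "Normalized Innovation Squared" statistic named in the abstract is a standard quantity in Kalman-filter-style state estimation: the innovation (here, the sensory conflict vector) is normalized by its covariance to yield a scalar conflict magnitude. The following is a minimal illustrative sketch of that general statistic, not the authors' implementation; the function name and covariance argument are our own assumptions.

```python
# Illustrative sketch (not the paper's code) of a Normalized Innovation
# Squared (NIS) statistic as commonly defined for observer/Kalman models:
# NIS = e^T S^{-1} e, where e is the innovation (sensory conflict) vector
# and S is its covariance.
import numpy as np

def nis(innovation, innovation_cov):
    """Return e^T S^-1 e for innovation e and covariance S."""
    e = np.asarray(innovation, dtype=float)
    S = np.asarray(innovation_cov, dtype=float)
    # Solve S x = e rather than inverting S explicitly (better conditioned).
    return float(e @ np.linalg.solve(S, e))

# Example: a 2-D conflict of [3, 4] with unit covariance gives 3^2 + 4^2.
print(nis([3.0, 4.0], np.eye(2)))  # → 25.0
```

In a motion sickness model of this kind, such a scalar conflict time series would then drive the downstream symptom-dynamics stage.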

Keywords: Orientation perception; Predictive modeling; Sensory conflict; Spatial disorientation; Vestibular.