Study objectives: Multisensor consumer wearable devices that collect multiple data streams, such as heart rate and motion, for evaluating sleep in the home environment are increasingly common. However, the validity of such devices for sleep assessment has not been directly compared against alternatives such as wrist actigraphy or polysomnography (PSG).
Methods: Eight participants each completed four nights in a sleep laboratory, equipped with PSG and several wearable devices. PSG scored by a registered polysomnographic technologist served as ground truth for sleep-wake state. Wearable devices providing sleep-wake classification data were compared to PSG at both the epoch-by-epoch and whole-night level. Data from multisensor wearables (Apple Watch and Oura Ring) were compared to data from electrocardiography and a triaxial wrist actigraph to evaluate the quality and utility of the heart rate and motion signals. Machine learning methods were used to train and test sleep-wake classifiers on the consumer wearable data, and the quality of the resulting classifications was compared across devices.
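The epoch-by-epoch comparison against PSG described above amounts to aligning per-epoch labels and tallying a confusion matrix. A minimal sketch, assuming a binary encoding (1 = sleep, 0 = wake); the `epoch_metrics` helper and the toy label sequences are illustrative assumptions, not the study's actual pipeline:

```python
def epoch_metrics(psg, device):
    """Epoch-by-epoch sensitivity (sleep detection) and specificity
    (wake detection), treating PSG labels as ground truth.
    Labels: 1 = sleep, 0 = wake; sequences must be epoch-aligned."""
    tp = sum(1 for p, d in zip(psg, device) if p == 1 and d == 1)  # sleep hits
    fn = sum(1 for p, d in zip(psg, device) if p == 1 and d == 0)  # missed sleep
    tn = sum(1 for p, d in zip(psg, device) if p == 0 and d == 0)  # wake hits
    fp = sum(1 for p, d in zip(psg, device) if p == 0 and d == 1)  # missed wake
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical 30-s epoch labels for one short recording segment:
psg    = [1, 1, 1, 1, 0, 0, 0, 0]
device = [1, 1, 1, 0, 0, 0, 1, 1]
print(epoch_metrics(psg, device))  # → (0.75, 0.5)
```

Sleep-wake devices typically show high sensitivity but much lower specificity, as in the results below, because wake epochs are rare and often motion-free.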
Results: For epoch-by-epoch sleep-wake classification, research devices achieved d' values from 1.771 to 1.874, with sensitivity from 0.912 to 0.982 and specificity from 0.366 to 0.647. Data from multisensor wearables were strongly correlated at the epoch level with the reference data sources. Classifiers developed from the multisensor wearable data achieved d' values from 1.827 to 2.347, with sensitivity from 0.883 to 0.977 and specificity from 0.407 to 0.821.
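The d' (d-prime) statistic reported above is the standard signal-detection summary that combines sensitivity and specificity through the inverse normal CDF. A minimal sketch using only the Python standard library; the `d_prime` helper and the sample inputs are illustrative assumptions, not values paired from the study:

```python
from statistics import NormalDist

def d_prime(sensitivity: float, specificity: float) -> float:
    """Signal-detection d': z(hit rate) - z(false-alarm rate).

    Here the hit rate is sleep sensitivity, and the false-alarm
    rate is 1 - specificity (wake epochs misclassified as sleep).
    """
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(sensitivity) - z(1.0 - specificity)

# Illustrative inputs only (not device results from the study):
print(round(d_prime(0.95, 0.60), 3))  # → 1.898
```

Because d' penalizes false alarms, a device with very high sensitivity but poor specificity can still score a low d', which is why it is a more balanced summary than sensitivity alone for sleep-heavy recordings.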
Conclusions: Data from multisensor consumer wearables are strongly correlated with reference devices at the epoch level and can be used to develop epoch-by-epoch models of sleep-wake rivaling existing research devices.
Keywords: actigraphy; artificial intelligence; big data; machine learning; polysomnography; smartphone; wearable.
© Sleep Research Society 2020. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved.