Stress Estimation Using Biometric and Activity Indicators to Improve QoL of the Elderly

Sensors (Basel). 2023 Jan 3;23(1):535. doi: 10.3390/s23010535.

Abstract

It is essential to estimate the stress state of the elderly in order to improve their QoL. Stress states change from day to day and hour to hour, depending on the activities performed and their duration and intensity. However, most existing studies estimate stress states using only biometric information or specific activities (e.g., sleep duration, exercise duration/amount, etc.) as explanatory variables and do not consider all daily living activities. To estimate the stress state more accurately, it is necessary to link various daily living activities with biometric information. Specifically, we construct a stress estimation model using machine learning, with the answers to a stress status questionnaire administered every morning and evening as the ground truth, and with the biometric data recorded during each performed activity, together with newly proposed indicators that combine biometric and activity perspectives, as the features. We compare three methods: Baseline Method 1, which uses the RRI variance and the Lorenz plot area over the 4 h after waking and the 24 h before the questionnaire as features; Baseline Method 2, which adds sleep time as a feature to Baseline Method 1; and the proposed method, which further adds the Lorenz plot area per activity and the total time per activity. Evaluation experiments using one month of data collected from five elderly households showed that the proposed method achieved an average estimation accuracy of 59%, 7 percentage points higher than Baseline Method 1 (52%) and 4 percentage points higher than Baseline Method 2 (55%).

Keywords: biometric data; daily living activity; stress level estimation; wearable sensors.

MeSH terms

  • Activities of Daily Living*
  • Aged
  • Biometry
  • Humans
  • Machine Learning
  • Quality of Life*
  • Surveys and Questionnaires

Grants and funding

This work was supported by JSPS KAKENHI Grant Number JP21K19828.