Design and validation of a computer-based sleep-scoring algorithm

J Neurosci Methods. 2004 Feb 15;133(1-2):71-80. doi: 10.1016/j.jneumeth.2003.09.025.


A computer-based sleep-scoring algorithm was devised for the real-time scoring of sleep-wake state in Wistar rats. Electroencephalogram (EEG) amplitude (μV rms) was measured in the following frequency bands: delta (δ; 1.5-6 Hz), theta (θ; 6-10 Hz), alpha (α; 10.5-15 Hz), beta (β; 22-30 Hz), and gamma (γ; 35-45 Hz). Electromyographic (EMG) signals (μV rms) were recorded from the levator auris longus (neck) muscle, as this yielded significantly higher algorithm accuracy than the spinodeltoid (shoulder) or temporalis (head) muscle EMGs (ANOVA; P = 0.009). Data were obtained using either tethers (n = 10) or telemetry (n = 4).

We developed a simple three-step algorithm that categorizes behavioural state as wake, non-rapid eye movement (NREM) sleep, or rapid eye movement (REM) sleep, based on thresholds set during a manually scored 90-min preliminary recording. Behavioural state was assigned in 5-s epochs. EMG amplitude and ratios of EEG frequency-band amplitudes were measured and compared with empirical thresholds in each animal.

Step 1: EMG amplitude greater than threshold? Yes: "active" wake; no: sleep or "quiet" wake.
Step 2: EEG amplitude ratio (δ × α)/(β × γ) greater than threshold? Yes: NREM; no: REM or "quiet" wake.
Step 3: EEG amplitude ratio θ²/(δ × α) greater than threshold? Yes: REM; no: "quiet" wake.

The algorithm was validated with one, two and three steps. Overall accuracy in discriminating wake from sleep (NREM and REM combined) using step 1 alone was 90.1%. Overall accuracy using the first two steps was 87.5% in scoring wake, NREM and REM sleep; with all three steps, overall accuracy was 87.9%. All accuracies were derived from comparisons with unequivocally scored epochs from four 90-min recordings, as defined by an experienced human rater.
The algorithms were as reliable as the agreement between three human scorers (88%).
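The three-step decision rule described above can be sketched as a simple per-epoch classifier. The function below is an illustrative reconstruction, not the authors' implementation: the dictionary keys, the state labels, and the per-animal threshold values are assumptions introduced here for clarity.

```python
def score_epoch(emg_rms, band_rms, thresholds):
    """Classify one 5-s epoch as 'active wake', 'NREM', 'REM', or 'quiet wake'.

    emg_rms    -- EMG amplitude (uV rms) for the epoch
    band_rms   -- EEG band amplitudes (uV rms): keys 'delta', 'theta',
                  'alpha', 'beta', 'gamma' (hypothetical key names)
    thresholds -- per-animal empirical thresholds set from a manually
                  scored preliminary recording: keys 'emg', 'nrem', 'rem'
    """
    # Step 1: high EMG tone indicates "active" wake.
    if emg_rms > thresholds['emg']:
        return 'active wake'
    # Step 2: a slow-wave-dominated EEG, measured by (delta x alpha)/(beta x gamma),
    # indicates NREM sleep.
    nrem_ratio = (band_rms['delta'] * band_rms['alpha']) / (
        band_rms['beta'] * band_rms['gamma'])
    if nrem_ratio > thresholds['nrem']:
        return 'NREM'
    # Step 3: a theta-dominated EEG, measured by theta^2/(delta x alpha),
    # indicates REM sleep; otherwise the epoch is "quiet" wake.
    rem_ratio = band_rms['theta'] ** 2 / (band_rms['delta'] * band_rms['alpha'])
    if rem_ratio > thresholds['rem']:
        return 'REM'
    return 'quiet wake'
```

In use, one would score each successive 5-s epoch with the same per-animal thresholds, e.g. `score_epoch(emg, bands, {'emg': 20.0, 'nrem': 4.0, 'rem': 2.0})` (threshold numbers here are arbitrary placeholders; the paper derives them empirically for each animal).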

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't
  • Validation Study

MeSH terms

  • Algorithms*
  • Analysis of Variance
  • Animals
  • Electroencephalography / methods
  • Electromyography / methods
  • Male
  • Polysomnography / methods
  • Rats
  • Rats, Wistar
  • Signal Processing, Computer-Assisted / instrumentation*
  • Sleep / physiology*
  • Sleep Stages / physiology*
  • Wakefulness / physiology