Adaptive Windowing Framework for Surface Electromyogram-Based Pattern Recognition System for Transradial Amputees

Sensors (Basel). 2018 Jul 24;18(8):2402. doi: 10.3390/s18082402.


Electromyogram (EMG)-based Pattern Recognition (PR) systems for upper-limb prosthesis control offer a promising way to enable intuitive control of prostheses with multiple degrees of freedom and fast reaction times. However, the limited robustness of PR systems may restrict their usability. In this paper, a novel adaptive time windowing framework is proposed to enhance the performance of PR systems by focusing on their windowing and classification steps. The proposed framework estimates the output probability of each class and outputs a movement only if a decision with a probability above a certain threshold is reached; otherwise (i.e., when all probability values fall below the threshold), the window size of the EMG signal is increased. We demonstrate our framework on EMG datasets collected from nine transradial amputees who performed nine movement classes, using Time Domain Power Spectral Descriptors (TD-PSD), Wavelet, and Time Domain (TD) feature extraction (FE) methods and a Linear Discriminant Analysis (LDA) classifier. Nonetheless, the concept can be applied to other types of features and classifiers. In addition, the proposed framework is validated with different movement and EMG channel combinations. The results indicate that the proposed framework performs well across the different FE methods and movement/channel combinations, achieving classification error rates of approximately 13% with TD-PSD FE. Thus, we expect the proposed framework to be a straightforward, yet important, step towards improving control methods for upper-limb prostheses.
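The decision rule described above — classify a short EMG window, accept the prediction only when the top class probability exceeds a threshold, and otherwise enlarge the window — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the window lengths, step size, threshold, the minimal time-domain features, the synthetic training data, and the behaviour when no class clears the threshold are all assumptions made for the example.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """Minimal per-channel time-domain features (mean absolute value and
    waveform length) -- a simple stand-in for the TD/TD-PSD feature sets."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def adaptive_decision(emg, clf, base_len=64, step=64, max_len=256,
                      threshold=0.9):
    """Grow the analysis window until one class probability exceeds threshold.

    Returns (predicted_class, window_length) on a confident decision, or
    (None, max_len) if no class ever clears the threshold (a hypothetical
    fallback; the abstract does not specify this case).
    """
    length = base_len
    while length <= max_len:
        feats = td_features(emg[:length]).reshape(1, -1)
        probs = clf.predict_proba(feats)[0]
        if probs.max() >= threshold:
            return clf.classes_[int(probs.argmax())], length
        length += step
    return None, max_len

# Illustrative training on synthetic 4-channel "EMG": two movement classes
# are simulated as Gaussian noise at different amplitudes.
rng = np.random.default_rng(0)
n_ch, n_trials = 4, 50
scales = {0: 1.0, 1: 3.0}
X = np.vstack([td_features(rng.normal(0, scales[c], (64, n_ch)))
               for c in (0, 1) for _ in range(n_trials)])
y = np.repeat([0, 1], n_trials)
clf = LinearDiscriminantAnalysis().fit(X, y)

pred, used = adaptive_decision(rng.normal(0, 3.0, (256, n_ch)), clf)
```

Because the feature vector has a fixed dimension regardless of window length, the same trained classifier can score every candidate window size, which is what makes the adaptive scheme cheap at run time.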

Keywords: Linear Discriminant Analysis; Time-Domain Power Spectral Descriptors; adaptive windowing; classification; pattern recognition; surface electromyogram (sEMG); transradial amputees.

MeSH terms

  • Adult
  • Amputees
  • Artificial Limbs
  • Electromyography*
  • Female
  • Humans
  • Male
  • Middle Aged
  • Movement
  • Pattern Recognition, Automated*
  • Young Adult