Discovery and recognition of motion primitives in human activities

PLoS One. 2019 Apr 1;14(4):e0214499. doi: 10.1371/journal.pone.0214499. eCollection 2019.

Abstract

We present a novel framework for the automatic discovery and recognition of motion primitives in videos of human activities. Given the 3D pose of a human in a video, human motion primitives are discovered by optimizing the 'motion flux', a quantity that captures the motion variation of a group of skeletal joints. A normalization of the primitives is proposed to make them invariant to a subject's anatomical variations and to the data sampling rate. The discovered primitives are initially unknown and unlabeled; they are grouped into classes, without supervision, via a hierarchical non-parametric Bayesian mixture model. Once the classes are determined and labeled, they are further analyzed to establish models for recognizing the discovered primitives. Each primitive model is defined by a set of learned parameters. Given new video data and the estimated pose of the subject appearing in it, the motion is segmented into primitives, which are recognized with a probability computed from the parameters of the learned models. Using our framework we build a publicly available dataset of human motion primitives from sequences taken from well-known motion capture datasets. We expect that our framework, by providing an objective way of discovering and categorizing human motion, will be a useful tool in numerous research fields, including video analysis, human-inspired motion generation, learning by demonstration, intuitive human-robot interaction, and human behavior analysis.
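The abstract gives no implementation details, but the pipeline it outlines (flux-based segmentation of a 3D pose stream, followed by non-parametric Bayesian clustering of the resulting primitives) can be sketched in Python as below. This is a minimal illustration, not the authors' code: it assumes motion flux can be approximated by the aggregate speed of a joint group, uses a heuristic threshold on local flux minima in place of the paper's flux optimization, and substitutes scikit-learn's Dirichlet-process Gaussian mixture for the paper's hierarchical mixture model. The function names (motion_flux, segment_primitives, cluster_primitives) are hypothetical.

    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    def motion_flux(joints, fps=30.0):
        # joints: (T, J, 3) array of 3D positions for J joints over T frames.
        # Approximates the flux as the summed per-joint speed in each frame
        # (an assumption; the paper defines its own flux functional).
        velocities = np.diff(joints, axis=0) * fps   # (T-1, J, 3) finite differences
        speeds = np.linalg.norm(velocities, axis=2)  # (T-1, J) per-joint speed
        return speeds.sum(axis=1)                    # (T-1,) aggregate "flux"

    def segment_primitives(flux, rel_threshold=0.2):
        # Cut the sequence at local minima of the flux below a fraction of
        # its maximum: a heuristic stand-in for the paper's flux optimization.
        thr = rel_threshold * flux.max()
        cuts = [0]
        for t in range(1, len(flux) - 1):
            if flux[t] < thr and flux[t] <= flux[t - 1] and flux[t] <= flux[t + 1]:
                cuts.append(t)
        cuts.append(len(flux))
        return list(zip(cuts[:-1], cuts[1:]))

    def cluster_primitives(features, max_classes=5, seed=0):
        # Dirichlet-process Gaussian mixture: one concrete non-parametric
        # Bayesian mixture, which prunes unused components on its own.
        dpgmm = BayesianGaussianMixture(
            n_components=max_classes,
            weight_concentration_prior_type="dirichlet_process",
            random_state=seed,
        )
        return dpgmm.fit_predict(features)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        T, J = 400, 15
        # Synthetic "actor": bursts of motion separated by near-still rests.
        activity = np.sin(2 * np.pi * np.arange(T) / 80)[:, None, None] ** 2
        joints = np.cumsum(rng.normal(size=(T, J, 3)) * activity, axis=0) * 0.01
        flux = motion_flux(joints)
        segments = segment_primitives(flux)
        # Describe each primitive by mean flux and duration, then cluster.
        feats = np.array([[flux[a:b].mean(), float(b - a)] for a, b in segments])
        labels = cluster_primitives(feats, max_classes=min(5, len(feats)))
        print(f"{len(segments)} primitives, {len(set(labels))} discovered classes")

On the synthetic burst-and-rest trajectory above, the flux drops to near zero between bursts, so the minima-based segmentation recovers the bursts as candidate primitives; the Dirichlet-process prior then decides how many primitive classes the data support rather than fixing that number in advance, which is the appeal of the non-parametric mixture the abstract describes.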

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Bayes Theorem
  • Biomechanical Phenomena
  • Computer Simulation
  • Human Activities*
  • Humans
  • Image Processing, Computer-Assisted
  • Imaging, Three-Dimensional
  • Learning
  • Markov Chains
  • Movement*
  • Neurophysiology
  • Normal Distribution
  • Pattern Recognition, Automated / methods*
  • Probability
  • Robotics / methods
  • Sports
  • Video Recording*

Grants and funding

This work is supported by the EU H2020 SecondHands project (https://secondhands.eu/), Research and Innovation Programme (call: H2020-ICT-2014-1, RIA), under grant agreement No. 643950 (recipient of funding: F.P.). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.