iCatcher: A neural network approach for automated coding of young children's eye movements

Infancy. 2022 Jul;27(4):765-779. doi: 10.1111/infa.12468. Epub 2022 Apr 13.

Abstract

Infants' looking behaviors are often used to measure attention, real-time processing, and learning, frequently from low-resolution videos. Despite the ubiquity of gaze-related methods in developmental science, current analysis techniques usually involve laborious post hoc coding, imprecise real-time coding, or expensive eye trackers that may increase data loss and require a calibration phase. As an alternative, we propose using computer vision methods to perform automatic gaze estimation from low-resolution videos. At the core of our approach is a neural network that classifies gaze directions in real time. We compared our method, called iCatcher, to manually annotated videos from a prior study in which infants looked at one of two pictures on a screen. We demonstrated that the accuracy of iCatcher approximates that of human annotators and that it replicates the prior study's results. Our method is publicly available as an open-source repository at https://github.com/yoterel/iCatcher.
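
To illustrate the general idea of frame-level gaze classification described above, here is a minimal sketch of a convolutional classifier that maps a cropped face frame to a discrete gaze class. The class set (left, right, away), the architecture, and the input size are illustrative assumptions, not the authors' actual model; see the iCatcher repository linked above for the real implementation.

```python
# Minimal sketch of a frame-level gaze classifier (PyTorch).
# ASSUMPTIONS: three gaze classes, 64x64 RGB face crops, and a small
# CNN backbone; none of these are taken from the iCatcher paper.
import torch
import torch.nn as nn

GAZE_CLASSES = ["left", "right", "away"]  # assumed label set

class GazeClassifier(nn.Module):
    def __init__(self, num_classes: int = len(GAZE_CLASSES)):
        super().__init__()
        # Small convolutional backbone for low-resolution crops.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pooling
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.features(x).flatten(1)       # (batch, 64)
        return self.classifier(feats)             # (batch, num_classes)

# Classify a batch of face crops frame by frame.
model = GazeClassifier().eval()
frames = torch.rand(8, 3, 64, 64)  # dummy low-resolution RGB crops
with torch.no_grad():
    predictions = model(frames).argmax(dim=1)
print([GAZE_CLASSES[i] for i in predictions])
```

Because each frame is classified independently, such a model can run in real time on a video stream; per-frame predictions would typically be smoothed over time before being aggregated into looking-time measures.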

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Attention
  • Child
  • Child, Preschool
  • Eye Movements*
  • Humans
  • Infant
  • Learning
  • Neural Networks, Computer*