A practical efficient human computer interface based on saccadic eye movements for people with disabilities

Comput Biol Med. 2016 Mar;70:163-173. doi: 10.1016/j.compbiomed.2016.01.012. Epub 2016 Jan 23.

Abstract

Human-computer interfaces (HCIs) provide new channels of communication for people with severe motor disabilities to state their needs and control their environment. Some HCI systems are based on eye movements detected from the electrooculogram (EOG). In this study, a wearable HCI implementing a novel adaptive algorithm for the detection of saccadic eye movements in eight directions was developed, taking into account the limitations that people with disabilities face. The adaptive algorithm eliminated the need to calibrate the system for different users and in different environments. A two-stage typing environment and a simple game for training people with disabilities to work with the system were also developed. The performance of the system was evaluated in typing experiments performed by six participants without disabilities. The average accuracy of the system in detecting eye movements and blinking was 82.9% on first attempts, with an average typing rate of 4.5 characters per minute (cpm); an experienced user, however, could achieve 96% accuracy and a typing rate of 7.2 cpm. Moreover, the functionality of the system for people with movement disabilities was evaluated in experiments with the game environment. Six people with tetraplegia and significant levels of speech impairment played the computer game several times. The average success rate in performing the required eye movements was 61.5%, which increased significantly with practice, reaching 83% for one participant. The developed system is 2.6 × 4.5 cm in size and weighs only 15 g, ensuring a high level of comfort for users.
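The abstract does not describe the adaptive detection rule itself, so the following is only a minimal illustrative sketch of one common calibration-free approach: thresholding two-channel EOG (horizontal and vertical) against a noise estimate taken from a recent baseline window, then mapping the signed deflections to one of eight directions. The sampling rate, the threshold factor `k`, the baseline length, and the direction-mapping table are all assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of calibration-free saccade direction detection from
# two-channel EOG. NOT the authors' algorithm; all constants and the
# direction-mapping rule are illustrative assumptions.
import numpy as np

FS = 250  # assumed EOG sampling rate in Hz


def detect_saccade(h, v, k=3.0, baseline_s=1.0):
    """Return one of eight direction labels, or None, for a short EOG segment.

    h, v : 1-D arrays of horizontal and vertical EOG samples.
    k    : multiple of the baseline noise level used as the adaptive
           threshold (assumed adaptation rule; the paper's may differ).
    """
    n_base = int(baseline_s * FS)
    # Adaptive thresholds scale with the noise of the most recent baseline
    # window, so no per-user calibration constant is required.
    thr_h = k * np.std(h[:n_base]) + 1e-9
    thr_v = k * np.std(v[:n_base]) + 1e-9

    # Peak deflection relative to the baseline mean in each channel.
    dh = h[n_base:] - np.mean(h[:n_base])
    dv = v[n_base:] - np.mean(v[:n_base])
    ph = dh[np.argmax(np.abs(dh))]
    pv = dv[np.argmax(np.abs(dv))]

    right, left = ph > thr_h, ph < -thr_h
    up, down = pv > thr_v, pv < -thr_v

    mapping = {
        (True, False, False, False): "right",
        (False, True, False, False): "left",
        (False, False, True, False): "up",
        (False, False, False, True): "down",
        (True, False, True, False): "up-right",
        (True, False, False, True): "down-right",
        (False, True, True, False): "up-left",
        (False, True, False, True): "down-left",
    }
    return mapping.get((right, left, up, down))  # None if no clear saccade


if __name__ == "__main__":
    # Synthetic example: noise for 1 s, then a rightward + upward step.
    t = np.arange(2 * FS) / FS
    h = 0.02 * np.random.randn(t.size) + 0.5 * (t > 1.0)
    v = 0.02 * np.random.randn(t.size) + 0.5 * (t > 1.0)
    print(detect_saccade(h, v))  # expected: "up-right"
```

Under these assumptions, adaptation comes entirely from re-estimating the baseline noise before each decision; a real implementation would also need blink rejection and drift handling, which the sketch omits.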

Keywords: Electro-oculogram; Human computer interface; People with disabilities; Saccadic eye movement; Wearable systems.

MeSH terms

  • Algorithms*
  • Disabled Persons*
  • Eye Movements*
  • Humans
  • Male
  • User-Computer Interface*