Give me a sign: decoding four complex hand gestures based on high-density ECoG

Brain Struct Funct. 2016 Jan;221(1):203-16. doi: 10.1007/s00429-014-0902-x. Epub 2014 Oct 2.

Abstract

The increasing understanding of human brain functions makes it possible to interact directly with the brain for therapeutic purposes. Implantable brain-computer interfaces (BCIs) promise to replace or restore motor functions in patients with partial or complete paralysis. We postulate that neuronal states associated with gestures, as used in the fingerspelling alphabet of sign languages, provide an excellent signal for implantable BCIs to restore communication. To test this, we evaluated the decodability of four gestures using high-density electrocorticography in two participants. The electrode grids were located subdurally on the hand knob area of the sensorimotor cortex, covering a surface of 2.5-5.2 cm². Using a pattern-matching classification approach, the four types of hand gestures were classified based on their pattern of neuronal activity. In the two participants, the gestures were classified with 97% and 74% accuracy, respectively. The high frequencies (>65 Hz) yielded the best classification results. This proof-of-principle study indicates that the four gestures are associated with a reliable and discriminable spatial representation on a confined area of the sensorimotor cortex. This robust representation on a small area makes hand gestures an interesting control feature for an implantable BCI to restore communication for severely paralyzed people.
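To make the decoding approach concrete, the following is a minimal sketch of a template-matching classifier of the kind described in the abstract, not the authors' actual pipeline: high-frequency band power is averaged per electrode to form one spatial pattern per gesture, and new trials are assigned to the gesture whose template pattern they correlate with best. The sampling rate, band limits, trial shape, and correlation metric are illustrative assumptions.

    # Hypothetical sketch of pattern-matching gesture classification on
    # high-frequency ECoG band power; parameters are assumptions, not the
    # study's actual settings.
    import numpy as np
    from scipy.signal import welch

    FS = 512          # assumed sampling rate (Hz)
    BAND = (65, 125)  # high-frequency band; >65 Hz classified best per the abstract

    def band_power(trials, fs=FS, band=BAND):
        """Mean spectral power per channel in `band`.

        trials: array of shape (n_trials, n_channels, n_samples)
        returns: array of shape (n_trials, n_channels)
        """
        freqs, psd = welch(trials, fs=fs, nperseg=fs // 2, axis=-1)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[..., mask].mean(axis=-1)

    def fit_templates(train_trials, labels):
        """Average the spatial band-power pattern of training trials per gesture."""
        patterns = band_power(train_trials)
        return {g: patterns[labels == g].mean(axis=0) for g in np.unique(labels)}

    def classify(trial, templates):
        """Assign the gesture whose template correlates best with the trial pattern."""
        x = band_power(trial[np.newaxis])[0]
        scores = {g: np.corrcoef(x, t)[0, 1] for g, t in templates.items()}
        return max(scores, key=scores.get)

In a proof-of-principle setting such as this one, such a classifier would typically be evaluated with cross-validation (e.g., leave-one-trial-out), refitting the templates on each training fold.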

Keywords: Decoding; Electrocorticography; Gestures; High density; Sign language.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Brain-Computer Interfaces*
  • Electrocorticography / methods*
  • Female
  • Gestures*
  • Hand / physiology*
  • Humans
  • Middle Aged
  • Pattern Recognition, Automated / methods*
  • Sensorimotor Cortex / physiology*
  • Sign Language
  • Signal Processing, Computer-Assisted
  • Young Adult