Hand Shape Representations in the Human Posterior Parietal Cortex

Christian Klaes et al.

J Neurosci. 2015 Nov 18;35(46):15466-76. doi: 10.1523/JNEUROSCI.2747-15.2015.
Abstract

Humans shape their hands to grasp and manipulate objects and to communicate. From nonhuman primate studies, we know that visual and motor properties for grasps can be derived from cells in the posterior parietal cortex (PPC). Are non-grasp-related hand shapes in humans represented similarly? Here we show for the first time how single neurons in the PPC of humans are selective for particular imagined hand shapes independent of graspable objects. We find that motor imagery to shape the hand can be successfully decoded from the PPC by implementing a version of the popular Rock-Paper-Scissors game and its extension Rock-Paper-Scissors-Lizard-Spock. By simultaneous presentation of visual and auditory cues, we can discriminate motor imagery from visual information and show differences in auditory and visual information processing in the PPC. These results also demonstrate that neural signals from human PPC can be used to drive a dexterous cortical neuroprosthesis.

Significance statement: This study shows for the first time hand-shape decoding from human PPC. Unlike nonhuman primate studies in which the visual stimuli are the objects to be grasped, the visually cued hand shapes that we use are independent of the stimuli. Furthermore, we show that distinct neuronal populations are activated for the visual cue and the imagined hand shape. Additionally, we found that auditory and visual stimuli that cue the same hand shape are processed differently in PPC. Early in a trial, only the visual stimuli, and not the auditory stimuli, can be decoded. During the later stages of a trial, the motor imagery for a particular hand shape can be decoded for both modalities.

Keywords: audio processing; brain–machine interface; grasping; hand shaping; motor imagery; posterior parietal cortex.


Figures

Figure 1.
Implantation site and schematic overview of tasks. a, MRI of the implantation site for the BA5 and the AIP array. Some characteristic sulci are overlaid in red for better orientation. The most active regions from the fMRI task (see “fMRI task and array locations”) are outlined for reaching (purple) and grasping (light blue). b, Sketch of task progression showing the three distinct phases of cue presentation, delay, and response with their respective lengths. c, Table listing the names of the hand shapes (which also correspond to the auditory cues given in the CC task), their symbolic representations on the screen, corresponding hand shapes (as performed by the robotic arm), and the color code used for cue-based analyses. d, Schematic sketch of the GT task. The robotic hand would alternate between the two hand shapes rock (closed hand) and paper (open hand) for 60 trials total, and after that 30 relaxation trials would follow (see “Online control and grasp training task”).
Figure 2.
Example motor-imagery neurons. a–c, Top row, example neurons from the RPS task; d, e, bottom row, example neurons from the RPSLS task. Each plot shows the average firing rate (solid line; shaded area = SD) for 10 trials of each cued symbol during the task. Vertical lines indicate the onset of the cue (yellow shading = time period during which the cue symbol was visible) and the response phase. The selection criterion for neurons was significant tuning for one of the cue symbols during the response time window (gray bar; see Materials and Methods). The orange bar shows the cue time window, in which only the paper neuron is also tuned.
Figure 3.
Example visual neurons from the RPS task. Plots have been prepared in the same way as described for Figure 2. The selection criterion for neurons was significant tuning for one of the cue symbols during the cue time window (orange bar). From left to right, neurons are selective for rock (a), paper (b), and scissors (c).
Figure 4.
Statistics of tuned units. Units were recorded during the RPS (a) and RPSLS (b) tasks. Percentage of units tuned in either the cue or response time window (left). Units were tuned either exclusively in the cue (orange) or response (light gray) time window or in both (the two dark gray sectors). If a unit was tuned in both time windows it could either be tuned for the same symbol (“v” for visuomotor; dark gray) or for different symbols in each time window (“s” for switching; darkest gray). Percentage of all tuned units that were tuned during the cue (middle) and response (right) time window sorted by their preferred symbol. Note that the total for the cue and response phase pie charts (middle and right) does not add up to the total from the unit tuning pie chart (left) because visuomotor and switching units are included in both charts.
Figure 5.
Spatial distribution of tuned units on the AIP array. Total number of tuned units per channel (electrode) on the array (top-down view onto the pad) in the RPS (a, b) and the RPSLS (d, e) tasks. Tuned units are shown for the cue (a, d) and response (b, e) phases. The electrodes in the four corners of the array were used as references. Orientation of the array on the cortex is indicated by the letters A (anterior) and L (lateral). The contrast plots show how many units were tuned for the cue phase relative to the response phase for the RPS (c) and RPSLS (f) tasks. A contrast ratio of 1 means that all units found on the channel were only tuned for the cue phase and a value of −1 means that all units were only tuned in the response phase. Striped channels in the contrast plots mean that no tuned units were recorded on the channel. Contrast was calculated as c = (tc − tr)/(tc + tr), where tc is the number of tuned cells during the cue phase and tr is the number of tuned cells during the response phase.
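The per-channel contrast in c and f reduces to a simple normalized ratio. A minimal sketch (illustrative, not the authors' code; the function name is made up):

```python
# Per-channel tuning contrast as defined in the Figure 5 caption
# (illustrative sketch, not the authors' code).

def tuning_contrast(t_c, t_r):
    """Return c = (t_c - t_r) / (t_c + t_r).

    t_c: number of units on a channel tuned during the cue phase
    t_r: number of units tuned during the response phase
    Returns None when the channel has no tuned units (striped in the plot).
    """
    if t_c + t_r == 0:
        return None
    return (t_c - t_r) / (t_c + t_r)

print(tuning_contrast(3, 0))  # 1.0: tuned only during the cue phase
print(tuning_contrast(0, 2))  # -1.0: tuned only during the response phase
print(tuning_contrast(2, 2))  # 0.0: equally tuned in both phases
```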
Figure 6.
Continuous decoding performance during the CC task. E.G.S. was either instructed to attend and respond to the visual cue (a) or the auditory cue (b). n gives the number of units, recorded over eight sessions, that were used as features by the decoder. Solid black lines show the decoder mean performance when decoding the visual cue and dark gray lines show the decoder mean performance when decoding the audio cue. Shaded areas show SE for decoding. Red circles indicate significant (p < 0.01) deviation of decoding performance from chance level (red line). Vertical lines indicate the onset of the cue and response phases.
Figure 7.
Off-line decoding analysis of the RPS, RPSLS, and GT tasks. Decoding performance for each recording for the RPS (a), RPSLS (c), and GT (e) tasks. Red line indicates chance level for each task. Example confusion matrices for recordings from the RPS (b), RPSLS (d), and GT (f) tasks. Examples are marked as red dots in the performance plots.
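A confusion matrix like those in b, d, and f tallies, for each cued symbol, how often the decoder predicted each symbol; accuracy is the diagonal divided by the trial count. A hedged sketch (the labels below are fabricated for illustration, not data from the study):

```python
# Sketch of how a decoding confusion matrix is tallied
# (illustrative labels, not data from the study).
from collections import Counter

def confusion_matrix(cued, decoded, classes):
    """Rows = cued symbol, columns = decoded symbol; entries are trial counts."""
    counts = Counter(zip(cued, decoded))
    return [[counts[(t, p)] for p in classes] for t in classes]

classes = ["rock", "paper", "scissors"]
cued    = ["rock", "rock", "paper", "scissors", "paper", "scissors"]
decoded = ["rock", "paper", "paper", "scissors", "paper", "rock"]

m = confusion_matrix(cued, decoded, classes)
# Accuracy = diagonal / total; chance level for 3 balanced classes is 1/3.
accuracy = sum(m[i][i] for i in range(len(classes))) / len(cued)
print(m)         # [[1, 1, 0], [0, 2, 0], [1, 0, 1]]
print(accuracy)  # 0.666...
```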
Figure 8.
Neuron-dropping analysis. Neuron-dropping curves are shown for the RPS (a) and RPSLS (b) tasks using the 1500 ms time window (see Materials and Methods). Neuron-dropping curves are shown for the AIP (solid line) and BA5 (dashed line) arrays separately. Red line shows chance level for the task. Shaded areas show SD. Each data point was created using a 1000-fold cross-validation (see text). Separate neuron-dropping analyses for the cue (c) and response (d) phases used 400 ms time windows aligned to the corresponding phase (see “Neuron-dropping analysis”).
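A neuron-dropping curve plots mean decoding accuracy against the number of units kept, averaged over many random subsets of each size. The skeleton below is an assumption about the general procedure, not the authors' implementation; `decode_accuracy` is a hypothetical stand-in for their cross-validated classifier:

```python
# Skeleton of a neuron-dropping analysis (general procedure, not the
# authors' implementation).
import random

def neuron_dropping_curve(unit_ids, decode_accuracy, n_repeats=1000):
    """Mean decoding accuracy as a function of the number of units kept.

    unit_ids: identifiers of all recorded units
    decode_accuracy: callable mapping a subset of unit ids to an accuracy
                     (hypothetical placeholder for a cross-validated decoder)
    n_repeats: number of random subsets drawn per subset size
    """
    sizes = list(range(1, len(unit_ids) + 1))
    curve = []
    for k in sizes:
        accs = [decode_accuracy(random.sample(unit_ids, k))
                for _ in range(n_repeats)]
        curve.append(sum(accs) / n_repeats)
    return sizes, curve

# Toy check with a deterministic stand-in decoder whose accuracy grows
# linearly with the number of units (purely illustrative).
sizes, curve = neuron_dropping_curve(list(range(5)),
                                     lambda subset: len(subset) / 5,
                                     n_repeats=10)
```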
