
Virtual Reality to Maximize Function for Hand and Arm Rehabilitation: Exploration of Neural Mechanisms

Alma S Merians et al. Stud Health Technol Inform. 145:109-25.

Abstract

Stroke patients report hand function as their most disabling motor deficit. Current evidence shows that learning new motor skills is essential for inducing functional neuroplasticity and functional recovery. Adaptive training paradigms that continually and interactively move a motor outcome closer to the targeted skill are important for motor recovery. Computerized virtual reality simulations, when interfaced with robots, movement-tracking systems, and sensing gloves, are particularly adaptable, allowing online and offline modification of task-based activities according to the participant's current performance and success rate. We have developed a second-generation system that can exercise the hand and the arm together or in isolation and that provides both unilateral and bilateral hand and arm activities in three-dimensional space. We demonstrate that, by providing haptic assistance for the hand and arm and adaptive anti-gravity support, the system can accommodate patients with lower-level impairments. We hypothesize that combining training in virtual environments (VE) with observation of motor actions can bring additional benefits. We present a proof of concept of a novel system that integrates interactive VE with functional neuroimaging to address this issue. Three components of this system are synchronized: the presentation of the visual display of the virtual hands, the collection of fMRI images, and the collection of hand joint angles from the instrumented gloves. We show that interactive VEs can facilitate activation of brain areas during training by providing appropriately modified visual feedback. We predict that visual augmentation can become a tool to facilitate functional neuroplasticity.
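The sketch below is a minimal illustration (not the authors' implementation) of how the three streams described above could be time-stamped against a shared clock so that glove joint angles, fMRI volume triggers, and the virtual-hand display can later be aligned; all device-facing functions here are hypothetical placeholders for real hardware APIs.

```python
# Illustrative sketch of stream synchronization; device functions are placeholders.
import time
import random

def read_glove_joint_angles():
    """Placeholder for a glove SDK call; returns simulated joint angles (degrees)."""
    return [random.uniform(0.0, 90.0) for _ in range(22)]  # e.g. a 22-sensor glove

def scanner_trigger_received():
    """Placeholder for detecting a scanner TR pulse (e.g. via a TTL/serial line)."""
    return random.random() < 0.05

def render_virtual_hand(joint_angles):
    """Placeholder for driving the virtual-hand display from the recorded angles."""
    pass

def record_session(duration_s=10.0, sample_hz=60.0):
    """Log glove samples and scanner triggers against one shared clock so the
    joint-angle traces can later be aligned with the fMRI time series."""
    log = {"glove": [], "triggers": []}
    t0 = time.monotonic()
    period = 1.0 / sample_hz
    while (now := time.monotonic() - t0) < duration_s:
        angles = read_glove_joint_angles()
        log["glove"].append((now, angles))
        render_virtual_hand(angles)          # visual feedback shown to the subject
        if scanner_trigger_received():
            log["triggers"].append(now)      # marks acquisition of one fMRI volume
        time.sleep(period)
    return log

if __name__ == "__main__":
    session = record_session(duration_s=2.0)
    print(f"{len(session['glove'])} glove samples, {len(session['triggers'])} TR triggers")
```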

Figures

Figure 1
a. Hand & Arm Training System using a CyberGlove and Haptic Master interface that provides the user with a realistic haptic sensation that closely simulates the weight and force found in upper extremity tasks. b. Hand & Arm Training System using a CyberGlove, a CyberGrasp and Flock of Birds electromagnetic trackers. c. Close view of the haptic interface in a bimanual task.
Figure 2
a. The piano trainer consists of a complete virtual piano that plays the appropriate notes as they are pressed by the virtual fingers. b. Placing Cups displays a three-dimensional room with a haptically rendered table and shelves. c. Reach/Touch is performed in the context of aiming/reaching movements in a normal, functional workspace. d. The Hammer Task trains a combination of three-dimensional reaching and repetitive finger flexion and extension; targets are presented in a scalable 3D workspace. e. Catching Falling Objects enhances movement of the paretic arm by coupling its motion with that of the less impaired arm. f. Humming Bird Hunt depicts a hummingbird as it moves through an environment filled with trees, flowers, and a river. g. The full screen displays a three-dimensional room containing three shelves and a table.
Figure 3
Trajectories of a representative subject performing single repetitions of the cup-reaching simulation. a. The dashed line represents the subject's performance without any haptic effects on Day 1 of training. The solid line represents the subject's performance with the trajectory stabilized by the damping effect and with the work against gravity decreased by the robot. Also note the collision with the haptically rendered shelf during this trial. b. The same subject's trajectory while performing the cup-placing task without haptic assistance following 9 days of training. Note the coordinated, up-and-over trajectory, consistent with normal performance of a real-world placing task.
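As an illustration of the haptic effects mentioned in this caption, the sketch below combines a viscous damping term with partial anti-gravity support; the gains and the arm-weight figure are assumptions chosen for illustration, not the values used in the study's controller.

```python
# Illustrative damping plus partial gravity-compensation force (assumed gains).
import numpy as np

def assistive_force(velocity, arm_weight_n, damping=15.0, gravity_support=0.6):
    """velocity: 3D hand velocity (m/s); arm_weight_n: weight of the arm (N).
    Returns a 3D force (N): viscous damping opposes motion, smoothing the path,
    while a fraction of the arm's weight is carried against gravity (+z is up)."""
    damping_force = -damping * np.asarray(velocity, dtype=float)
    gravity_comp = np.array([0.0, 0.0, gravity_support * arm_weight_n])
    return damping_force + gravity_comp

# Example: hand moving forward and up at modest speed, 20 N arm weight, 60% support.
print(assistive_force([0.2, 0.0, 0.1], arm_weight_n=20.0))
```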
Figure 4
The interaction between the subject and the robot is coordinated by on-line assistance algorithms. Figure 4a depicts the performance of a repetition of Reach and Touch. The dashed line plots hand velocity over time. As the subject moves toward the target, the assistive force, depicted by the solid line, stays at zero unless the subject fails to reach the target within a predefined time window. As the subject's progress toward the target slows, the assistive force increases until progress resumes, and then starts to decrease after velocity exceeds a predefined threshold value. Figures 4b and 4c describe two repetitions of the bilateral Catching Falling Objects simulation. Performance on Day 1 (b) requires Assistive Force from the robot (solid line) when the subject is unable to overcome gravity and move the arm toward the target (Active Force, dashed lines, dips below zero). Figure 4c shows much less assistance from the robot to perform the same task because the subject is able to exert active force throughout the task.
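The assistance rule described here can be sketched as a simple update loop; the time window, speed threshold, ramp rates, and force cap below are illustrative assumptions rather than the published controller parameters.

```python
# Minimal sketch of an on-line assistance rule of the kind described for Figure 4:
# no help inside the allowed time window, ramp assistance up while progress toward
# the target is slow, and ramp it back down once hand speed exceeds a threshold.

def update_assistive_force(force, speed, elapsed_s,
                           time_window_s=2.0, speed_threshold=0.05,
                           ramp_up=0.5, ramp_down=0.8, max_force=10.0):
    """force: current assistive force (N); speed: hand speed toward target (m/s);
    elapsed_s: time since movement onset (s). Returns the updated assistive force."""
    if elapsed_s < time_window_s and force == 0.0:
        return 0.0                               # still within the predefined time window
    if speed < speed_threshold:
        force = min(force + ramp_up, max_force)  # progress has slowed: increase help
    else:
        force = max(force - ramp_down, 0.0)      # progress resumed: withdraw help
    return force

# Example: a stalled movement 3 s after onset gradually accumulates assistance.
f = 0.0
for step in range(5):
    f = update_assistive_force(f, speed=0.01, elapsed_s=3.0 + 0.1 * step)
    print(round(f, 2))
```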
Figure 5
Left panel: the subject's view during the fMRI experiment (top); the real hand in a 5DT glove is shown below. Movement of the virtual hand can be generated as an exact representation of the real hand, or can be distorted to study action-observation interaction inside a virtual environment. Right panel: observing finger sequences with the intention to imitate afterwards. Significant BOLD activity (p < .001) is rendered on an inflated cortical surface template. Arrows show activation in the dorsal premotor cortex, BA 5, the rostral portion of the IPS, the supramarginal gyrus, and the (pre)supplementary motor area, likely associated with planning sequential finger movements.
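A hedged sketch of the veridical-versus-distorted feedback described here: the recorded joint angles can either drive the corresponding (right) virtual hand directly, or be re-mapped to the mirrored (left) hand or scaled, as in the 'mirror' viewing condition of Figure 6. The function name and data layout below are assumptions for illustration, not part of the authors' software.

```python
# Illustrative mapping from recorded glove angles to the displayed virtual hand.

def map_to_virtual_hand(joint_angles, mirror=False, gain=1.0):
    """joint_angles: per-joint flexion angles (degrees) from the instrumented glove.
    mirror=True re-labels the data for the opposite (left) virtual hand;
    gain scales the displayed finger excursion relative to the real movement."""
    displayed = [gain * a for a in joint_angles]
    hand_label = "left" if mirror else "right"
    return {"hand": hand_label, "angles": displayed}

# Veridical feedback of the moving right hand vs. mirrored feedback as the left hand.
sample = [10.0, 35.0, 20.0, 45.0]
print(map_to_virtual_hand(sample))                 # exact representation
print(map_to_virtual_hand(sample, mirror=True))    # 'mirror' viewing condition
```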
Figure 6
A representative healthy subject (left panel) and a chronic stroke patient (right panel) performed a finger sequence with the RIGHT hand. The inset in the right panel shows the lesion location in the stroke patient (see also Merians et al. 2006). For each subject, the panels show the activations that were significantly greater when viewing the corresponding finger motion of the LEFT rather than the RIGHT virtual hand (i.e., activation related to 'mirror' viewing). Note that viewing the LEFT virtual hand led to significantly greater activation of the primary motor cortex IPSILATERAL to the moving hand (i.e., contralateral to the observed virtual hand) (see arrow). Significant BOLD activity (p < .01) is rendered on an inflated cortical surface template using Caret software.
