Hierarchical Control of Visually-Guided Movements in a 3D-Printed Robot Arm

Front Neurorobot. 2021 Oct 29;15:755723. doi: 10.3389/fnbot.2021.755723. eCollection 2021.

Abstract

The control architecture guiding simple movements such as reaching toward a visual target remains an open problem. To achieve such a goal, the nervous system must integrate multiple sensory modalities and coordinate the many degrees of freedom of the human arm. The challenge is compounded by noise and transport delays in neural signals, non-linear and fatigable muscles as actuators, and unpredictable environmental disturbances. Here we examined the capabilities of the hierarchical feedback control models proposed by W. T. Powers, so far tested only in silico. We built a robot arm system with four degrees of freedom, including a visual system for locating the planar position of the hand, joint-angle proprioception, and pressure sensing at a single point of contact. We subjected the robot to various human-inspired reaching and tracking tasks and found features of biological movement, such as isochrony and bell-shaped velocity profiles in straight-line movements, and the speed-curvature power law in curved movements. These behavioral properties emerge without trajectory planning or explicit optimization algorithms. We then applied static structural perturbations to the robot: we blocked the wrist joint, tilted the writing surface, extended the hand with a tool, and rotated the visual system. In all cases, we found that the arm in machina adapts its behavior without being reprogrammed. In sum, while limited in speed and precision (owing to the inexpensive, do-it-yourself components from which we built the robot), when faced with the noise, delays, non-linearities, and unpredictable disturbances of the real world, the embodied control architecture shown here balances biological realism with design simplicity.

Keywords: human movement; perceptual control theory; reaching; robot arm; tracking.
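
The hierarchical feedback architecture described in the abstract, in which each level's output sets the reference signal for the level below rather than planning a trajectory, can be sketched as follows. This is a minimal illustrative simulation, not the paper's implementation: the two-level structure, gains, and point-mass plant are all assumptions chosen for clarity.

```python
# Minimal sketch of a two-level perceptual control hierarchy (PCT-style).
# A higher-level loop controls perceived position by setting a velocity
# reference for a lower-level loop, which in turn drives the actuator.
# Gains and the point-mass plant are illustrative assumptions.

class ControlLoop:
    """One proportional feedback loop: output = gain * (reference - perception)."""
    def __init__(self, gain):
        self.gain = gain

    def step(self, reference, perception):
        return self.gain * (reference - perception)

def simulate(target=1.0, steps=200, dt=0.01):
    position_loop = ControlLoop(gain=5.0)   # higher level: perceives position
    velocity_loop = ControlLoop(gain=10.0)  # lower level: perceives velocity
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        # The higher loop does not plan a trajectory; it only emits a
        # velocity reference proportional to its current position error.
        vel_ref = position_loop.step(target, pos)
        # The lower loop turns velocity error into an acceleration command.
        accel = velocity_loop.step(vel_ref, vel)
        vel += accel * dt   # Euler integration of a unit point mass
        pos += vel * dt
    return pos

print(simulate())
```

With these gains the position converges smoothly to the target; the bell-shaped velocity profile reported in the paper emerges from the cascaded error-reduction dynamics alone, with no explicit optimization.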