Although we can often infer the mental states of others by observing their actions, there are currently no computational models of this remarkable ability. Here we develop a computational model of mental state inference that builds upon a generic visuomanual feedback controller and implements mental simulation and mental state inference functions using circuitry that subserves sensorimotor control. Our goals are (1) to show that control mechanisms developed for manual manipulation are readily endowed with visual and predictive processing capabilities and thus allow a natural extension to the understanding of movements performed by others, and (2) to explain how cortical regions, in particular the parietal and premotor cortices, may be involved in such a dual mechanism. To analyze the model, we simulate tasks in which an observer watches an actor performing either a reaching or a grasping movement. The observer's goal is to estimate the 'mental state' of the actor: the goal of the reaching movement or the intention of the agent performing the grasping movement. We show that the motor modules of the observer can be used in a 'simulation mode' to infer the mental state of the actor. Simulations with different grasping and non-straight-line reaching strategies show that the mental state inference model applies to complex movements. Moreover, we simulate deceptive reaching, in which an actor imposes false beliefs about his own mental state on an observer. The simulations show that computational elements developed for sensorimotor control are effective in inferring the mental states of others. The parallels between the model and the cortical organization of movement suggest that primates may have developed a similar resource utilization strategy for action understanding, and thus lead to testable predictions about the brain mechanisms of mental state inference.
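To make the 'simulation mode' idea concrete, the sketch below illustrates one way an observer could reuse its own feedback controller to infer an actor's reach goal; it is not the paper's actual implementation. It assumes a simple proportional visuomanual feedback controller, a small discrete set of candidate goals, and illustrative values for the goal positions, controller gain, and observation noise: the observer simulates a trajectory for each candidate goal and attributes to the actor the goal whose predicted trajectory best matches the observed movement.

```python
import numpy as np

def feedback_controller(goal, start, gain=0.2, steps=50):
    """Generic visuomanual feedback controller (assumed form): at each step
    the hand moves a fixed fraction of the remaining distance to the goal."""
    pos = np.array(start, dtype=float)
    trajectory = [pos.copy()]
    for _ in range(steps):
        pos = pos + gain * (np.array(goal) - pos)
        trajectory.append(pos.copy())
    return np.array(trajectory)

def infer_mental_state(observed_trajectory, candidate_goals, start):
    """'Simulation mode': run the observer's own controller for each candidate
    goal and pick the goal whose predicted trajectory best matches the
    observed one (smallest mean squared prediction error)."""
    errors = []
    for goal in candidate_goals:
        predicted = feedback_controller(
            goal, start, steps=len(observed_trajectory) - 1)
        errors.append(np.mean(np.sum((predicted - observed_trajectory) ** 2,
                                     axis=1)))
    return int(np.argmin(errors)), errors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    goals = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
    start = np.array([0.0, 0.0])

    # The actor reaches toward goal index 2; the observer sees a noisy trajectory.
    actor_trajectory = feedback_controller(goals[2], start)
    observed = actor_trajectory + rng.normal(scale=0.02,
                                             size=actor_trajectory.shape)

    best, errs = infer_mental_state(observed, goals, start)
    print(f"Inferred goal index: {best}, prediction errors: {np.round(errs, 4)}")
```

In this toy setting the same controller serves both roles described above: generating movement when given a goal, and, replayed in simulation against an observed movement, estimating which goal the actor is pursuing.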