Our ability to interact with the immediate surroundings depends not only on an adequate representation of external space but also on our ability to represent the location of objects with respect to our own body, and especially to our hands. Indeed, electrophysiological studies in monkeys have revealed multimodal neurons with spatially corresponding tactile and visual receptive fields in a number of brain areas, suggesting a representation of visual peripersonal space with respect to the body. In this functional magnetic resonance imaging study, we localized areas in the human intraparietal sulcus (IPS) and lateral occipital complex (LOC) that represent nearby visual space with respect to the hands (perihand space), by contrasting the response to a ball moving near to versus far from the hands. Furthermore, by independently manipulating sensory information about the hand in the visual (using a dummy hand) and proprioceptive domains (by changing the unseen hand position), we determined the sensory contributions to the representation of hand-centered space. In the posterior IPS, the visual contribution was dominant, overriding proprioceptive information. Surprisingly, regions within LOC also displayed visually dominant, hand-related activation. In contrast, the anterior IPS was characterized by a proprioceptive representation of the hand, as well as showing tactile hand-specific activation, suggesting a homology with monkey parietal hand-centered areas. We therefore suggest that, whereas cortical regions within the posterior IPS and LOC represent hand-centered space in a predominantly visual manner, the anterior IPS uses multisensory information in representing perihand space.