In this Perspective, we evaluate current progress in understanding how the brain encodes our sense of direction, within the context of parallel work on how early vestibular pathways encode self-motion. In particular, we discuss how these systems work together and provide evidence that they share common mechanisms. We first consider the classic view of head direction cells and the results of recent experiments in rodents and primates indicating that inputs to these neurons carry multimodal information during self-motion, including proprioceptive, motor efference copy, and gaze-related signals. We then consider a paradox: although the head-direction network is generally assumed to generate a fixed representation of perceived directional heading, this computation must be dynamically updated whenever the relationship between voluntary motor commands and their sensory consequences changes. Such situations include navigation in virtual reality and under head-restrained conditions, in which the natural relationship between visual and extravisual cues is altered.