Humans are exquisitely sensitive to changes in relative position. A fundamental and long-standing question is how information for position acuity is integrated along the length of the target, and why visual performance deteriorates when feature separation increases. To address this question, we used a target made of discrete samples, each subjected to binary positional noise, combined with reverse correlation to estimate the behavioral "receptive field" (template), and a novel 10-pass method to quantify the internal noise that limits position acuity. Our results show that human observers weigh individual parts of the stimulus differently and, importantly, that the shape of the template changes markedly with feature separation. Compared to an ideal observer, human performance is limited by a template that becomes less efficient as feature separation increases and by an increase in random internal noise. Although systematic internal noise is thought to be one of the important components limiting detection thresholds, we found that systematic noise is negligible in our position task.
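The reverse-correlation logic described above can be illustrated with a minimal simulation. The sketch below is not the authors' procedure; it assumes a hypothetical linear observer who weights a few discrete target elements (each perturbed by binary positional noise) and whose decision is corrupted by additive internal noise. The classification image, computed as the difference between the mean noise fields sorted by response, then recovers an estimate of the observer's template. All numbers (element count, trial count, noise level, template shape) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_elements = 9       # assumed number of discrete samples along the target
n_trials = 20000     # assumed number of simulated trials

# Assumed "true" template: a Gaussian weighting profile over the elements
true_template = np.exp(-0.5 * ((np.arange(n_elements) - 4) / 2.0) ** 2)

# Binary positional noise: each element is jittered by +1 or -1 unit per trial
noise = rng.choice([-1.0, 1.0], size=(n_trials, n_elements))

# Linear observer: decision variable = template-weighted noise + random internal noise
decision = noise @ true_template + rng.normal(0.0, 1.0, n_trials)
responses = decision > 0.0

# Classification image: mean noise on "yes" trials minus mean noise on "no" trials
template_est = noise[responses].mean(axis=0) - noise[~responses].mean(axis=0)
template_est /= np.linalg.norm(template_est)

# The recovered profile should closely track the assumed template
r = np.corrcoef(template_est, true_template)[0, 1]
print(f"correlation with true template: {r:.3f}")
```

With enough trials the estimated weighting profile converges on the observer's template; running each unique noise field multiple times (as in a multi-pass design) additionally allows response consistency, and hence internal noise, to be quantified.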