There is general agreement that both nonverbal animals and language-endowed humans possess an evolutionary precursor system for representing and comparing numerical values. However, whether nonverbal numerical representations in human and nonhuman primates are quantitatively similar, and whether linear or logarithmic coding underlies such magnitude judgments in both species, remain elusive. To resolve these issues, we tested the numerical discrimination performance of human subjects and two rhesus monkeys (Macaca mulatta) in an identical delayed match-to-numerosity task over a broad range of numerosities, from 1 to 30. The results demonstrate a noisy nonverbal estimation system obeying Weber's Law in both species. With average Weber fractions between 0.51 and 0.60, nonverbal numerosity discriminations in humans and monkeys showed similar precision. Moreover, detailed analysis of the performance distributions revealed nonlinearly compressed numerosity representations in both primate species, although the difference between linear and logarithmic scaling was less pronounced in humans. This may indicate a gradual shift from a logarithmic to a linear magnitude scale in human adults, resulting from a cultural transformation process during mathematical education.
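Weber's Law states that the just-noticeable difference between two magnitudes grows in proportion to their size, so discriminability depends only on their ratio. A minimal sketch of how such scalar variability predicts ratio-dependent discrimination (the Weber fraction w = 0.55 below is a hypothetical value chosen from within the reported 0.51–0.60 range, not a parameter from the study):

```python
import math

def dprime(n1, n2, w=0.55):
    """Discriminability index for two numerosities under scalar variability:
    each numerosity n is internally represented with Gaussian noise whose
    standard deviation grows linearly with n (SD = w * n)."""
    return abs(n2 - n1) / math.sqrt((w * n1) ** 2 + (w * n2) ** 2)

# Ratio dependence (Weber's Law): pairs with the same ratio are
# equally discriminable, regardless of absolute magnitude.
assert abs(dprime(2, 4) - dprime(10, 20)) < 1e-9

# Closer ratios are harder to discriminate than more distant ones.
assert dprime(10, 15) < dprime(10, 20)
```

At this behavioral level, a linear scale with magnitude-proportional noise and a logarithmic scale with constant noise make very similar predictions, which is why distinguishing the two coding schemes requires the detailed analysis of performance distributions described above.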