PLoS Comput Biol. 2015 Dec 8;11(12):e1004649. doi: 10.1371/journal.pcbi.1004649. eCollection 2015 Dec.

Biases in Visual, Auditory, and Audiovisual Perception of Space


Brian Odegaard et al. PLoS Comput Biol.

Abstract

Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration not only improves the precision of perceptual estimates, but also the accuracy.
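The Bayesian Causal Inference framework mentioned in the abstract (introduced by Körding et al., 2007) infers whether a visual and an auditory signal share a common cause, and weights the fused versus independent estimates by that posterior probability. A minimal sketch of the standard computation is below; the parameter values (sensory noise, prior width, prior probability of a common cause) are illustrative assumptions, not the values fitted in this study:

```python
import math

def causal_inference(x_v, x_a, sigma_v=2.0, sigma_a=8.0,
                     sigma_p=20.0, p_common=0.5):
    """Return (posterior prob. of a common cause, visual location estimate).

    x_v, x_a: noisy visual and auditory measurements (degrees).
    sigma_v, sigma_a: sensory noise; sigma_p: width of a spatial prior
    centered at 0 deg. All values are hypothetical, for illustration only.
    """
    vv, va, vp = sigma_v**2, sigma_a**2, sigma_p**2

    # Marginal likelihood of both measurements given one common cause (C=1)
    d1 = vv*va + vv*vp + va*vp
    like_c1 = math.exp(-0.5 * ((x_v - x_a)**2 * vp
                               + x_v**2 * va + x_a**2 * vv) / d1) \
              / (2 * math.pi * math.sqrt(d1))

    # Marginal likelihood given two independent causes (C=2)
    like_c2 = math.exp(-0.5 * (x_v**2 / (vv + vp) + x_a**2 / (va + vp))) \
              / (2 * math.pi * math.sqrt((vv + vp) * (va + vp)))

    # Posterior probability of a common cause (Bayes' rule)
    post_c1 = p_common * like_c1 / (p_common * like_c1
                                    + (1 - p_common) * like_c2)

    # Precision-weighted optimal estimates under each causal structure
    s_c1 = (x_v/vv + x_a/va) / (1/vv + 1/va + 1/vp)   # fused
    s_v_c2 = (x_v/vv) / (1/vv + 1/vp)                  # vision alone

    # Model averaging: blend the two estimates by the causal posterior
    s_v = post_c1 * s_c1 + (1 - post_c1) * s_v_c2
    return post_c1, s_v
```

Because vision is typically far more precise than audition (smaller `sigma_v`), the fused estimate `s_c1` sits close to the visual measurement, which is consistent with the visual dominance the abstract reports on integrated trials.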


Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Possible underlying mechanisms for the biases observed in subjects’ localization responses.
Zero on the x-axis represents the center. In panels (B) and (D), the likelihood and posterior functions are overlapping and are shown in blue. In the visual domain, (A) a prior distribution located at the center of visual space could draw localization estimates towards the center, or (B) the likelihood distributions themselves could be biased, drawing sensory estimates towards a central location. In the auditory domain, (C) a prior bias for the periphery could push perception of peripheral target estimates further away from the center of space, or (D) the likelihood distributions themselves may be biased away from the center. Various combinations of these computational mechanisms were tested in the different proposed models (see Model Comparisons section).
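The two visual mechanisms contrasted in panels (A) and (B) can both produce a central bias, as a toy calculation shows. In mechanism (A), an unbiased measurement combined with a Gaussian prior at 0 degrees yields a posterior mean shrunk toward the center; in mechanism (B), the likelihood itself is centered short of the true location. All numbers below are hypothetical illustrations, not fitted parameters from the study:

```python
# Toy illustration of the two central-bias mechanisms in Fig 1 (A vs B).
# All parameter values are hypothetical, chosen only for demonstration.
true_loc = 13.0          # peripheral stimulus, degrees from center
sigma_like = 3.0         # sensory (likelihood) noise
sigma_prior = 12.0       # width of a spatial prior centered at 0 deg

# (A) Unbiased likelihood + central prior: the posterior mean is a
# precision-weighted average of the measurement and the prior mean (0),
# so the estimate is shrunk toward the center.
w = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)
est_prior = w * true_loc          # less than 13, pulled toward 0

# (B) Biased likelihood + flat prior: the likelihood itself peaks
# short of the true location, so no prior is needed to produce bias.
likelihood_shift = -1.5           # hypothetical compressive shift
est_likelihood = true_loc + likelihood_shift
```

Both mechanisms predict estimates short of 13 degrees, which is why distinguishing them requires fitting the full models to the response distributions rather than inspecting mean bias alone.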
Fig 2
Fig 2. The spatial localization paradigm.
Stimuli could be presented from one of five locations, ranging from -13 to +13 degrees.
Fig 3
Fig 3. Biases present in localizations on unisensory trials.
A) Average biases across subjects. B) The distribution of subjects’ individual biases (based on each subject’s mean across 15 trials) for each of the five locations, from -13 degrees in the leftmost column, to +13 degrees in the rightmost column. Visual biases are shown in blue, and auditory biases are shown in red. C) Positive numbers indicate rightward biases, and negative numbers indicate leftward biases. SEM bars (computed over all subjects) are shown around the mean bias in each modality for each location.
Fig 4
Fig 4. Visual and auditory biases on spatially congruent bisensory trials.
First, each subject’s average bias (over 15 trials) was computed for each of the five locations. Then, the mean across 384 subjects was calculated. Error bars represent standard error of the mean across subjects’ averages.
Fig 5
Fig 5. Bisensory trials, classified by inferring one common cause, or two independent causes.
Panel A shows average biases and panel B displays the distribution of subjects’ individual biases for each location. In Panel C, positive numbers indicate rightward biases, and negative numbers indicate leftward biases. SEM bars are shown around the mean bias in each modality for each location.
Fig 6
Fig 6. Standard deviations for all five stimulus positions.
(A) Unisensory visual (blue) and unisensory auditory (red) localizations, and (B) bisensory visual (blue) and bisensory auditory (red) localizations are shown for each modality.
Fig 7
Fig 7. Auditory-alone condition model fits for one randomly selected subject.
Plotted across the columns are model fits for each of the five auditory locations, from -13 to +13. The dotted light blue line shows the true stimulus location, the subject’s data are plotted in the shaded light blue regions, and the solid dark blue line shows the model’s fit to the data. As shown by the red arrows, simple models that assume unbiased sensory representations make considerable errors in estimating subjects’ responses, as simulated response distributions are centered near the true stimulus locations. By allowing the sensory representations to vary, biases can be more fully accounted for, as is shown in the fits for the 8-parameter model in the second row.

Grants and funding

LS was supported by National Science Foundation grant 1057969. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.