Neural tracking of the musical beat is enhanced by low-frequency sounds
- PMID: 30037989
- PMCID: PMC6094140
- DOI: 10.1073/pnas.1801421115
Abstract
Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
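The frequency-tagging logic described in the abstract, comparing spectral amplitude at the frequency of the perceived beat against neighboring frequencies in the EEG spectrum, can be sketched in a few lines. This is a minimal illustration, not the authors' analysis pipeline; the function name, bin-selection parameters, and noise-subtraction defaults are assumptions for illustration only.

```python
import numpy as np

def beat_tagging_amplitude(eeg, fs, beat_hz, n_neighbors=4, gap=1):
    """Noise-subtracted spectral amplitude at the beat frequency.

    Computes the amplitude spectrum of a single EEG channel and subtracts
    the mean amplitude of nearby frequency bins (skipping `gap` bins on
    each side of the target) from the amplitude at the bin closest to
    `beat_hz`. Parameter defaults are illustrative, not the authors'.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)    # amplitude spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    target = int(np.argmin(np.abs(freqs - beat_hz)))  # bin closest to the beat
    lo = spectrum[target - gap - n_neighbors : target - gap]
    hi = spectrum[target + gap + 1 : target + gap + 1 + n_neighbors]
    noise = np.mean(np.concatenate([lo, hi]))         # local noise estimate
    return spectrum[target] - noise
```

A signal that selectively locks to the beat yields a clearly positive value at the beat frequency after this subtraction, which is the sense in which the study reports "selectively enhanced" activity at the beat frequency for bass-conveyed rhythms.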
Keywords: EEG; frequency tagging; low-frequency sound; rhythm; sensory-motor synchronization.
Copyright © 2018 the Author(s). Published by PNAS.
Conflict of interest statement
The authors declare no conflict of interest.
Comment in
- Tagging the musical beat: Neural entrainment or event-related potentials? Proc Natl Acad Sci U S A. 2018 Nov 20;115(47):E11002-E11003. doi: 10.1073/pnas.1815311115. PMID: 30425178.
- Reply to Novembre and Iannetti: Conceptual and methodological issues. Proc Natl Acad Sci U S A. 2018 Nov 20;115(47):E11004. doi: 10.1073/pnas.1815750115. PMID: 30425177.
- Frequency tagging cannot measure neural tracking of beat or meter. Proc Natl Acad Sci U S A. 2019 Feb 19;116(8):2779-2780. doi: 10.1073/pnas.1820020116. PMID: 30696762.
- Reply to Rajendran and Schnupp: Frequency tagging is sensitive to the temporal structure of signals. Proc Natl Acad Sci U S A. 2019 Feb 19;116(8):2781-2782. doi: 10.1073/pnas.1820941116. PMID: 30696761.
Similar articles
- Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci. 2023 Sep;26(5):e13353. doi: 10.1111/desc.13353. PMID: 36415027.
- EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception. Brain Topogr. 2018 Mar;31(2):153-160. doi: 10.1007/s10548-017-0605-8. PMID: 29127530. Review.
- Intracerebral evidence of rhythm transform in the human auditory cortex. Brain Struct Funct. 2017 Jul;222(5):2389-2404. doi: 10.1007/s00429-016-1348-0. PMID: 27990557.
- What can we learn about beat perception by comparing brain signals and stimulus envelopes? PLoS One. 2017 Feb 22;12(2):e0172454. doi: 10.1371/journal.pone.0172454. PMID: 28225796.
- Exploring how musical rhythm entrains brain activity with electroencephalogram frequency-tagging. Philos Trans R Soc Lond B Biol Sci. 2014 Dec 19;369(1658):20130393. doi: 10.1098/rstb.2013.0393. PMID: 25385771. Review.
Cited by
- Tagging the musical beat: Neural entrainment or event-related potentials? Proc Natl Acad Sci U S A. 2018 Nov 20;115(47):E11002-E11003. doi: 10.1073/pnas.1815311115. PMID: 30425178.
- Pupil drift rate indexes groove ratings. Sci Rep. 2022 Jul 8;12(1):11620. doi: 10.1038/s41598-022-15763-w. PMID: 35804069.
- Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation. Exp Brain Res. 2023 Mar;241(3):875-887. doi: 10.1007/s00221-023-06569-x. PMID: 36788141.
- When Musical Accompaniment Allows the Preferred Spatio-Temporal Pattern of Movement. Sports Med Int Open. 2021 Oct 4;5(3):E81-E90. doi: 10.1055/a-1553-7063. PMID: 34646934.
- Frequency tagging cannot measure neural tracking of beat or meter. Proc Natl Acad Sci U S A. 2019 Feb 19;116(8):2779-2780. doi: 10.1073/pnas.1820020116. PMID: 30696762.
