
Results by year


Year Number of Results
1988 1
1989 1
1991 1
1992 4
1994 1
1995 5
1996 1
1997 3
1998 3
1999 2
2000 3
2001 11
2002 10
2003 29
2004 13
2005 37
2006 24
2007 22
2008 17
2009 43
2010 43
2011 47
2012 50
2013 54
2014 66
2015 52
2016 54
2017 48
2018 57
2019 50
2020 42
2021 51
2022 35
2023 11

Search Results

762 results
The effect of background liked music on acute pain perception and its neural correlates

Xuejing Lu et al. Hum Brain Mapp. 2023 Mar 29. doi: 10.1002/hbm.26293. Online ahead of print.

Abstract

Music shows tremendous promise in pain relief, especially when considering its non-pharmacological nature. However, our understanding of the precise mechanisms behind music-induced analgesia (MIA) remains poor. The positive emotional state induced by music is one of the key components explaining MIA. To test this possibility and reveal its neural correlates, the present study applied nociceptive laser stimuli to 28 healthy participants when their liked or disliked songs were played as background music, or when they were resting in silence. Differences among conditions were quantified by self-reports of pain intensity and unpleasantness, as well as brain activations in response to acute laser stimuli. As expected, liked music significantly lowered pain ratings to acute painful stimuli compared to disliked music and no music. Consistent with this observation, brain activations in response to acute painful stimuli were decreased within brain areas encoding sensory components of pain, such as the right precentral and postcentral gyri (PreCG/PoCG), brain areas related to affective components of pain, such as the anterior cingulate cortex and bilateral putamen, and brain areas associated with motor control and avoidance reactions to pain, such as the left cerebellum, when liked music was played in the background in comparison to disliked music. Importantly, the relationship between music listening and differences in pain ratings between the two music conditions was mediated by the magnitude of right PreCG/PoCG and left cerebellum activations. These findings deepen our understanding of the analgesic benefits of background liked music, a property relevant to clinical applications.

Keywords: emotional modulation; fMRI; liked background music; music-induced analgesia.

Preliminary evidence for selective cortical responses to music in one-month-old infants

Heather L Kosakowski et al. Dev Sci. 2023 Mar 23;e13387. doi: 10.1111/desc.13387. Online ahead of print.

Abstract

Prior studies have observed selective neural responses in the adult human auditory cortex to music and speech that cannot be explained by the differing lower-level acoustic properties of these stimuli. Does infant cortex exhibit similarly selective responses to music and speech shortly after birth? To answer this question, we attempted to collect functional magnetic resonance imaging (fMRI) data from 45 sleeping infants (2.0 to 11.9 weeks old) while they listened to monophonic instrumental lullabies and infant-directed speech produced by a mother. To match acoustic variation between music and speech sounds we (1) recorded music from instruments that had a similar spectral range as female infant-directed speech, (2) used a novel excitation-matching algorithm to match the cochleagrams of music and speech stimuli, and (3) synthesized "model-matched" stimuli that were matched in spectrotemporal modulation statistics to (yet perceptually distinct from) music or speech. Of the 36 infants we collected usable data from, 19 had significant activations to sounds overall compared to scanner noise. From these infants, we observed a set of voxels in non-primary auditory cortex (NPAC) but not in Heschl's gyrus that responded significantly more to music than to each of the other three stimulus types (but not significantly more strongly than to the background scanner noise). In contrast, our planned analyses did not reveal voxels in NPAC that responded more to speech than to model-matched speech, although other unplanned analyses did. These preliminary findings suggest that music selectivity arises within the first month of life.

Research highlights: Responses to music, speech, and control sounds matched for the spectrotemporal modulation statistics of each sound were measured from 2- to 11-week-old sleeping infants using fMRI. Auditory cortex was significantly activated by these stimuli in 19 out of 36 sleeping infants. Selective responses to music compared to the three other stimulus classes were found in non-primary auditory cortex but not in nearby Heschl's gyrus. Selective responses to speech were not observed in planned analyses but were observed in unplanned, exploratory analyses.

Keywords: auditory cortex; fMRI; infants; music; speech.

Brain networks for temporal adaptation, anticipation, and sensory-motor integration in rhythmic human behavior

Bronson B Harry et al. Neuropsychologia. 2023 May 3;183:108524. doi: 10.1016/j.neuropsychologia.2023.108524. Epub 2023 Mar 1. Free article.

Abstract

Human interaction often requires the precise yet flexible interpersonal coordination of rhythmic behavior, as in group music making. The present fMRI study investigates the functional brain networks that may facilitate such behavior by enabling temporal adaptation (error correction), prediction, and the monitoring and integration of information about 'self' and the external environment. Participants were required to synchronize finger taps with computer-controlled auditory sequences that were presented either at a globally steady tempo with local adaptations to the participants' tap timing (Virtual Partner task) or with gradual tempo accelerations and decelerations but without adaptation (Tempo Change task). Connectome-based predictive modelling was used to examine patterns of brain functional connectivity related to individual differences in behavioral performance and parameter estimates from the adaptation and anticipation model (ADAM) of sensorimotor synchronization for these two tasks under conditions of varying cognitive load. Results revealed distinct but overlapping brain networks associated with ADAM-derived estimates of temporal adaptation, anticipation, and the integration of self-controlled and externally controlled processes across task conditions. The partial overlap between ADAM networks suggests common hub regions that modulate functional connectivity within and between the brain's resting-state networks and additional sensory-motor regions and subcortical structures in a manner reflecting coordination skill. Such network reconfiguration might facilitate sensorimotor synchronization by enabling shifts in focus on internal and external information, and, in social contexts requiring interpersonal coordination, variations in the degree of simultaneous integration and segregation of these information sources in internal models that support self, other, and joint action planning and prediction.
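Connectome-based predictive modelling, as used above, follows a standard recipe: correlate every connectivity edge with behaviour across training participants, keep the supra-threshold edges, sum their strengths per participant, and fit a linear model. The sketch below illustrates only that generic recipe, not this paper's exact pipeline; the function name, the `r_thresh` value, and the positive-edges-only choice are illustrative assumptions.

```python
import numpy as np

def cpm_predict(conn_train, behav_train, conn_test, r_thresh=0.2):
    """Connectome-based predictive modelling, minimal sketch.

    conn_*  : (n_subjects, n_edges) functional connectivity values
    behav_* : (n_subjects,) behavioural scores
    Returns predicted behaviour for the held-out subjects.
    """
    # 1. Pearson r between each edge and behaviour, across training subjects
    cz = conn_train - conn_train.mean(0)
    bz = behav_train - behav_train.mean()
    r = (cz * bz[:, None]).sum(0) / (
        np.sqrt((cz ** 2).sum(0)) * np.sqrt((bz ** 2).sum()))
    # 2. Keep positively predictive edges and sum their strengths
    mask = r > r_thresh
    strength_train = conn_train[:, mask].sum(1)
    # 3. Fit behaviour ~ summed network strength on the training set
    slope, intercept = np.polyfit(strength_train, behav_train, 1)
    # 4. Apply the linear model to held-out subjects
    return slope * conn_test[:, mask].sum(1) + intercept
```

In practice the threshold is usually applied to a p-value rather than raw r, and the procedure is run inside a leave-one-out loop; both refinements are omitted here for brevity.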

Keywords: Error correction; Functional connectivity; Sensorimotor synchronization; Sensory-motor integration; Temporal prediction; fMRI.

Neural mechanisms of musical structure and tonality, and the effect of musicianship

Lei Jiang et al. Front Psychol. 2023 Feb 9;14:1092051. doi: 10.3389/fpsyg.2023.1092051. eCollection 2023. Free PMC article.

Abstract

Introduction: The neural basis for the processing of musical syntax has previously been examined almost exclusively in classical tonal music, which is characterized by a strictly organized hierarchical structure. Musical syntax may differ across music genres owing to differences in tonality.

Methods: The present study investigated the neural mechanisms for processing musical syntax across genres varying in tonality - classical, impressionist, and atonal music - and, in addition, examined how musicianship modulates such processing.

Results: Results showed that, first, the dorsal stream, including the bilateral inferior frontal gyrus and superior temporal gyrus, plays a key role in the perception of tonality. Second, right frontotemporal regions were crucial in allowing musicians to outperform non-musicians in musical syntactic processing; musicians also benefited from a cortical-subcortical network including pallidum and cerebellum, suggesting more auditory-motor interaction in musicians than in non-musicians. Third, left pars triangularis carries out online computations independently of tonality and musicianship, whereas right pars triangularis is sensitive to tonality and partly dependent on musicianship. Finally, unlike tonal music, the processing of atonal music could not be differentiated from that of scrambled notes, both behaviorally and neurally, even among musicians.

Discussion: The present study highlights the importance of studying varying music genres and experience levels and provides a better understanding of musical syntax and tonality processing and how such processing is modulated by music experience.

Keywords: fMRI; hierarchical structure; informational connectivity; music; syntax; tonality.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Neural decoding of music from the EEG

Ian Daly. Sci Rep. 2023 Jan 12;13(1):624. doi: 10.1038/s41598-022-27361-x. Free PMC article.

Abstract

Neural decoding models can be used to decode neural representations of visual, acoustic, or semantic information. Recent studies have demonstrated neural decoders that are able to decode acoustic information from a variety of neural signal types including electrocorticography (ECoG) and the electroencephalogram (EEG). In this study we explore how functional magnetic resonance imaging (fMRI) can be combined with EEG to develop an acoustic decoder. Specifically, we first used a joint EEG-fMRI paradigm to record brain activity while participants listened to music. We then used fMRI-informed EEG source localisation and a bi-directional long short-term memory (LSTM) deep learning network to first extract neural information from the EEG related to music listening and then to decode and reconstruct the individual pieces of music an individual was listening to. We further validated our decoding model by evaluating its performance on a separate dataset of EEG-only recordings. We were able to reconstruct music, via our fMRI-informed EEG source analysis approach, with a mean rank accuracy of 71.8% ([Formula: see text], [Formula: see text]). Using only EEG data, without participant-specific fMRI-informed source analysis, we were able to identify the music a participant was listening to with a mean rank accuracy of 59.2% ([Formula: see text], [Formula: see text]). This demonstrates that our decoding model may use fMRI-informed source analysis to aid EEG-based decoding and reconstruction of acoustic information from brain activity, and takes a step towards building EEG-based neural decoders for other complex information domains such as other acoustic, visual, or semantic information.
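The rank-accuracy metric reported above is not defined in the abstract; a common formulation scores each reconstruction by where the true stimulus ranks among all candidate stimuli, so that 50% is chance level. The sketch below shows that common formulation only; the function name and the use of Pearson correlation as the similarity measure are assumptions, not necessarily the paper's choices.

```python
import numpy as np

def rank_accuracy(reconstructed, candidates, true_idx):
    """Score one reconstruction: 1.0 if the true stimulus is the best
    match among all candidates, 0.0 if it is the worst match."""
    # Similarity of the reconstruction to every candidate stimulus
    sims = np.array([np.corrcoef(reconstructed, c)[0, 1] for c in candidates])
    # Position of the true stimulus when candidates are sorted best-first
    rank = int(np.argsort(-sims).tolist().index(true_idx))
    return 1.0 - rank / (len(candidates) - 1)
```

Averaging this score over all trials yields a mean rank accuracy comparable to the 71.8% and 59.2% figures quoted in the abstract.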

Conflict of interest statement

The authors declare no competing interests.

The effect of playing music and mother's voice to children on sedation level and requirement during pediatric magnetic resonance imaging

Özlem Öz Gergin et al. Explore (NY). 2023 Jan 7;S1550-8307(23)00001-0. doi: 10.1016/j.explore.2023.01.001. Online ahead of print.

Abstract

Background: Magnetic resonance imaging examinations frequently cause anxiety and fear in children. The objective of this study was to investigate the effects of listening to music sound, the mother's voice, and sound isolation on the depth of sedation and need for sedatives in pediatric patients who would undergo MRI.

Methods: Ninety pediatric patients aged 3 to 12 years who were scheduled for imaging in the MRI unit were randomly assigned to an isolation group (Group I), a musical sound group (Group II), or a mother's voice group (Group III). We evaluated patients' anxiety and sedation levels via the Observer's Assessment of Alertness/Sedation (OAA/S) scale.

Results: Heart rate, oxygen saturation, OAA/S, and Ramsay scores during the procedure were not significantly different among the groups (p>0.05). The mean amount of propofol and total propofol consumption were statistically lower in the mother's voice group than in the isolation and musical sound groups (p<0.001). Mean propofol amount and total propofol consumption were not significantly different between the isolation and musical sound groups (p>0.05). No difference was found between the groups regarding the time it took for the patients' Modified Aldrete score to reach 9 (p>0.05).

Conclusions: In pediatric patients, listening to the mother's voice during MRI decreased total sedative consumption without increasing the depth of sedation.

Keywords: Complementary therapies; Magnetic resonance imaging; Maternal voice; Music; Sedation.

Neuroimaging evidence for the direct role of auditory scene analysis in object perception

Gennadiy Gurariy et al. Cereb Cortex. 2022 Dec 23;bhac501. doi: 10.1093/cercor/bhac501. Online ahead of print.

Abstract

Auditory Scene Analysis (ASA) refers to the grouping of acoustic signals into auditory objects. Previously, we have shown that perceived musicality of auditory sequences varies with high-level organizational features. Here, we explore the neural mechanisms mediating ASA and auditory object perception. Participants performed musicality judgments on randomly generated pure-tone sequences and manipulated versions of each sequence containing low-level changes (amplitude; timbre). Low-level manipulations affected auditory object perception as evidenced by changes in musicality ratings. fMRI was used to measure neural activation to sequences rated most and least musical, and the altered versions of each sequence. Next, we generated two partially overlapping networks: (i) a music processing network (music localizer) and (ii) an ASA network (base sequences vs. ASA manipulated sequences). Using Representational Similarity Analysis, we correlated the functional profiles of each ROI to a model generated from behavioral musicality ratings as well as models corresponding to low-level feature processing and music perception. Within overlapping regions, areas near primary auditory cortex correlated with low-level ASA models, whereas right IPS was correlated with musicality ratings. Shared neural mechanisms that correlate with behavior and underlie both ASA and music perception suggest that low-level features of auditory stimuli play a role in auditory object perception.
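Representational Similarity Analysis, as used above, compares the geometry of an ROI's responses against a model by correlating their representational dissimilarity matrices (RDMs). A minimal sketch of that comparison follows; the correlation-distance metric and Spearman rank correlation are conventional defaults assumed here, not confirmed choices from this paper.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rsa_score(roi_patterns, model_rdm):
    """Compare an ROI's representational geometry against a model RDM.

    roi_patterns : (n_stimuli, n_voxels) activation pattern per stimulus
    model_rdm    : condensed model dissimilarity vector (upper triangle)
    """
    # Neural RDM: correlation distance between every pair of stimuli
    neural_rdm = pdist(roi_patterns, metric="correlation")
    # Rank-correlate the two geometries (robust to monotonic rescaling)
    rho, _ = spearmanr(neural_rdm, model_rdm)
    return rho
```

A behavioral model RDM (e.g. pairwise differences in musicality ratings) plugs into `model_rdm` in exactly the same condensed upper-triangle form.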

Keywords: RSA; auditory object perception; auditory scene analysis; neuroimaging.

Speaking in gestures: Left dorsal and ventral frontotemporal brain systems underlie communication in conducting

Mariacristina Musso et al. Eur J Neurosci. 2023 Jan;57(2):324-350. doi: 10.1111/ejn.15883. Epub 2022 Dec 26.

Abstract

Conducting constitutes a well-structured system of signs anticipating information concerning the rhythm and dynamic of a musical piece. Conductors communicate the musical tempo to the orchestra, unifying the individual instrumental voices to form an expressive musical Gestalt. In a functional magnetic resonance imaging (fMRI) experiment, 12 professional conductors and 16 instrumentalists conducted real-time novel pieces with diverse complexity in orchestration and rhythm. For control, participants either listened to the stimuli or performed beat patterns, setting the time of a metronome or complex rhythms played by a drum. Activation of the left superior temporal gyrus (STG), supplementary and premotor cortex and Broca's pars opercularis (F3op) was shared in both musician groups and separated conducting from the other conditions. Compared to instrumentalists, conductors activated Broca's pars triangularis (F3tri) and the STG, which differentiated conducting from time beating and reflected the increase in complexity during conducting. In comparison to conductors, instrumentalists activated F3op and F3tri when distinguishing complex rhythm processing from simple rhythm processing. Fibre selection from a normative human connectome database, constructed using a global tractography approach, showed that the F3op and STG are connected via the arcuate fasciculus, whereas the F3tri and STG are connected via the extreme capsule. Like language, the anatomical framework characterising conducting gestures is located in the left dorsal system centred on F3op. This system reflected the sensorimotor mapping for structuring gestures to musical tempo. The ventral system centred on F3tri may reflect the conductor's art of conveying this musical tempo to the orchestra's individual voices in a global, holistic way.

Keywords: conductors; dorsal pathway; gestures; language; musicians; sensory-motor integration; ventral pathway.

A magnetoencephalography study of first-time mothers listening to infant cries

N F Hoegholt et al. Cereb Cortex. 2022 Dec 2;bhac469. doi: 10.1093/cercor/bhac469. Online ahead of print.

Abstract

Studies using magnetoencephalography (MEG) have identified the orbitofrontal cortex (OFC) to be an important early hub for a "parental instinct" in the brain. This complements the finding from functional magnetic resonance imaging studies linking reward, emotion regulation, empathy, and mentalization networks to the "parental brain." Here, we used MEG in 43 first-time mothers listening to infant and adult cry vocalizations to investigate the link with mother-infant postpartum bonding scores and their level of sleep deprivation (assessed using both actigraphy and sleep logs). When comparing brain responses to infant versus adult cry vocalizations, we found significant differences at around 800-1,000 ms after stimulus onset in the primary auditory cortex, superior temporal gyrus, hippocampal areas, insula, precuneus, supramarginal gyrus, postcentral gyrus, and posterior cingulate gyrus. Importantly, mothers with weaker bonding scores showed decreased brain responses to infant cries in the auditory cortex, middle and superior temporal gyrus, OFC, hippocampal areas, supramarginal gyrus, and inferior frontal gyrus at around 100-300 ms after stimulus onset. In contrast, we did not find correlations with sleep deprivation scores. The significant decreases in brain processing of an infant's distress signals could potentially be a novel signature of weaker infant bonding in new mothers and should be investigated in vulnerable populations.

Keywords: cry perception; mother–infant relationship; neuroimaging; parenting; sleep.

Changes in music-evoked emotion and ventral striatal functional connectivity after psilocybin therapy for depression

Melissa Shukuroglou et al. J Psychopharmacol. 2023 Jan;37(1):70-79. doi: 10.1177/02698811221125354. Epub 2022 Nov 26. Free PMC article.

Abstract

Background: Music listening is a staple and valued component of psychedelic therapy, and previous work has shown that psychedelics can acutely enhance music-evoked emotion.

Aims: The present study sought to examine subjective responses to music before and after psilocybin therapy for treatment-resistant depression, while functional magnetic resonance imaging (fMRI) data was acquired.

Methods: Nineteen patients with treatment-resistant depression received a low oral dose (10 mg) of psilocybin, and a high dose (25 mg) 1 week later. fMRI was performed 1 week prior to the first dosing session and 1 day after the second. Two scans were conducted on each day: one with music and one without. Visual analogue scale ratings of music-evoked 'pleasure' plus ratings of other evoked emotions (21-item Geneva Emotional Music Scale) were completed after each scan. Given its role in musical reward, the nucleus accumbens (NAc) was chosen as region of interest for functional connectivity (FC) analyses. Effects of drug (vs placebo) and music (vs no music) on subjective and FC outcomes were assessed. Anhedonia symptoms were assessed pre- and post-treatment (Snaith-Hamilton Pleasure Scale).

Results: Results revealed a significant increase in music-evoked emotion following treatment with psilocybin that correlated with post-treatment reductions in anhedonia. A post-treatment reduction in NAc FC with areas resembling the default mode network was observed during music listening (vs no music).

Conclusion: These results are consistent with current thinking on the role of psychedelics in enhancing music-evoked pleasure and provide some new insight into correlative brain mechanisms.
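The nucleus accumbens analysis described in the Methods is an instance of seed-based functional connectivity: correlate the seed region's mean time series with every other voxel's time series. A minimal sketch follows; the function name and the plain Pearson-correlation approach are assumptions, and real pipelines add preprocessing (motion regression, filtering) omitted here.

```python
import numpy as np

def seed_based_fc(bold, seed_mask):
    """Map functional connectivity of a seed region (e.g. the NAc).

    bold      : (n_timepoints, n_voxels) BOLD time series
    seed_mask : boolean (n_voxels,) selecting the seed's voxels
    Returns the Pearson r between the mean seed series and each voxel.
    """
    seed = bold[:, seed_mask].mean(axis=1)        # average seed time series
    bz = (bold - bold.mean(0)) / bold.std(0)      # z-score each voxel
    sz = (seed - seed.mean()) / seed.std()        # z-score the seed
    return bz.T @ sz / len(sz)                    # Pearson r per voxel
```

Contrasting the resulting r-maps (after Fisher z-transform) between conditions, e.g. music vs no music and psilocybin vs placebo, yields the kind of FC differences reported above.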

Keywords: Depression; functional connectivity; music; pleasure; psychedelic.

Conflict of interest statement

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
