Adequate behavioral responses to socially relevant stimuli are often impaired after lesions of the amygdala. Such impairments especially affect the recognition of facial and, in some cases, vocal expressions of emotion. Using low-noise functional magnetic resonance imaging (fMRI), we investigated how the amygdala, auditory cortex, and insula are involved in the processing of affective nonverbal vocalizations (laughing and crying) in healthy humans. The same samples of male and female laughing and crying were presented under different experimental conditions: simply listening to the stimuli, self-induction of the corresponding emotions while listening, and detection of artificial pitch shifts in the same stimuli. All conditions activated the amygdala bilaterally, with stronger activation in the right amygdala. The auditory cortex was more strongly activated by laughing than by crying, with a slight right-hemisphere advantage for laughing; both effects are likely attributable to acoustic stimulus features. The insula was activated bilaterally in all conditions. The mean signal intensity change with stimulation was much larger in the amygdala than in the auditory cortex and insula. The amygdala results appear consistent with the right-hemisphere hypothesis of emotion processing, which may apply less strongly at the level of the auditory cortex or insula.