CLEESE: An open-source audio-transformation toolbox for data-driven experiments in speech and music cognition

PLoS One. 2019 Apr 4;14(4):e0205943. doi: 10.1371/journal.pone.0205943. eCollection 2019.

Abstract

Over the past few years, the field of visual social cognition and face processing has been dramatically impacted by a series of data-driven studies employing computer-graphics tools to synthesize arbitrary meaningful facial expressions. In the auditory modality, reverse correlation is traditionally used to characterize sensory processing at the level of spectral or spectro-temporal stimulus properties, but not higher-level cognitive processing of e.g. words, sentences or music, for lack of tools able to manipulate the stimulus dimensions that are relevant for these processes. Here, we present an open-source audio-transformation toolbox, called CLEESE, able to systematically randomize the prosody/melody of existing speech and music recordings. CLEESE works by cutting recordings into small successive time segments (e.g. every successive 100 milliseconds in a spoken utterance), and applying a random parametric transformation to each segment's pitch, duration or amplitude, using a new Python-language implementation of the phase-vocoder digital audio technique. We present here two applications of the tool to generate stimuli for studying intonation processing of interrogative vs. declarative speech, and rhythm processing of sung melodies.
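To make the segment-wise randomization concrete, the sketch below illustrates the general idea in Python, using librosa's phase-vocoder-based pitch shifter as a stand-in for CLEESE's own implementation. The function name, the 100-ms segment duration, and the 2-semitone spread of the random shifts are illustrative assumptions, not the toolbox's actual API or default settings.

    # Minimal illustrative sketch (not the CLEESE API): shift the pitch of each
    # successive 100-ms segment of a recording by an independent random amount,
    # drawn from a Gaussian, via librosa's phase-vocoder-based pitch shifter.
    import numpy as np
    import librosa

    def randomize_pitch(y, sr, seg_dur=0.1, sd_semitones=2.0, rng=None):
        """Apply an independent random pitch shift to each seg_dur-long segment."""
        rng = np.random.default_rng() if rng is None else rng
        seg_len = int(seg_dur * sr)
        out = []
        for start in range(0, len(y), seg_len):
            seg = y[start:start + seg_len]
            shift = rng.normal(0.0, sd_semitones)   # random shift, in semitones
            out.append(librosa.effects.pitch_shift(seg, sr=sr, n_steps=shift))
        return np.concatenate(out)

    # Usage:
    # y, sr = librosa.load("utterance.wav", sr=None)
    # y_randomized = randomize_pitch(y, sr)

Note that shifting segments independently, as above, can introduce discontinuities at segment boundaries; the toolbox itself applies the per-segment transformations through its own phase-vocoder implementation, so this sketch should be read as a conceptual illustration only.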

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Cognition / physiology*
  • Female
  • Humans
  • Male
  • Music*
  • Speech / physiology*
  • Speech Perception / physiology*

Grants and funding

The funder (H2020 European Research Council) provided funding for the project to J-J.A (Award Number CREAM - 335536), but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. Author J.J.B., affiliated as an Independent Researcher, is registered as a sole-proprietorship company in France dedicated to research and development consulting. Its participation in this study was fully funded by the above-mentioned grant.