Synthesizing Reverberation Impulse Responses from Audio Signals: Auto-Reverberation and Interactive Environments
In our paper we present a new method for creating reverberation impulse-like responses from any piece of source audio. With cross-reverberation, impulse responses derived from one set of audio tracks are applied to another signal, here a live electric guitar. With auto-reverberation, segments of audio are selected and processed to form an evolving sequence of reverberation conditions that are applied back onto the original source material: as a sound file plays, it is reverberated by itself. Finally, we use these impulse-like signals with a convolution-based virtual acoustic system to create novel virtual environments: spaces made from songs and other things! Full examples follow below this short excerpt of demos from our AES talk:
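As a rough illustration of the cross-reverberation step, the sketch below excerpts a segment of a source track, imposes a decaying envelope to make it impulse-like, and convolves a dry signal with it. The segment selection, envelope shaping, and eq-smoothing used in the paper are more involved than this; the file names, segment position, and decay time here are placeholder assumptions.

```python
# Minimal cross-reverberation sketch, assuming the impulse-like response is
# formed by excerpting a segment of the source track and imposing a decaying
# envelope. The paper's actual segment selection and eq-smoothing steps are
# not reproduced here, and all file names are hypothetical.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

def impulse_like_response(source, sr, start_s=10.0, length_s=2.0, rt60_s=1.5):
    """Excerpt a segment of `source` and shape it into an impulse-like response."""
    start = int(start_s * sr)
    seg = source[start:start + int(length_s * sr)].astype(np.float64)
    t = np.arange(len(seg)) / sr
    env = 10.0 ** (-3.0 * t / rt60_s)      # -60 dB decay after rt60_s seconds
    ir = seg * env
    return ir / np.max(np.abs(ir))         # normalize peak to 1

# Hypothetical inputs: a source track to derive the response from,
# and a dry guitar take to reverberate with it.
misirlou, sr = sf.read("misirlou.wav")
guitar, _ = sf.read("dry_guitar.wav")
if misirlou.ndim > 1:                      # fold stereo to mono for simplicity
    misirlou = misirlou.mean(axis=1)
if guitar.ndim > 1:
    guitar = guitar.mean(axis=1)

ir = impulse_like_response(misirlou, sr)
wet = fftconvolve(guitar, ir)              # cross-reverberation by convolution
wet /= np.max(np.abs(wet))                 # avoid clipping on write
sf.write("guitar_cross_reverberated.wav", wet, sr)
```

Auto-reverberation follows the same pattern, except that the impulse-like responses are drawn from the same file being played back, with the response updated over time so the reverberation evolves with the material.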
Full Cross-Reverberation Examples
Live electric guitar processed by impulses synthesized from Misirlou by Dick Dale:
Full Auto-Reverberation Examples
J.S. Bach “Air on a G-String” (arranged for marimba by the Horsholm Percussion & Marimba Ensemble) played through impulses synthesized from itself:
Full Virtual Acoustic Environment Examples – Please use external speakers or headphones
Virtual Acoustic Environment Video 1: Misirlou (no eq-smoothing), Air on a G-String (no eq-smoothing), and actual room resonance:
Virtual Acoustic Environment Video 2: Misirlou (with eq-smoothing), Air on a G-String (with eq-smoothing), and actual room resonance:

Eoin F. Callery: Irish World Academy of Music and Dance, University of Limerick
Jonathan S. Abel: Center for Computer Research in Music and Acoustics, Stanford University
Kyle S. Spratt: Department of Mechanical Engineering, University of Texas at Austin