TimbreFields: 3D interactive sound models for real-time audio

Richard Corbett, Kees Van Den Doel*, John E. Lloyd, Wolfgang Heidrich

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

We describe a methodology for virtual reality designers to capture and resynthesize the variations in sound made by objects when users interact with them through contact, such as touch. The timbre of contact sounds can vary greatly, depending on both the listener's location relative to the object and the interaction point on the object itself. We believe that an accurate rendering of this variation greatly enhances the feeling of immersion in a simulation. To model the variation, we use an efficient algorithm based on modal synthesis. The model contains a vector field defined on the product space of contact locations and listening positions around the object. The modal data are sampled on this high-dimensional space using an automated measuring platform. A parameter-fitting algorithm is presented that recovers the parameters from a large set of sound recordings taken around objects and creates a continuous timbre field by interpolation. The model is subsequently rendered in a real-time simulation with integrated haptic, graphic, and audio display. We describe our experience with an implementation of this system and an informal evaluation of the results.
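The core of modal synthesis, as referenced in the abstract, is to render a contact sound as a sum of exponentially damped sinusoids, one per vibration mode. The sketch below is a minimal illustration of that idea only; the mode frequencies, dampings, and amplitudes shown are hypothetical placeholder values, not data from the paper, and in the authors' system these parameters would be recovered from recordings and interpolated over the contact-point/listener-position space (the timbre field).

```python
import numpy as np

def modal_synthesis(freqs, dampings, amps, duration=1.0, sr=44100):
    """Sum of exponentially damped sinusoids (basic modal synthesis).

    freqs    -- mode frequencies in Hz
    dampings -- per-mode exponential decay rates (1/s)
    amps     -- per-mode amplitudes
    """
    t = np.arange(int(duration * sr)) / sr
    y = np.zeros_like(t)
    for f, d, a in zip(freqs, dampings, amps):
        y += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return y

# Illustrative values only: three modes of a small struck object.
y = modal_synthesis([440.0, 1230.0, 2750.0],
                    [8.0, 15.0, 30.0],
                    [1.0, 0.5, 0.25])
```

In the paper's setting, a timbre field would map each (contact point, listener position) pair to its own set of such parameters, so that the same strike sounds different depending on where the object is hit and from where it is heard.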

Original language: English (US)
Pages (from-to): 643-654
Number of pages: 12
Journal: Presence: Teleoperators and Virtual Environments
Volume: 16
Issue number: 6
DOIs
State: Published - Dec 2007
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
