
Saliency-maximized audio visualization and efficient audio-visual browsing for faster-than-real-time human acoustic event detection

Authors :
Mark Hasegawa-Johnson
Xiaodan Zhuang
Camille Goudeseune
Thomas S. Huang
Sarah King
Kai-Hsiang Lin
Source :
ACM Transactions on Applied Perception. 10:1-16
Publication Year :
2013
Publisher :
Association for Computing Machinery (ACM), 2013.

Abstract

Browsing large audio archives is challenging because of the limitations of human audition and attention. However, this task becomes easier with a suitable visualization of the audio signal, such as a spectrogram transformed to make unusual audio events salient. This transformation maximizes the mutual information between an isolated event's spectrogram and an estimate of how salient the event appears in its surrounding context. When such spectrograms are computed and displayed with fluid zooming over many temporal orders of magnitude, sparse events in long audio recordings can be detected more quickly and more easily. In particular, in a 1/10-real-time acoustic event detection task, subjects who were shown saliency-maximized rather than conventional spectrograms performed significantly better. Saliency maximization also improves the mutual information between the ground truth of nonbackground sounds and visual saliency, more than other common enhancements to visualization.
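As a rough illustration of the kind of display transformation the abstract describes, the sketch below computes a log spectrogram and subtracts a slowly varying background estimate so that sparse foreground events stand out. This is a hypothetical, simplified stand-in, not the paper's saliency-maximized transform (which maximizes mutual information between an event's spectrogram and a model of its visual salience); the function name, parameters, and the synthetic test signal are illustrative assumptions.

```python
# Hypothetical sketch: spectrogram display with a simple local-contrast
# enhancement, as a crude stand-in for a saliency-maximizing transform.
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import uniform_filter

def enhanced_spectrogram(signal, fs, nperseg=1024, background_frames=101):
    """Log spectrogram with the slowly varying background subtracted,
    so sparse foreground events are visually emphasized."""
    f, t, sxx = spectrogram(signal, fs=fs, nperseg=nperseg)
    log_sxx = 10.0 * np.log10(sxx + 1e-12)          # dB scale
    # Estimate the local background with a wide moving average along time.
    background = uniform_filter(log_sxx, size=(1, background_frames))
    return f, t, log_sxx - background               # emphasize deviations

if __name__ == "__main__":
    fs = 16000
    rng = np.random.default_rng(0)
    audio = rng.normal(scale=0.1, size=10 * fs)     # 10 s of background noise
    # Insert a short 3 kHz tone as a sparse "event" at the 5 s mark.
    audio[5 * fs:5 * fs + 2000] += np.sin(2 * np.pi * 3000 * np.arange(2000) / fs)
    f, t, display = enhanced_spectrogram(audio, fs)
    print(display.shape)                            # (freq bins, time frames)
```

In this toy setup the short tone produces a bright vertical stripe against a flattened background, which is the visual effect (if not the method) that the paper's evaluation of faster-than-real-time browsing relies on.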

Details

ISSN :
1544-3965 and 1544-3558
Volume :
10
Database :
OpenAIRE
Journal :
ACM Transactions on Applied Perception
Accession number :
edsair.doi...........2075d44ac27ba5269bc4529ac16ae838
Full Text :
https://doi.org/10.1145/2536764.2536773