Automatic Soundscape Affect Recognition Using A Dimensional Approach 

Jianyu Fan, Miles Thorogood, Philippe Pasquier

Abstract

Soundscape affect recognition is essential for sound designers and soundscape composers. Previous work demonstrated the effectiveness of predicting the valence and arousal of soundscapes from the responses of a single expert user. Building on this, we present a method for automatic soundscape affect recognition using ground truth data collected from an online survey. An analysis of the corpus shows that participants have a high level of agreement on the valence and arousal of soundscapes. We generate a gold standard by averaging users’ responses, and we verify the corpus by training stepwise linear regression models and support vector regression models. An analysis of the models shows that our system obtains better results than the previous study. Further, we test the correlation between valence and arousal based on the gold standard. Finally, we report an experiment using arousal as a feature for predicting valence, and vice versa.
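The sketch below illustrates the kind of regression setup the abstract describes; it is a minimal, hypothetical example assuming Python with scikit-learn and SciPy, using randomly generated placeholder features and ratings rather than the paper's actual corpus, feature set, or model parameters.

```python
# Hypothetical sketch of the pipeline summarized in the abstract:
# average survey ratings into a gold standard, train support vector
# regression models for valence and arousal, check the valence-arousal
# correlation, and try arousal as an extra feature for predicting valence.
# All data here are illustrative placeholders, not the paper's data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Placeholder corpus: 120 soundscape clips, 20 acoustic features each,
# and valence/arousal ratings from 10 survey participants per clip.
n_clips, n_features, n_raters = 120, 20, 10
X = rng.normal(size=(n_clips, n_features))
valence_ratings = rng.uniform(-1, 1, size=(n_clips, n_raters))
arousal_ratings = rng.uniform(-1, 1, size=(n_clips, n_raters))

# Gold standard: average the participants' responses per clip.
valence_gold = valence_ratings.mean(axis=1)
arousal_gold = arousal_ratings.mean(axis=1)

# Support vector regression models for valence and arousal (cross-validated).
valence_pred = cross_val_predict(SVR(kernel="rbf"), X, valence_gold, cv=5)
arousal_pred = cross_val_predict(SVR(kernel="rbf"), X, arousal_gold, cv=5)

# Correlation between the valence and arousal gold standards.
r_va, _ = pearsonr(valence_gold, arousal_gold)
print(f"valence-arousal correlation: {r_va:.2f}")

# Arousal as an additional feature when predicting valence (and vice versa).
X_plus_arousal = np.column_stack([X, arousal_gold])
valence_pred_aug = cross_val_predict(
    SVR(kernel="rbf"), X_plus_arousal, valence_gold, cv=5
)
print(f"valence r (audio only):      {pearsonr(valence_gold, valence_pred)[0]:.2f}")
print(f"valence r (audio + arousal): {pearsonr(valence_gold, valence_pred_aug)[0]:.2f}")
```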

Type: Journal Article

Publication: Journal of the Audio Engineering Society, vol. 64, no. 9, pp. 646-653, 2016

 
