Investigating the Impact of Sound Angular Position on the Listener Affective State
Emotion recognition from sound signals represents an emerging field of recent research. Although several existing works concentrate on emotion recognition from music, there appears to be a relative scarcity of research on emotion recognition from general sounds. One of the key characteristics of sound events is the spatial position of the sound source, i.e. the location of the source relative to the acoustic receiver. Existing studies that aim to analyze the relation between source placement and the elicited emotions are restricted to distance, front and rear spatial localization, and/or specific emotional categories. In this paper we analytically investigate the effect of the source angular position on the listener's emotional state, modeled within the well-established valence/arousal affective space. Towards this aim, we have developed an annotated sound events dataset using binaurally processed versions of the publicly available International Affective Digitized Sounds (IADS) sound events library. All subjective affective annotations were obtained using the Self Assessment Manikin (SAM) approach. Preliminary results obtained by processing these annotation scores indicate a systematic change in the listener's affective state as the sound source angular position changes. This trend is more evident when the sound source is located outside the visual field of the listener.
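The binaural processing mentioned above typically places a mono sound event at a chosen azimuth around the listener's head. The abstract does not specify the rendering method; studies of this kind usually convolve the signal with measured head-related impulse responses (HRIRs). As a hedged illustration only, the sketch below approximates the same effect with interaural time and level differences (ITD/ILD) computed from a spherical-head model; the function name, parameters, and constants are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def spatialize(mono, angle_deg, fs=44100, head_radius=0.0875, c=343.0):
    """Toy binaural rendering of a mono signal at a given azimuth
    (degrees, 0 = front, positive = right) using only interaural
    time and level differences. A real study would instead convolve
    the signal with measured HRIRs for each ear."""
    theta = np.deg2rad(angle_deg)
    # Woodworth spherical-head approximation of the ITD (seconds)
    itd = head_radius / c * (abs(theta) + abs(np.sin(theta)))
    delay = int(round(itd * fs))          # delay applied to the far ear
    # Crude ILD: attenuate the far ear by up to ~6 dB at +/-90 degrees
    ild = 10.0 ** (-abs(np.sin(theta)) * 6.0 / 20.0)
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)] * ild
    if angle_deg >= 0:                     # source on the right side
        left, right = far, near
    else:                                  # source on the left side
        left, right = near, far
    return np.stack([left, right], axis=1)  # shape: (samples, 2)
```

At 0 degrees the two channels are identical; at 90 degrees the left (far) ear receives a delayed, attenuated copy, which is the cue pattern that binaural rendering manipulates when varying the angular position of a sound event.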