PROJECT TITLE:
The widespread availability of digitized music collections and mobile music players has enabled us to listen to music during many daily activities, such as exercise, commuting, and relaxation. A practical drawback that accompanies this is music retrieval: the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smartphones are commonly used as music players and are already equipped with inertial sensors suitable for capturing motion data. In the proposed approach, emotion is derived automatically from arm gestures and used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup in which inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.
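As a rough illustration of the idea, the pipeline could be sketched as follows: extract simple statistical features from an arm-movement accelerometer trace, then fit a linear map from those features to valence and arousal labels. The feature choices, the linear least-squares model, and all function names here are illustrative assumptions for this sketch, not the paper's actual models.

```python
import numpy as np

def gesture_features(accel):
    """accel: (n_samples, 3) accelerometer trace -> small feature vector.
    These features are assumed for illustration only."""
    mag = np.linalg.norm(accel, axis=1)       # movement intensity per sample
    return np.array([
        mag.mean(),                           # average intensity
        mag.std(),                            # variability of intensity
        np.abs(np.diff(mag)).mean(),          # jerkiness / smoothness proxy
    ])

def fit_affect_model(traces, labels):
    """Least-squares linear map from features to (valence, arousal)."""
    X = np.array([gesture_features(t) for t in traces])
    X = np.hstack([X, np.ones((len(X), 1))])  # bias column
    W, *_ = np.linalg.lstsq(X, np.array(labels), rcond=None)
    return W

def predict_affect(W, accel):
    """Predict (valence, arousal) for a new accelerometer trace."""
    x = np.append(gesture_features(accel), 1.0)
    return x @ W

# Toy demo with synthetic data: slow "calm" gestures vs. fast "energetic" ones.
rng = np.random.default_rng(0)
calm = [0.1 * rng.standard_normal((100, 3)) for _ in range(10)]
energetic = [2.0 * rng.standard_normal((100, 3)) for _ in range(10)]
traces = calm + energetic
labels = [(-0.5, -0.5)] * 10 + [(0.5, 0.8)] * 10   # (valence, arousal)

W = fit_affect_model(traces, labels)
v_fast, a_fast = predict_affect(W, 2.0 * rng.standard_normal((100, 3)))
v_slow, a_slow = predict_affect(W, 0.1 * rng.standard_normal((100, 3)))
```

The predicted (valence, arousal) pair would then serve as the query point into the music collection, e.g. by nearest-neighbour lookup over songs annotated with emotion coordinates.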