PROJECT TITLE:
Egocentric Daily Activity Recognition via Multitask Clustering - 2015
Recognizing human activities from videos is a fundamental research problem in computer vision. Recently, there has been growing interest in analyzing human behavior from data collected with wearable cameras. First-person cameras can continuously record several hours of their wearers' lives. To cope with this vast amount of unlabeled and heterogeneous data, novel algorithmic solutions are needed. In this paper, we propose a multitask clustering framework for analyzing activities of daily living from visual data gathered by wearable cameras. Our intuition is that, even if the data are not annotated, it is possible to exploit the fact that the tasks of recognizing the everyday activities of multiple individuals are related, since people usually perform the same actions in similar environments (e.g., people working in an office typically read and write documents). In our framework, rather than clustering data from different users separately, we propose to look for clustering partitions that are coherent among related tasks. In particular, two novel multitask clustering algorithms, derived from a common optimization problem, are introduced. Our experimental evaluation, conducted both on synthetic data and on publicly available first-person vision data sets, shows that the proposed approach outperforms several single-task and multitask learning methods.
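To make the core idea concrete, here is a minimal sketch of multitask clustering in the spirit described above: each user (task) keeps its own k-means centroids, but an added coupling term pulls corresponding centroids across tasks toward their shared mean, so the per-task partitions stay coherent. The coupling weight `lam` and this particular update rule are illustrative assumptions, not the paper's actual optimization problem or algorithms.

```python
import numpy as np

def multitask_kmeans(tasks, k, lam=0.3, iters=20, seed=0):
    """Toy multitask k-means over a list of per-task data arrays.

    Each task has its own (k, d) centroid matrix, but the update
    blends the task-local cluster mean with the mean of the
    corresponding centroid across all tasks (hypothetical coupling,
    for illustration only).
    """
    rng = np.random.default_rng(seed)
    # Initialize each task's centroids from its own points.
    cents = [X[rng.choice(len(X), k, replace=False)].copy() for X in tasks]
    labels = [np.zeros(len(X), dtype=int) for X in tasks]
    for _ in range(iters):
        # Assignment step, independently per task.
        labels = [np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
                  for X, C in zip(tasks, cents)]
        # Mean of corresponding centroids across tasks.
        shared = np.mean(cents, axis=0)
        # Update step: blend the task-local mean with the shared centroid,
        # which encourages coherent partitions across related tasks.
        for t, (X, lab) in enumerate(zip(tasks, labels)):
            for j in range(k):
                pts = X[lab == j]
                local = pts.mean(axis=0) if len(pts) else cents[t][j]
                cents[t][j] = (1 - lam) * local + lam * shared[j]
    return cents, labels
```

With `lam=0`, this reduces to running ordinary k-means on each user separately; larger `lam` trades per-task fit for cross-task coherence.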