PROJECT TITLE: Fusion of Depth, Skeleton, and Inertial Data for Human Action Recognition - 2016

ABSTRACT: This paper presents a human action recognition approach based on the simultaneous deployment of a second-generation Kinect depth sensor and a wearable inertial sensor. Three data modalities, consisting of depth images, skeleton joint positions, and inertial signals, are fused using three collaborative representation classifiers. A database of ten actions performed by six subjects was assembled to carry out two types of testing of the developed fusion approach: subject-generic and subject-specific. The overall recognition rates obtained from both types of testing show improved recognition when all data modalities are fused compared to when each modality is used individually.

Keywords: Motion Estimation, Human Action Recognition, Sensor Fusion, Wearable Inertial Sensor, Second-Generation Kinect Depth Sensor
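The abstract does not give the classifier details, but a common formulation of collaborative representation classification (CRC) codes a test sample over the whole training dictionary with a ridge penalty and assigns the class with the smallest reconstruction residual; decision-level fusion can then sum the per-class residuals produced by the per-modality classifiers. The sketch below is a minimal illustration under those assumptions, not the paper's implementation; the function names, the regularization value `lam`, and the residual-sum fusion rule are all illustrative choices.

```python
import numpy as np

def crc_residuals(X_train, labels, y, lam=0.01):
    """CRC sketch for one modality.

    X_train: (d, n) dictionary, one training sample per column.
    labels:  (n,) class label of each column.
    y:       (d,) test sample.
    Returns a dict of per-class reconstruction residuals (lower = better).
    """
    # Ridge-regularized coding: alpha = (X^T X + lam*I)^{-1} X^T y
    G = X_train.T @ X_train + lam * np.eye(X_train.shape[1])
    alpha = np.linalg.solve(G, X_train.T @ y)
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        # Reconstruct y using only this class's columns and coefficients
        recon = X_train[:, mask] @ alpha[mask]
        residuals[c] = np.linalg.norm(y - recon)
    return residuals

def fuse_decisions(residual_dicts):
    """Decision-level fusion: sum residuals across modalities, pick the minimum."""
    classes = residual_dicts[0].keys()
    totals = {c: sum(r[c] for r in residual_dicts) for c in classes}
    return min(totals, key=totals.get)
```

In a multimodal setting, `crc_residuals` would be called once per modality (depth features, skeleton joint positions, inertial signals), and `fuse_decisions` combines the three residual dictionaries into one label.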