PROJECT TITLE:
Trajectory-based view-invariant hand gesture recognition by fusing shape and orientation
Traditional studies in vision-based hand gesture recognition mostly rely on view-dependent representations, forcing users to remain fronto-parallel to the camera. To resolve this drawback, view-invariant gesture recognition aims to make the recognition result independent of viewpoint changes. However, in current works this view-invariance is achieved at the cost of confusing gesture patterns that share a similar trajectory shape but carry different semantic meanings. For example, the gesture 'push' can be mistaken for 'drag' when seen from another viewpoint. To address this shortcoming, in this study, the authors use a shape descriptor to extract the view-invariant features of a three-dimensional (3D) trajectory. Because the shape features are invariant to omnidirectional viewpoint changes, orientation features are then added to weight different rotation angles so that similar trajectory shapes are better separated. The proposed technique was evaluated on two different databases: a standard Australian Sign Language database and a challenging Kinect Hand Trajectory database. Experimental results show that the proposed algorithm achieves a higher average recognition rate than state-of-the-art approaches, and can better distinguish confusing gestures while meeting the view-invariance condition.
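To illustrate the core idea of a rotation-invariant shape descriptor for a 3D trajectory, the sketch below computes two simple quantities that do not change when the viewpoint (i.e. the camera rotation) changes: normalised segment lengths and turning angles between consecutive segments. This is a minimal stand-in, not the paper's actual descriptor; the function name and feature choice are assumptions made for illustration.

```python
import numpy as np

def shape_descriptor(traj):
    """Rotation-invariant shape features of a 3D trajectory (N x 3 array).

    Illustrative sketch only, not the descriptor from the paper:
    - segment lengths are unchanged by any rigid rotation of the view;
    - turning angles between consecutive segments are likewise invariant.
    """
    traj = np.asarray(traj, dtype=float)
    diffs = np.diff(traj, axis=0)                     # segment vectors
    lengths = np.linalg.norm(diffs, axis=1)           # rotation-invariant
    lengths = lengths / lengths.sum()                 # scale-normalised
    # angle between each pair of consecutive segments
    dots = np.einsum('ij,ij->i', diffs[:-1], diffs[1:])
    norms = np.linalg.norm(diffs[:-1], axis=1) * np.linalg.norm(diffs[1:], axis=1)
    angles = np.arccos(np.clip(dots / np.maximum(norms, 1e-12), -1.0, 1.0))
    return np.concatenate([lengths, angles])
```

Because such a descriptor is identical for any rotated copy of the same trajectory, it cannot by itself tell apart gestures like 'push' and 'drag' whose curves differ only in orientation, which is exactly why the paper fuses orientation features back in.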