International Journal of Advanced Research in Computer and Communication Engineering

A monthly peer-reviewed online and print journal

ISSN Online 2278-1021
ISSN Print 2319-5940

Since 2012

Abstract: This paper presents a gesture-based human action recognition system that tracks actions performed by humans and supports the development of a Human Robot Interaction (HRI) interface. First, a Kinect sensor 2.0 captures a depth image of the person, while its RGB sensor captures a colour image. Second, the depth and colour images are fused into a single image, from which Histogram of Oriented Gradients (HOG) features are computed. Finally, these features are mapped to actions recognized by the robot. The system can recognize several types of human actions, such as walking, waving the left hand, waving the right hand, bowing, raising the left hand, and raising the right hand. Neural Network (NN) classifiers are used to recognize the gestures, while Haar cascade classifiers check for positive and negative matches in the image. The movement executed by the robot depends entirely on the actions performed by the person standing in front of the Kinect sensor. The system can run as a real-time application and can be deployed, where required, as a human substitute.
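To illustrate the feature-extraction step described above, the following is a minimal, simplified HOG sketch in NumPy: per-cell orientation histograms weighted by gradient magnitude, with L2 normalization. It is illustrative only; the paper's actual pipeline (Kinect depth and RGB fusion, full HOG with block normalization, NN classification) is more involved, and all function names and parameters here are assumptions, not the authors' code.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Simplified HOG sketch: one L2-normalized orientation histogram per cell.

    img  : 2D grayscale array (assumed already fused from depth + colour)
    cell : cell side length in pixels
    bins : number of orientation bins over [0, 180) degrees (unsigned gradients)
    """
    # Image gradients along rows (gy) and columns (gx)
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation in [0, 180)

    h, w = img.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            m = mag[y:y + cell, x:x + cell].ravel()
            a = ang[y:y + cell, x:x + cell].ravel()
            # Magnitude-weighted orientation histogram for this cell
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            hist = hist / (np.linalg.norm(hist) + 1e-6)  # L2-normalize
            feats.append(hist)
    return np.concatenate(feats)

# Toy 16x16 horizontal ramp image: 2x2 cells of 8px -> 4 cells x 9 bins = 36 features
demo = np.tile(np.arange(16, dtype=float), (16, 1))
vec = hog_features(demo)
print(vec.shape)
```

In a full system, feature vectors like `vec` would be computed per frame and fed to the NN classifier to label the action being performed.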

Keywords: Kinect sensor 2.0, Human Robot Interaction (HRI) interface, Histogram of Oriented Gradients (HOG), Neural Network (NN), human action recognition system, RGB sensor, real-time application, Haar cascade classifiers.


PDF | DOI: 10.17148/IJARCCE.2020.9658
