Abstract: This paper introduces a gesture-based human action recognition system that monitors human actions and aids in building a Human Robot Interaction (HRI) interface. The depth image of the person is acquired with the Kinect sensor 2.0, while the colour image is captured by its RGB sensor. The depth and colour images are fused into a single image, from which Histogram of Oriented Gradients (HOG) features are computed. Finally, these features are mapped to actions the robot can recognise. The system is capable of identifying a wide range of human actions, including walking, waving the left and right hands, bowing, and holding the left and right hands. Neural Network (NN) classifiers are used to recognise the different gestures, while Haar cascade classifiers test whether an image region matches the positive or negative training examples. The human's actions fully determine the robot's movement: the robot is controlled entirely by the person standing in front of the Kinect sensor. The system can run as a real-time application and can stand in for a human operator where necessary.
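A minimal sketch of the pipeline described above, assuming OpenCV's built-in HOG descriptor (default 64x128 window), a simple weighted fusion of the depth and colour frames, and a small multilayer perceptron (cv2.ml.ANN_MLP) standing in for the paper's Neural Network classifier. The action labels, fusion weights, and layer sizes are illustrative assumptions, not the paper's exact configuration.

    import cv2
    import numpy as np

    # Illustrative action labels taken from the abstract.
    ACTIONS = ["walk", "wave_left", "wave_right", "bow", "hold_left", "hold_right"]

    hog = cv2.HOGDescriptor()  # default 64x128 window, 9 orientation bins

    def fuse_and_describe(color_bgr, depth_raw):
        """Fuse one RGB frame and one depth frame, return a HOG feature vector."""
        gray = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2GRAY)
        # Normalise depth to 8-bit so both modalities share one value range.
        depth8 = cv2.normalize(depth_raw, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        depth8 = cv2.resize(depth8, (gray.shape[1], gray.shape[0]))
        # Pixel-wise blend as a stand-in for the paper's image fusion step;
        # the 0.5/0.5 weights are an assumption.
        fused = cv2.addWeighted(gray, 0.5, depth8, 0.5, 0)
        window = cv2.resize(fused, (64, 128))  # match the HOG window size
        return hog.compute(window).reshape(1, -1).astype(np.float32)

    # A small MLP mapping HOG features to the action classes.
    feat_len = hog.getDescriptorSize()
    net = cv2.ml.ANN_MLP_create()
    net.setLayerSizes(np.array([feat_len, 64, len(ACTIONS)], dtype=np.int32))
    net.setActivationFunction(cv2.ml.ANN_MLP_SIGMOID_SYM)
    net.setTrainMethod(cv2.ml.ANN_MLP_BACKPROP)

    def classify(color_bgr, depth_raw):
        """Return the predicted action label for one fused frame.

        Assumes the network has already been trained via net.train() on
        labelled HOG vectors; predicting with an untrained model will fail.
        """
        _, out = net.predict(fuse_and_describe(color_bgr, depth_raw))
        return ACTIONS[int(np.argmax(out))]

In use, the network would first be trained on labelled HOG vectors, after which classify() could be called on each Kinect colour/depth frame pair to produce the action label that drives the robot's motion commands.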

Keywords: Kinect sensor 2.0, Human Robot Interaction (HRI) interface, Histogram of Oriented Gradients (HOG), Neural Network (NN), human action recognition, RGB sensor, real-time application, Haar cascade classifiers.


DOI: 10.17148/IJARCCE.2023.12117
