Abstract: A drone can be defined as a powered aerial vehicle that carries no human operator. Remote control of a drone is difficult for operators with little technical knowledge, and it can take an operator a long time to gain control over the drone. Using gestures instead of a remote control is a distinctive way of controlling the drone, one that allows an operator with little or no technical knowledge to gain control within a few minutes. Gesture control of drones has been implemented by various researchers, but existing methods are expensive and are affected by factors such as the wavelength of light, unbalanced forces on accelerometers, and bulky apparatus. The aim of this project is to develop a system that uses hand gestures to control the flight of a drone. In this system, the drone’s absolute position is neither monitored nor recorded; instead, the drone is commanded to move relative to its current position based on the detected motion of the user. To enable fully autonomous flight, an extended Kalman filter (EKF) based procedure is used to control and adjust all six degrees of freedom (DOF) of the drone. The EKF uses the readings of the accelerometer and gyroscope sensors pre-mounted on the drone, as well as a supplementary optical flow sensor and a time-of-flight (TOF) sensor. The estimator uses an extended aerodynamic model of the drone, in which the sensor measurements are used to observe the full 3D airspeed. To detect the motion of the user, a near-field sensor measures the disturbance of an electric field caused by conductive objects such as a finger. Finally, to combine these systems, code is developed on a Raspberry Pi to facilitate communication from the sensor to the drone and to convert the input X, Y, Z sensor values into values compatible with the drone system.
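
The abstract describes the Raspberry Pi as the bridge that converts X, Y, Z readings from the near-field gesture sensor into relative movement commands for the drone. The sketch below illustrates one possible form of that mapping; the sensor-reading function, the dead-band threshold, the scaling factor, and the send_relative_move transport are all illustrative assumptions and not the paper's actual implementation.

```python
# Minimal sketch of the gesture-to-command mapping outlined in the abstract.
# Assumptions (not from the paper): read_gesture_xyz() returns normalized
# near-field deflections in [-1, 1]; send_relative_move() forwards a relative
# position command to the flight controller; thresholds and scales are illustrative.

import time
from dataclasses import dataclass

DEAD_BAND = 0.15   # ignore small electric-field disturbances (sensor noise)
STEP_CM = 20.0     # centimetres moved per full-scale gesture deflection


@dataclass
class RelativeMove:
    dx_cm: float   # forward/backward step relative to the current position
    dy_cm: float   # left/right step
    dz_cm: float   # up/down step


def read_gesture_xyz() -> tuple[float, float, float]:
    """Placeholder for the near-field sensor driver (assumed interface)."""
    raise NotImplementedError


def send_relative_move(cmd: RelativeMove) -> None:
    """Placeholder for the link to the drone's flight controller (assumed)."""
    raise NotImplementedError


def gesture_to_command(x: float, y: float, z: float) -> RelativeMove:
    """Map normalized gesture deflections to a relative position step."""
    def scale(v: float) -> float:
        return 0.0 if abs(v) < DEAD_BAND else v * STEP_CM
    return RelativeMove(scale(x), scale(y), scale(z))


def control_loop(rate_hz: float = 10.0) -> None:
    """Poll the gesture sensor and forward non-zero relative moves to the drone."""
    period = 1.0 / rate_hz
    while True:
        x, y, z = read_gesture_xyz()
        cmd = gesture_to_command(x, y, z)
        if (cmd.dx_cm, cmd.dy_cm, cmd.dz_cm) != (0.0, 0.0, 0.0):
            send_relative_move(cmd)
        time.sleep(period)
```

In this sketch the dead band keeps sensor noise from commanding unintended drift, while tracking of the commanded relative motion is left to the EKF-based estimator running on the flight controller, as described in the abstract.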

Keywords: EKF, DOF, TOF, Gyroscope.


DOI: 10.17148/IJARCCE.2023.125164
