International Research Journal of Engineering and Technology (IRJET)
e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 09 Issue: 05 | May 2022
www.irjet.net
Hand Gesture Control Robot

Sachin Navghare1, Milind Jadhav2, Ashwin Bhagwat3, Prof. P. P. Gaikwad4

1,2,3 Student, Department of Electronics and Telecommunications Engineering, TSSM's BSCOER, Pune, Maharashtra, India
4 Professor, Department of Electronics and Telecommunications Engineering, TSSM's BSCOER, Pune, Maharashtra, India

Abstract - As more and more features are integrated into a vehicle's human-machine interface (HMI), operating the device becomes more complex. Optimal use of the various human sensory channels is therefore one approach to simplifying interaction with in-vehicle devices. With this idea, a car robot can be realized that is navigated wirelessly with the help of an Arduino.

Robots play a major role in our lives today. There are many types of robots, including wheeled robots, flying robots, and factory robots. The usual ways to control these robots are a keyboard, a joystick, or pre-programmed commands. This project introduces a new way to control a robot: hand gestures. The aim is to build a remotely controlled robot (car) that is driven using gestures alone. The project consists of two parts: the car and the control station. The control station is built around an Arduino microcontroller with gesture-recognition hardware, so it can recognize commands and send them to the car. Hand-movement data from the accelerometer is passed by the Arduino to the HT12E encoder. The encoded value is then sent out by the RF transmitter; on the receiver side it is picked up by the RF receiver, decoded by the HT12D, and passed to the L293D motor driver. The motors are thus controlled by the data delivered through the motor driver.

The robot operates on the output of an accelerometer (ADXL345) that records hand movements and sends the data to the Arduino, which assigns the correct voltage levels. The information is then passed to the encoder (HT12E), ready for RF transmission. On the receiving side, the information is received by the RF receiver and passed to the decoder (HT12D). The decoded data is sent to the motor-driver IC (L293D), which drives the motors in different configurations to move the bot in specific directions.

Key Words: Gesture, accelerometer, gesture control, accelerometer control, manual robot

1. INTRODUCTION

Gesture recognition is a topic in computer science and language technology aimed at interpreting human gestures through mathematical algorithms. Gestures can originate from any bodily motion or state but usually come from the face or hands. Users can control or operate a device with simple gestures without physically touching it. Many approaches using cameras and computer-vision algorithms have been developed to interpret sign language; the identification and recognition of posture, gait, and human behavior are also subjects of gesture-recognition techniques. Gesture recognition can be seen as a way for computers to understand the language of the human body. It builds a richer bridge between machines and humans than a primitive text user interface or even a GUI (graphical user interface), which still confines input to the keyboard and mouse. With gesture recognition, the user interacts naturally, without any mechanical device: pointing a finger at the screen can move the cursor accordingly. This could make conventional input devices on such systems redundant, or even obsolete.

2. BLOCK DIAGRAM
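The transmit-receive chain described in this paper (ADXL345 → Arduino → HT12E → RF link → HT12D → L293D) can be sketched in plain C++, independent of the Arduino runtime. The command codes and the mapping of the HT12E/HT12D 4-bit data word onto the L293D input pins (IN1-IN4) are illustrative assumptions; the paper does not specify them.

```cpp
#include <cassert>
#include <cstdint>

// The HT12E/HT12D pair transfers 4 data bits per frame; one nibble is
// enough to carry a drive command. These codes are assumed for
// illustration, not taken from the paper.
enum Command : uint8_t {
    CMD_STOP     = 0b0000,  // all L293D inputs low -> motors idle
    CMD_FORWARD  = 0b1010,  // IN1, IN3 high -> both motors run forward
    CMD_BACKWARD = 0b0101,  // IN2, IN4 high -> both motors run in reverse
};

// Receiver side: the HT12D's four data outputs feed the L293D inputs.
struct L293DInputs {
    bool in1, in2;  // left motor  (IN1 high, IN2 low -> forward)
    bool in3, in4;  // right motor (IN3 high, IN4 low -> forward)
};

// Model of "decoded data drives the motor driver": unpack the nibble
// recovered from the RF link onto the four input pins.
L293DInputs toMotorDriver(uint8_t word) {
    return { (word & 0b1000) != 0, (word & 0b0100) != 0,
             (word & 0b0010) != 0, (word & 0b0001) != 0 };
}
```

On the real hardware the nibble never exists as a variable: the Arduino raises the HT12E data pins, the RF pair carries the word, and the HT12D asserts the matching output pins directly.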
2.1 CONDITIONS OF OPERATION

Stop: The user holds the accelerometer parallel to the ground. The readings sent from the accelerometer to the Arduino correspond to this rest position, and the robot stops moving; this state is called the stopped state here.

Tilt forward: When the user tilts the accelerometer forward, the x, y, and z axis readings are sent to the Arduino. If they satisfy the conditions x >= 250, y >= 20, and z >= 0, the robot moves forward.
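The decision rule above can be written out directly. Only the stop and forward cases are spelled out in the paper; the forward condition is taken here as x >= 250, y >= 20, z >= 0, and the values are assumed to be raw ADXL345 axis counts, both assumptions.

```cpp
#include <cassert>
#include <string>

// Map raw accelerometer axis readings to a drive command, following the
// conditions of Section 2.1. Thresholds are assumed raw ADXL345 counts.
std::string direction(int x, int y, int z) {
    if (x >= 250 && y >= 20 && z >= 0)
        return "forward";  // accelerometer tilted forward
    return "stop";         // accelerometer held parallel to the ground
}
```

On the Arduino, the same comparison would select which HT12E data lines to raise; analogous thresholds for backward, left, and right tilts would extend the chain of if statements.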