
International Research Journal of Engineering and Technology (IRJET)    e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 09 Issue: 05 | May 2022    www.irjet.net

Gesture Recognition System

Shweta Hiremath¹, Krishnakumar Chavan², Pratik Kumar³, Moksha Bafna⁴

¹ ² ³ ⁴ Student, Dept. of Computer Engineering, Sinhgad College of Engineering, Maharashtra, India

Abstract - Gesture-based hand interaction is more natural and efficient than traditional interaction methods such as the keyboard, mouse, and pen. The identification of hand gestures has become one of the most important aspects of human-computer interaction (HCI). Gesture recognition identifies a gesture and uses it to control and issue commands to a computer. In this paper, the vision-based, glove-based, and depth-based techniques for hand gesture detection are briefly elaborated. Hand gesture recognition is also an approachable project for a computer vision enthusiast, since it follows an easy, step-by-step approach on top of which more sophisticated applications can be built.

Key Words: Computer Vision, Human-Computer Interaction, Artificial Intelligence, Gesture Recognition, Image Processing.

1. INTRODUCTION

Given the widespread availability of mobile devices with embedded cameras, such as smartphones and notebooks, a gesture detection system can be a useful tool for interacting with these camera-enabled devices more intuitively than through traditional interfaces. The trend toward embedded, ubiquitous computing in residential settings calls for forms of human-computer interaction that are natural, convenient, and efficient. Hand gestures are a fascinating area of study because they aid communication and provide a natural mode of engagement that can be applied to numerous situations.

Hand gestures for human-computer interaction (HCI) began with the introduction of the data glove sensor, which could be made portable by coupling the sensor to a micro-controller; the information gathered was then processed on a computer wired to the glove. Gloves identify gestures by sensing exact coordinates. Curvature sensors, angular-displacement sensors, optical-fiber transducers, flex sensors, and accelerometer sensors all rely on the same technique of measuring the angle of bending, though different physical principles are used. Although the above-mentioned procedures have yielded positive results, they have a number of drawbacks that make them inappropriate for the elderly, who may endure discomfort and ailments as a result of cable-connection issues.

Vision-based methods, in turn, face the challenge that real-time segmentation of foreground objects from a cluttered backdrop is difficult. The most obvious reason is the semantic gap between how a human and a computer look at the same image: to a computer, an image is essentially a 3-dimensional matrix, whereas a human can readily figure out what is in it. As a result, computer vision problems continue to be a challenge.

The proposed system architecture can recognize gestures and perform various real-time operations such as virtual mouse control, zoom in/out, virtual keyboard control, drag and drop, and volume control. The dynamic gestures can also interact with each other.
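To make the vision-based approach concrete, the following is a minimal sketch of real-time hand-landmark detection. It assumes OpenCV and MediaPipe Hands are available (neither library is prescribed by this paper), and it simply counts extended fingers as a stand-in for a full gesture classifier.

# Minimal vision-based hand-gesture sketch.
# Assumption: OpenCV and MediaPipe are installed; the paper does not prescribe these libraries.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def count_extended_fingers(landmarks):
    """Rough finger count: a fingertip whose y-coordinate is above (smaller than)
    its middle joint is treated as extended. The thumb is ignored for simplicity."""
    tips = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
    pips = [6, 10, 14, 18]   # corresponding middle joints
    return sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))

def main():
    cap = cv2.VideoCapture(0)                         # default webcam
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV delivers BGR frames
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                lm = result.multi_hand_landmarks[0].landmark
                fingers = count_extended_fingers(lm)
                cv2.putText(frame, "extended fingers: %d" % fingers,
                            (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
            cv2.imshow("gesture sketch", frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):     # press 'q' to quit
                break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()

Mapping finger counts or landmark trajectories to distinct gesture classes (pinch, swipe, open palm) is where the virtual mouse, zoom, and volume actions described above would attach.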

1.1 Aim and Objective

The highest priority of the software is the design of the Gesture Recognition module, which is the first main step of the software. A major goal is to keep memory use at a reasonable level: because the software is intended to be deployable on mobile devices, memory use is one of the most critical constraints the software will face, and its upper limit is set at 20% of available memory. The other important goal is speed: the process of recognizing a gesture, processing it, and performing the corresponding action must complete within at most 0.2 seconds so that the software responds in reasonable time.
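As a rough illustration of how these budgets might be checked at runtime, the following hedged sketch times one iteration of a recognize-and-act loop and monitors the process's memory share. The function process_frame is a hypothetical placeholder, and the psutil library is an assumption, not part of the paper's implementation.

import time
import psutil   # assumption: psutil is available for memory monitoring

LATENCY_BUDGET_S = 0.2        # per-gesture processing limit stated above
MEMORY_LIMIT_PERCENT = 20.0   # upper memory bound stated above

def process_frame(frame):
    """Hypothetical placeholder for the recognize-gesture-then-act step."""
    pass

def timed_step(frame):
    start = time.perf_counter()
    process_frame(frame)
    elapsed = time.perf_counter() - start
    if elapsed > LATENCY_BUDGET_S:
        print("latency warning: %.3fs exceeds the %.1fs budget" % (elapsed, LATENCY_BUDGET_S))
    if psutil.Process().memory_percent() > MEMORY_LIMIT_PERCENT:
        print("memory warning: process exceeds 20%% of system memory")
    return elapsed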


