
A Review on Real Time AI-Driven Virtual Hand Gesture Control

International Research Journal of Engineering and Technology (IRJET)    e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 12 Issue: 12 | Dec 2025    www.irjet.net
Prof. M. M. Thakur1, Sahil Dhanvij2, Siddharth Jawade3, Akanksha Satdeve4, Pranav Khaire5

1Assistant Professor, AI & DS Department, K. D. K. College of Engineering, Nagpur
2,3,4,5K. D. K. College of Engineering, Nagpur

ABSTRACT

Real-time AI-driven virtual hand gesture control has emerged as a powerful interaction technique for next-generation human–computer interfaces. With advancements in computer vision, deep learning, and sensor technologies, gesture-based systems now offer touchless, intuitive, and immersive control for virtual environments. This paper presents an in-depth review of AI-based hand gesture recognition methods used for real-time applications such as virtual reality (VR), augmented reality (AR), gaming, robotics, smart homes, and assistive technologies. The study highlights state-of-the-art approaches including convolutional neural networks (CNNs), MediaPipe-based landmark detection, transformer models, and hybrid deep learning frameworks that enable accurate hand tracking and gesture classification. Real-time performance challenges, such as varying lighting conditions, occlusion, background noise, depth estimation, and computational efficiency, are discussed along with existing solutions. Furthermore, this review explores the role of optimization techniques, lightweight neural networks, and edge-AI deployment for improving speed and performance on low-power devices. The paper concludes by identifying research gaps and presenting future directions such as multimodal gesture sensing, 3D skeleton modelling, adaptive learning, and integration with wearable devices for enhanced virtual interaction. Overall, this work aims to provide a comprehensive understanding of real-time AI-driven virtual hand gesture control systems, their technological components, challenges, and potential applications in modern interactive systems.
Keywords: Real-time gesture recognition, Hand tracking, Deep learning, Computer vision, AI-driven control, Gesture classification, Virtual interaction, Human–computer interaction, 3D hand pose estimation.

INTRODUCTION

Hand gesture recognition has become one of the most intuitive and natural ways to interact with digital systems. As modern technology moves toward touchless and seamless communication, gesture-based interfaces let users control devices through simple hand movements, eliminating the need for physical controllers. This transformation is driven mainly by advancements in artificial intelligence (AI), machine learning, and computer vision, which together enable computers to interpret human gestures with high precision.

Traditional gesture recognition methods relied heavily on specialized hardware such as sensor gloves, infrared cameras, or depth-sensing devices. Although these systems provided accurate results, they were expensive, complex, and unsuitable for everyday applications. With the rise of deep learning and lightweight AI models, however, gesture recognition has become far more accessible. Modern systems can recognize gestures using the standard RGB cameras found in laptops and smartphones. Technologies such as convolutional neural networks (CNNs), recurrent networks, transformer-based models, and real-time landmark detection frameworks like MediaPipe have made gesture tracking faster, more accurate, and easier to deploy.

Real-time virtual hand gesture control is especially important for applications in virtual reality (VR), augmented reality (AR), robotics, smart homes, automotive interfaces, assistive devices, and gaming. In VR/AR systems, gesture recognition improves immersion by allowing users to interact naturally with virtual objects. In robotics, gesture-based commands allow safe and remote human–robot collaboration. Similarly, touchless interfaces have become increasingly important in healthcare and public environments where physical contact must be minimized.
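To make the landmark-based pipeline concrete, the sketch below shows how a simple gesture (fist vs. open palm) can be classified from the 21 hand landmarks that frameworks such as MediaPipe Hands produce per frame. This is an illustrative sketch, not the method of any paper surveyed here: the heuristic (a finger counts as extended when its tip lies above its PIP joint in image coordinates) assumes a roughly upright hand and normalized coordinates with y increasing downward; the function and constant names are ours.

```python
# Illustrative sketch: rule-based gesture classification from 21 hand
# landmarks (MediaPipe Hands ordering: 0 = wrist, 4 = thumb tip,
# 8/12/16/20 = index/middle/ring/pinky tips, 6/10/14/18 = PIP joints).
# Coordinates are assumed normalized, with y increasing downward.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # matching PIP (middle) joints

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates.
    A finger is 'extended' when its tip is above its PIP joint
    (smaller y), a crude but fast heuristic for an upright hand."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify_gesture(landmarks):
    """Map the extended-finger count to a coarse gesture label."""
    n = count_extended_fingers(landmarks)
    return {0: "fist", 4: "open_palm"}.get(n, f"{n}_fingers")
```

In a full real-time system this classifier would be called once per camera frame on the landmark output of the tracking model; learned classifiers (CNNs, transformers) replace such hand-written rules when the gesture vocabulary grows.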

© 2025, IRJET | Impact Factor value: 8.315 | ISO 9001:2008 Certified Journal | Page 198

