
International Research Journal of Engineering and Technology (IRJET)    e-ISSN: 2395-0056 | p-ISSN: 2395-0072

Volume: 12 Issue: 11 | Nov 2025    www.irjet.net

SensiLens - IoT-Enabled Smart Assistive Eyewear with Edge AI and Real-Time Obstacle Detection for the Visually Impaired

Dr. Vinutha H P1, Vachan MN2, Mallikarjuna H S3, Pavan Kumar V M4, Kushikumar B5

1 Professor & Head, Dept. of Computer Science and Business Systems, Bapuji Institute of Engineering and Technology, Davangere, Karnataka, India

2,3,4,5 Student, Dept. of Computer Science and Business Systems, Bapuji Institute of Engineering and Technology, Davangere, Karnataka, India

---------------------------------------------------------------------***---------------------------------------------------------------------

Abstract - Navigating safely and independently in a complex environment remains a significant challenge for individuals who are blind or visually impaired. Traditional mobility aids such as white canes or guide dogs offer basic assistance but fall short of providing real-time cognitive and environmental awareness. This paper proposes "SensiLens," a functional smart assistive eyewear system that leverages computer vision and artificial intelligence to enhance mobility. The system consists of a lightweight, wearable device equipped with a camera, an onboard processing unit, and audio output capabilities. Using real-time video input, the system employs the YOLOv8 algorithm to detect obstacles and hazards in the user's immediate path. When an obstacle is identified, the system issues a clear audio warning through an integrated speaker, enabling users to take timely action. Additionally, the eyewear includes an AI-powered assistant that interprets visual information and responds to image-based queries, providing not only physical navigation support but also cognitive assistance.

SensiLens is an offline, edge-AI-based smart eyewear system. Unlike cloud-dependent alternatives, it integrates a camera, a single-board computer (Raspberry Pi), and an audio feedback mechanism to process data locally. The system provides real-time object detection (using YOLOv8), depth estimation (using DepthAnything V2), and auditory alerts, ensuring immediate feedback for safer navigation.
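The paper describes this pipeline only at a high level. As an illustrative sketch (not the authors' code), the alert-selection step between the vision models and the speaker might look like the following Python, where YOLOv8 detections and DepthAnything V2 distance estimates are replaced by plain data values; all class names, field names, and the 2 m warning threshold are hypothetical.

```python
# Hypothetical sketch of SensiLens-style alert selection. On the device,
# `label` would come from YOLOv8 class predictions and `distance_m` from a
# DepthAnything V2 depth map; here they are supplied directly as data.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Detection:
    label: str         # detected object class, e.g. "person", "car"
    distance_m: float  # estimated distance to the object, in metres
    center_x: float    # horizontal box centre, normalised to [0, 1]


def bearing(center_x: float) -> str:
    """Map a normalised horizontal position to a spoken direction."""
    if center_x < 1 / 3:
        return "to your left"
    if center_x > 2 / 3:
        return "to your right"
    return "ahead"


def select_alert(detections: List[Detection],
                 warn_distance_m: float = 2.0) -> Optional[str]:
    """Return a spoken warning for the nearest obstacle within range,
    or None if the path is clear."""
    in_range = [d for d in detections if d.distance_m <= warn_distance_m]
    if not in_range:
        return None
    nearest = min(in_range, key=lambda d: d.distance_m)
    return (f"{nearest.label} {bearing(nearest.center_x)}, "
            f"about {nearest.distance_m:.1f} metres")


# Example frame: a close person dead ahead, a distant car on the right.
frame = [Detection("person", 1.4, 0.5), Detection("car", 6.0, 0.9)]
print(select_alert(frame))  # -> person ahead, about 1.4 metres
```

In practice the returned string would be handed to an on-device text-to-speech engine; keeping the decision logic separate from the model calls is what lets the whole loop run offline on the Raspberry Pi.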

Key Words: IoT, Assistive Technology, Smart Eyewear, YOLOv8, Edge AI, Obstacle Detection.

1. INTRODUCTION

Visual impairment significantly affects an individual's ability to interact with their environment, often leading to a reliance on others or on traditional mobility aids such as white canes and guide dogs. According to the World Health Organization (WHO), over 285 million people worldwide are visually impaired, with 39 million classified as blind. While traditional aids like the white cane are inexpensive and reliable for detecting ground-level obstacles, they suffer from significant limitations: they cannot detect knee-to-head-level obstacles (such as hanging branches or signboards) or dynamic hazards like moving vehicles. Guide dogs, while effective, are expensive to train and maintain.

In the digital age, the convergence of computer vision and the Internet of Things (IoT) has paved the way for smart assistive technologies (SATs). These technologies aim to act as a "digital eye," providing real-time environmental data to the user. However, existing smart solutions often rely heavily on cloud processing, which introduces latency, a critical risk in navigation scenarios.

1.1 Motivation

The core motivation behind this project is to empower visually impaired individuals to live more independently. By integrating AI-driven computer vision with lightweight embedded systems, the proposed solution helps users perceive their environment through sound cues, enhancing spatial awareness and reaction time.

2. LITERATURE SURVEY

Several studies have explored the use of technology to assist visually impaired individuals. The evolution of these systems has moved from simple sensor-based sticks to complex vision-based wearables.

Jeong et al. (2025) proposed a YOLOv8-based XR smart-glasses system specifically for outdoor walking assistance in South Korea. Their system utilized Extended Reality (XR) hardware to detect walkways and transportation hazards. While highly effective for outdoor navigation, the reliance on expensive XR hardware limits its accessibility for the general population.

Okolo et al. (2025) developed a smart assistive navigation system that combines ultrasonic sensors with object-detection models. This multi-modal approach improves reliability; however, integrating multiple sensor types increases hardware bulk and power consumption, making the device less comfortable for prolonged wear.

In the domain of low-light assistance, Qazi and Batumalay (2025) introduced smart night-vision glasses utilizing LiDAR and IR sensors, addressing a critical gap for users with night blindness. However, LiDAR sensors remain cost-prohibitive for low-cost consumer devices.

© 2025, IRJET | Impact Factor value: 8.315 | ISO 9001:2008 Certified Journal | Page 638

