Abstract: Visually impaired people often face difficulties in understanding their surroundings, especially while moving through unfamiliar environments. Identifying nearby objects, obstacles, or people usually requires external help, which limits independence. With smartphones now equipped with capable cameras and processors, artificial intelligence can provide real-time assistance directly on the device. This paper presents AI Eyes for Visually Impaired, an Android-based mobile application designed to assist visually impaired users through real-time object detection and voice feedback. The application uses the phone's camera to continuously capture the surrounding environment and applies an AI-based object detection model to recognize common objects. For each detected object, the system identifies its direction and estimates its approximate distance, then communicates this information as audio guidance using text-to-speech. The proposed system runs entirely on the mobile device without requiring internet connectivity, making it portable and practical for daily use. The results show that the application can effectively improve environmental awareness and support safer navigation for visually impaired users.
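The pipeline described above (detect an object, derive its direction and approximate distance, then announce it) can be sketched in plain Java. This is a hypothetical illustration, not the authors' implementation: the class name, the one-third screen split for left/ahead/right, the assumed focal length, and the pinhole-camera distance formula (distance ≈ real height × focal length in pixels ÷ box height in pixels) are all assumptions for the sketch.

```java
// Hypothetical sketch of the direction/distance/announcement step.
// A real app would feed bounding boxes from a TensorFlow Lite detector
// and pass the resulting string to Android's TextToSpeech engine.
public class DetectionFeedback {

    // Classify horizontal direction from a normalized bounding box [0, 1],
    // splitting the frame into left / ahead / right thirds (an assumed heuristic).
    static String direction(float left, float right) {
        float cx = (left + right) / 2f; // horizontal center of the box
        if (cx < 1f / 3f) return "left";
        if (cx > 2f / 3f) return "right";
        return "ahead";
    }

    // Pinhole-camera approximation: distance = realHeight * focalPx / boxHeightPx.
    // Needs an assumed real-world height per object class and a calibrated focal length.
    static double estimateDistanceMeters(double realHeightM, double focalLengthPx,
                                         double boxHeightPx) {
        return realHeightM * focalLengthPx / boxHeightPx;
    }

    // Compose the phrase that would be handed to text-to-speech.
    static String announcement(String label, String dir, double distM) {
        return String.format("%s %s, about %.1f meters", label, dir, distM);
    }

    public static void main(String[] args) {
        // Example: a "person" box in the right third of the frame, 300 px tall,
        // assuming a 1.7 m person and a 1000 px focal length.
        String dir = direction(0.70f, 0.95f);
        double d = estimateDistanceMeters(1.7, 1000, 300);
        System.out.println(announcement("person", dir, d));
    }
}
```

In an actual on-device app, the per-class real-world heights and the focal length would need calibration, and the announcement string would be spoken rather than printed.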

Keywords: Artificial Intelligence, Object Detection, Assistive Technology, Visually Impaired, Android Application, TensorFlow Lite, Real-Time Navigation, Text-to-Speech.


DOI: 10.17148/IJARCCE.2026.151132

How to Cite:

[1] Yogesh M, Dr. Madhu HK, "AI Eyes: A Real-Time Assistive Mobile Application for Visually Impaired People Using Object Detection," International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE), DOI: 10.17148/IJARCCE.2026.151132
