Abstract: Emotion-based video playing systems represent a burgeoning field of research aimed at enhancing user engagement and satisfaction. This paper introduces a novel approach to such systems, employing facial recognition technology to detect users' emotional states in real time. By analyzing facial expressions, the system identifies emotional cues and selects video content tailored to the user's mood. We present a comprehensive framework that integrates deep learning techniques, particularly Convolutional Neural Networks (CNNs), for accurate emotion recognition. Furthermore, we propose a dynamic recommendation mechanism that continuously adapts to users' changing emotional states during video playback. Experimental evaluations on diverse datasets demonstrate the effectiveness and robustness of the proposed system, which outperforms existing methods in terms of accuracy and user experience. This research paves the way for emotion-aware video playing systems that can intuitively respond to users' emotions, offering personalized and immersive viewing experiences.
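To make the pipeline described in the abstract concrete, the following is a minimal Python sketch of the real-time loop: OpenCV's bundled Haar cascade detects the face, a CNN classifies the emotion, and the result selects a video to play. The model file (emotion_cnn.h5), the 48x48 grayscale input size, the emotion label set, and the emotion-to-video mapping are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Minimal sketch of an emotion-driven video selection loop (assumptions noted above).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]
VIDEO_FOR_EMOTION = {e: f"videos/{e}.mp4" for e in EMOTIONS}  # hypothetical paths

# Haar cascade shipped with OpenCV for frontal face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
# Assumed pre-trained CNN taking 48x48 grayscale faces, outputting one score per label.
emotion_cnn = load_model("emotion_cnn.h5")

def detect_emotion(frame):
    """Return the dominant emotion label for the largest detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # pick the largest face
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    probs = emotion_cnn.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))]

cap = cv2.VideoCapture(0)  # webcam feed
ret, frame = cap.read()
if ret:
    emotion = detect_emotion(frame)
    if emotion is not None:
        print("Detected:", emotion, "-> playing", VIDEO_FOR_EMOTION[emotion])
cap.release()
```

In a full system this check would run periodically during playback so the recommendation can adapt as the viewer's expression changes, rather than classifying a single frame as shown here.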

Keywords: Haar Cascade, Convolutional Neural Networks (CNNs), Emotion-based, real-time.

Cite:
Asma Attar, Namratha N Murthy, Rakesh Sharma, Yashaswini K P, and Mr. Shivaprasad T K, "Emotion Detection Based Video Playing System Using Artificial Intelligence", IJARCCE International Journal of Advanced Research in Computer and Communication Engineering, vol. 13, no. 3, 2024. Crossref: https://doi.org/10.17148/IJARCCE.2024.13391.


DOI: 10.17148/IJARCCE.2024.13391
