Abstract: The rising burden of anxiety and depression calls for accessible and effective monitoring tools. This study presents an AI-powered mental health monitoring system that predicts mental states by integrating self-reported text, wearable data such as heart rate variability (HRV), and video-based emotion analysis. Physiological changes captured by wearable devices are classified with a Random Forest, while facial emotional cues are detected by convolutional neural networks (CNNs) that process video input. A contextual recurrent neural network (RNN) processes emotions in self-reported text and supports Cognitive Behavioral Therapy (CBT). The findings show that individual-level stress, emotional patterns, and mental health risk can be identified and graded from moderate to critical. The integrated model improves the accuracy, accessibility, and real-time tracking of mental health indicators, supporting earlier diagnosis and potentially alerting clinicians to emerging symptoms. Classification models were developed using an annotated dataset, machine learning techniques, and experimental evaluation.
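The wearable-data branch described above can be sketched as follows. This is a minimal, hypothetical illustration assuming scikit-learn and synthetic HRV features; the feature names (SDNN, RMSSD, pNN50), class labels, and data are illustrative assumptions, not the paper's actual dataset or pipeline.

```python
# Hypothetical sketch: grading mental health risk from HRV features with a
# Random Forest, as the abstract describes for wearable data.
# All feature values below are synthetic toy data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic HRV feature vectors: [SDNN (ms), RMSSD (ms), pNN50 (%)].
# Suppressed HRV is loosely associated with higher stress in this toy setup.
moderate_risk = rng.normal(loc=[60, 45, 30], scale=5, size=(100, 3))
critical_risk = rng.normal(loc=[30, 20, 10], scale=5, size=(100, 3))

X = np.vstack([moderate_risk, critical_risk])
y = np.array([0] * 100 + [1] * 100)  # 0 = moderate risk, 1 = critical risk

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Predict the risk level for a new reading with clearly suppressed HRV
sample = np.array([[32.0, 22.0, 12.0]])
print(clf.predict(sample))  # → [1]
```

In a full multimodal system, this classifier's output would be one signal fused with the CNN-based video emotion scores and the RNN-based text analysis before producing the final risk level.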

Keywords: Mental health monitoring, Wearable devices, HRV, Random Forest, Video emotion analysis, Cognitive Behavioral Therapy (CBT), Convolutional Neural Networks (CNNs), LSTM, Contextual AI Systems.


DOI: 10.17148/IJARCCE.2025.1411150

How to Cite:

[1] Kajal Patel, Nidhi Bhavsar, Komal Dhule, Apeksha Waghmare, Manivannan Panchanatham, "AI-Driven Mental Health Monitoring Through Wearable Biometrics and Video Emotion Analysis," International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE), DOI: 10.17148/IJARCCE.2025.1411150
