Abstract: This project aims to develop an approach for predicting a user's mental-health state by combining facial expression analysis with traditional questionnaires. Leveraging advances in facial recognition technology and machine learning algorithms, the system interprets facial expressions to identify emotional states while users concurrently complete standardized mental-health questionnaires. The resulting multimodal dataset, consisting of facial expression data and self-reported questionnaire responses, is then analysed with deep learning techniques to extract meaningful patterns and correlations. A panel for suggestions and improvement is also proposed.
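The abstract describes a two-stream pipeline: facial expressions captured and classified via OpenCV (the CV2 keyword below) and a machine-learning model, fused with scores from a standardized questionnaire. The paper does not detail the implementation, so the following is only a minimal sketch under assumed choices: a Haar-cascade face detector from OpenCV, a hypothetical CNN emotion classifier (`model.predict` in a Keras-like interface), and a simple weighted late fusion with a PHQ-9-style questionnaire total.

```python
# Minimal illustrative sketch (not the authors' implementation):
# detect a face with OpenCV's bundled Haar cascade, crop and resize it,
# then fuse a hypothetical emotion probability with a questionnaire score.
import cv2


def detect_face(frame):
    """Return the largest detected face region as a 48x48 grayscale crop, or None."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest box
    return cv2.resize(gray[y:y + h, x:x + w], (48, 48))


def emotion_score(face_img, model):
    """Hypothetical CNN classifier: returns an estimated probability of negative affect."""
    batch = face_img.astype("float32")[None, ..., None] / 255.0  # shape (1, 48, 48, 1)
    return float(model.predict(batch)[0][0])


def fused_risk(face_prob, questionnaire_total, max_score=27, w_face=0.4):
    """Illustrative weighted late fusion of the two modalities."""
    q_norm = questionnaire_total / max_score  # normalize questionnaire total to [0, 1]
    return w_face * face_prob + (1 - w_face) * q_norm
```

The fusion weight, the 48x48 input size, and the questionnaire maximum of 27 are assumptions chosen only to make the sketch concrete; the paper's actual deep learning architecture and fusion strategy may differ.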

Keywords: Automated, CV2, Face Detection, Recognition, Machine Learning, Mental Health


DOI: 10.17148/IJARCCE.2024.13517

How to Cite:

[1] Supriya Jawale, Ashwini Varma, Sanjiri Kulkarni, Sayali Uchake, Akanksha Mogare, Radhika Badkhal, "MENTAL HEALTH PREDICTION VIA FACIAL EXPRESSION & QUESTIONNAIRE," International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE), DOI: 10.17148/IJARCCE.2024.13517
