Research Projects
1. Eye Gaze Estimation with Limited Supervision: [PhD Thesis]
Automatic eye gaze estimation has interested researchers for a long time. The most promising techniques either require specialized hardware or rely on supervised machine learning over images. Hardware-based methods require device calibration and operator assistance, while supervised-learning methods require large volumes of labelled training data, and labelling data for human gaze behaviour analysis is complex and time-consuming. In this project, I mainly exploit domain knowledge to label raw data in a generic way, with a focus on learning representations in a self-supervised manner using deep learning techniques.
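The idea of labelling raw data with domain knowledge can be illustrated with a minimal sketch in the spirit of Speak2Label (below), where a driver speaks a gaze-zone number while looking at it, so an ASR transcript can weakly label video frames. The function name and transcript format here are hypothetical, for illustration only.

```python
# Hypothetical sketch: convert an ASR transcript of spoken zone numbers
# into weak per-frame gaze-zone labels (Speak2Label-style labelling).
WORD_TO_ZONE = {"one": 1, "two": 2, "three": 3, "four": 4}

def label_frames(transcript, frame_times, hold=1.0):
    """Assign a zone label to each frame falling within `hold` seconds
    after a spoken zone word; frames outside any window stay unlabelled."""
    labels = [None] * len(frame_times)
    for spoken_at, word in transcript:
        zone = WORD_TO_ZONE.get(word)
        if zone is None:
            continue  # not a zone word; ignore
        for i, t in enumerate(frame_times):
            if spoken_at <= t < spoken_at + hold:
                labels[i] = zone
    return labels
```

The `hold` window reflects the domain assumption that the driver keeps looking at the zone for a short time after naming it.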
Research Outcome:
Shreya Ghosh, Abhinav Dhall, Munawar Hayat, Jarrod Knibbe and Qiang Ji. Automatic Gaze Analysis: A Survey of Deep Learning based Approaches. TPAMI 2023. [Link] [Github]
Shreya Ghosh, Abhinav Dhall, Jarrod Knibbe and Munawar Hayat. Labelling the Gaps: A Weakly Supervised Automatic Eye Gaze Estimation. ACCV 2022 (Accepted).
Shreya Ghosh, Abhinav Dhall, Munawar Hayat and Jarrod Knibbe. AV-Gaze: A Study on the Effectiveness of Audio Guided Gaze Estimation for Non-Profilic Faces. In ICIP 2022. [Link]
Shreya Ghosh, Munawar Hayat, Abhinav Dhall and Jarrod Knibbe. MTGLS: Multi-Task Gaze Estimation with Limited Supervision. In WACV 2022. (Accepted) [Link] [Video]
Shreya Ghosh, Abhinav Dhall, Garima Sharma, Sarthak Gupta and Nicu Sebe. Speak2Label: Using Domain Knowledge for Creating a Large Scale Driver Gaze Zone Estimation Dataset. In ICCVW 2021. [Link] [Project Page] [Demo]
Neeru Dubey, Shreya Ghosh, Abhinav Dhall (2019). Learning a Rich Eye Gaze Representation with an Unsupervised Technique. In IJCNN 2019 [Project Page]
Neeru Dubey, Shreya Ghosh, Abhinav Dhall. A Self-supervised Approach for Learning Eye Gaze Representation. (Under Review)
2. Group Affect Analysis: [Masters Thesis]
Group Level Emotion:
Understanding the emotion of a group of people in an image or video is an important problem. For details on group-level emotion recognition, a talk by Prof. Goecke is available at: http://videolectures.net/fgconference2015_goecke_people_images/
Group Level Cohesion:
Cohesiveness of a group is an essential indicator of the emotional state, structure and success of a group of people. We study the factors that influence the perception of group-level cohesion and propose methods for estimating human-perceived cohesion on the Group Cohesiveness Scale (GCS). Building on the Group Affect database, we add GCS annotations and propose the 'GAF-Cohesion database'. Interestingly, when GCS is jointly trained as an attribute alongside group-level emotion prediction, it improves performance on the latter task, suggesting that group-level emotion and GCS are correlated.
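The joint-training idea can be sketched as a shared representation feeding two heads, one classifying group emotion and one regressing GCS, optimized with a combined loss. This is a minimal NumPy illustration, not the trained network from the papers; all function names and the loss weight `alpha` are assumptions.

```python
import numpy as np

# Hypothetical multi-task sketch: shared image features feed two linear
# heads, one for group emotion (3 classes) and one for the Group
# Cohesiveness Scale (scalar regression), trained with a combined loss.
def forward(features, w_emo, w_coh):
    logits = features @ w_emo                               # (N, 3) emotion scores
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)            # softmax over classes
    cohesion = features @ w_coh                             # (N,) predicted GCS
    return probs, cohesion

def multitask_loss(probs, cohesion, y_emo, y_coh, alpha=0.5):
    ce = -np.log(probs[np.arange(len(y_emo)), y_emo]).mean()  # emotion cross-entropy
    mse = ((cohesion - y_coh) ** 2).mean()                    # GCS regression error
    return ce + alpha * mse
```

Sharing the feature extractor is what lets the correlated GCS signal act as a regularizer for the emotion task.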
Most Influential Person:
Group affect analysis is an important cue for predicting various group traits. Generally, group affect, emotional responses, eye gaze and the positions of people in an image are important cues for identifying an important person in a group. The main focus of this study is to explore the importance of group affect in finding the representative of a group. We call that person the "Most Influential Person" (for the first impression) or "leader" of a group. To identify the main visual cues for the "Most Influential Person", we conducted a user survey. Based on the survey statistics, we annotated the "influential persons" in 1,000 images of the Group AFfect database (GAF 2.0) via the LabelMe toolbox and propose the "GAF-personage database".
Research Outcome:
Shreya Ghosh, Abhinav Dhall, Nicu Sebe, Tom Gedeon. Automatic Prediction of Group Cohesiveness in Images. In IEEE Transactions on Affective Computing. [Link]
Abhinav Dhall, Roland Goecke, Shreya Ghosh and Tom Gedeon (2019). EmotiW 2019: Automatic Emotion, Engagement and Cohesion Prediction Tasks. In ACM-ICMI [link] [Page]
Garima Sharma, Shreya Ghosh and Abhinav Dhall (2019). Automatic Group Level Affect and Cohesion Prediction in Videos. In ACII EMERGent 2019 [Project Page]
Shreya Ghosh, Abhinav Dhall, Nicu Sebe, Tom Gedeon (2019). Predicting Group Cohesiveness in Images. In IJCNN 2019 [link] [Project Page]
Shreya Ghosh, Abhinav Dhall (2018). Role of group level affect to find the most influential person in images. In ECCV workshop (HBUGEN 2018)[link] [Github] [Project Page]
Shreya Ghosh, Abhinav Dhall, Nicu Sebe (2018). Automatic group affect analysis in images via visual attribute and feature networks. IEEE-ICIP [link] [Github] [Project Page] [Demo1, Demo2]
Abhinav Dhall, Roland Goecke, Shreya Ghosh, Jyoti Joshi, Jesse Hoey, Tom Gedeon (2017). From individual to group-level emotion recognition: emotiw 5.0. In ACM-ICMI [link] [Github] [Project Page]
3. Pain Estimation from Body Sensor:
Automatic chronic pain assessment and pain intensity estimation have been attracting growing attention due to their widespread applications. One prevalent issue in automatic pain analysis is the lack of balanced, expert-labelled data. This work proposes an anomaly-detection-based network that addresses this limitation. The network is evaluated on pain intensity recognition and protective behaviour prediction from body movements in the EmoPain Challenge dataset, which provides body-part-based sensor data for both tasks. The proposed network is a lightweight LSTM-DNN model that takes sensor-based features as input and predicts the pain intensity level and the presence or absence of protective behaviour in chronic low back pain patients. Joint training with body movement patterns corresponding to pain exhibition, such as exercise type, as an auxiliary label improves the performance of the network. However, contrary to expectation, protective behaviour occurs only sporadically alongside pain in the EmoPain dataset, which further complicates its accurate prediction. We resolve this by incorporating anomaly detection into the network. The paper presents a detailed comparison of models with varied features, showing a significant improvement with the final anomaly-detection-based network.
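Because protective behaviour is sporadic, treating it as an anomaly relative to "normal" movement is the key intuition. The following is a deliberately simple sketch of that idea using per-feature z-scores, not the paper's LSTM-DNN; function names and the threshold are assumptions for illustration.

```python
import numpy as np

# Minimal anomaly-detection sketch: fit statistics on movement-feature
# windows without protective behaviour, then flag windows whose
# standardized deviation from those statistics is extreme.
def fit_normal(windows):
    """Estimate per-feature mean and std from normal-movement windows."""
    mu = windows.mean(axis=0)
    sigma = windows.std(axis=0) + 1e-8  # avoid division by zero
    return mu, sigma

def anomaly_score(window, mu, sigma):
    """Largest absolute z-score across features for one window."""
    return np.abs((window - mu) / sigma).max()

def predict_protective(windows, mu, sigma, threshold=3.0):
    """Flag windows whose anomaly score exceeds the threshold."""
    return [anomaly_score(w, mu, sigma) > threshold for w in windows]
```

A learned model would replace the z-score with, e.g., reconstruction error, but the sporadic-event framing is the same.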
Research Outcome:
Shreya Ghosh (2021). Automatic Pain Intensity and Protective Behaviour Prediction. Extended Abstract at Indo-Canadian Conference on Artificial Intelligence and Rehabilitation Robotics. (Website, YouTube)
Shreya Ghosh*, Yi Li* and Jyoti Joshi (2021). PLAAN: Pain Level Assessment with Anomaly-detection based Network. In Journal on Multimodal User Interfaces. [Link] *contributed equally.
Yi Li, Shreya Ghosh, Jyoti Joshi and Sharon Oviatt (2020). LSTM-DNN based Approach for Pain Intensity and Protective Behaviour Prediction. In IEEE FG workshop. [Link]
4. Depression Intensity Estimation from Social Media:
Depression has become a major cause of suicide, especially among teenagers. Clinical psychologists generally diagnose depression via face-to-face interviews following clinical depression criteria; however, patients often do not consult doctors in the early stages of depression. People increasingly use social media to express their mood, and this has recently been found quite effective for detecting early symptoms of depression. In this work, we aim to identify depressed users and estimate their depression intensity by leveraging social media (Twitter), in order to aid in raising an alarm. To this end, we weakly label the data in a self-supervised manner, extract a rich set of features, and train a small LSTM (Long Short-Term Memory) network to predict depression intensities. We perform extensive experiments to establish the efficacy of our method.
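The weak-labelling step can be illustrated with a toy lexicon-based scorer: tweets are matched against depression-related terms to produce a noisy intensity label that can supervise the LSTM downstream. The lexicon, scale, and function name here are illustrative assumptions, not the paper's actual labelling scheme.

```python
# Hypothetical weak-labelling sketch: score a user's tweets against a
# small depression-related lexicon (illustrative words and weights only)
# to produce a noisy intensity label on a 0-3 scale.
LEXICON = {"hopeless": 2, "worthless": 2, "sad": 1, "tired": 1, "alone": 1}

def weak_intensity(tweets):
    """Average per-tweet lexicon score, clipped to a 0-3 intensity scale."""
    scores = []
    for text in tweets:
        words = text.lower().split()
        scores.append(sum(LEXICON.get(w, 0) for w in words))
    avg = sum(scores) / max(len(scores), 1)
    return min(round(avg), 3)
```

Labels produced this way are noisy by design; the point is that they require no manual annotation, which is what makes training at social-media scale feasible.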
Research Outcome:
Shreya Ghosh, Tarique Anwar. Depression Intensity Estimation Via Social Media: A Deep Learning Based Approach. In IEEE Transactions on Computational Social Systems 2021 (Accepted). [Link]