Autism classification and monitoring from predicted categorical and dimensional emotions of video features
Abstract
The prevalence of autism in children has been rising at an alarming rate, and currently about 1% of children are affected by the disorder. It can be managed more effectively through early diagnosis and treatment. Autistic children are characterised by deficits in communication and social interaction and are most commonly identified by their stimming behaviours, so it is helpful to understand their emotions while they exhibit this type of behaviour. However, most current affect recognition approaches predict either basic emotion categories or continuous dimensional emotions, but not both. We propose an approach that maps basic emotion categories to continuous dimensional emotions, opening further avenues for understanding the emotions of autistic children. In our approach, a convolutional neural network first predicts the basic emotion category, and a deep regression model then predicts the continuous emotions. Moreover, our method is deployed as a web application for visual video monitoring. For autism analysis, we performed image-based and video-based classification of stimming behaviours using the extracted behavioural and emotional features. Our emotion classifier achieved a competitive F1-score, while our regression model performed strongly against existing methods in terms of concordance correlation coefficient (CCC) and root mean square error (RMSE). Image-based autism analysis did not yield meaningful classification from emotional features, but textural features provided useful cues. In video-based autism analysis, our chosen clustering algorithm grouped stimming behaviours into distinct clusters, each demonstrating a dominant emotion category.
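To illustrate the categorical-to-dimensional mapping idea described above, the sketch below assigns each basic emotion category a prototypical (valence, arousal) point on the circumplex and maps a classifier's probability distribution over categories to a continuous estimate by probability-weighted averaging. This is only a minimal sketch: the prototype coordinates are hypothetical values chosen for illustration, and in the actual pipeline this step is performed by a learned deep regression model rather than a fixed lookup table.

```python
# Hypothetical prototypical (valence, arousal) points for the basic
# emotion categories; the numbers are illustrative assumptions, not
# values from the paper.
PROTOTYPES = {
    "happy":    ( 0.8,  0.5),
    "sad":      (-0.6, -0.4),
    "angry":    (-0.5,  0.7),
    "fear":     (-0.6,  0.6),
    "surprise": ( 0.3,  0.8),
    "disgust":  (-0.7,  0.3),
    "neutral":  ( 0.0,  0.0),
}

def to_dimensional(probs):
    """Map category probabilities to a (valence, arousal) estimate
    as the probability-weighted average of the prototype points."""
    valence = sum(p * PROTOTYPES[cat][0] for cat, p in probs.items())
    arousal = sum(p * PROTOTYPES[cat][1] for cat, p in probs.items())
    return valence, arousal

if __name__ == "__main__":
    # A frame classified as mostly happy, slightly surprised:
    probs = {"happy": 0.7, "surprise": 0.2, "neutral": 0.1}
    v, a = to_dimensional(probs)
    print(round(v, 3), round(a, 3))
```

The weighted average keeps the continuous prediction consistent with the categorical one (a frame labelled mostly "happy" lands in the positive-valence region), which mirrors the motivation for predicting the category before regressing the dimensions.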