Generic semi-supervised adversarial subject translation for sensor-based activity recognition - Université Paris-Est-Créteil-Val-de-Marne
Journal article, Neurocomputing, 2022

Dates and versions

hal-04030634, version 1 (15-03-2023)

Cite

Elnaz Soleimani, Ghazaleh Khodabandelou, Abdelghani Chibani, Yacine Amirat. Generic semi-supervised adversarial subject translation for sensor-based activity recognition. Neurocomputing, 2022, 500, pp. 649-661. ⟨10.1016/j.neucom.2022.05.075⟩. ⟨hal-04030634⟩

Abstract: The performance of Human Activity Recognition (HAR) models, particularly deep neural networks, is highly contingent upon the availability of massive amounts of annotated training data. However, data collection and manual labeling in the HAR domain are prohibitively expensive, as both steps depend on human resources. Hence, domain adaptation techniques have been proposed to adapt knowledge from existing sources of data. More recently, adversarial transfer learning methods have shown promising results for visual classification, yet remain of limited use for HAR problems, which are still prone to the unfavorable effects of imbalanced sample distributions. This paper presents a novel generic semi-supervised approach that exploits the adversarial framework to tackle these shortcomings by leveraging annotated samples exclusively from the source subject and unlabeled ones from the target subject. Extensive subject-translation experiments are conducted on three datasets of large, medium, and small size with different levels of imbalance, to assess the robustness of the proposed model to both the scale of and the imbalance in the data. The results demonstrate the effectiveness of the proposed algorithms over state-of-the-art methods, yielding improvements of up to 13%, 4%, and 13% in high-level activity recognition metrics on the Opportunity, LISSI, and PAMAP2 datasets, respectively.

Collections

LISSI UPEC