JIMALE, ALI OLOW (2022) Fully Connected Generative Adversarial Network for Human Activity Recognition. IEEE Access.
s12884-022-04987-3.pdf - Published Version
Download (1MB)
Abstract
Conditional Generative Adversarial Networks (CGANs) have shown great promise in generating
synthetic data for sensor-based activity recognition. However, one key issue with existing CGANs
is the design of the network architecture, which affects sample quality. This study proposes an effective
CGAN architecture that synthesizes higher-quality samples than state-of-the-art CGAN architectures. This
is achieved by combining convolutional layers with multiple fully connected networks at the generator's
input and the discriminator's output of the CGAN. We demonstrate the effectiveness of the proposed approach
on elderly data for sensor-based activity recognition. Visual evaluation, a similarity measure, and a usability
evaluation are used to assess the quality of the samples generated by the proposed approach and to validate its
performance in activity recognition. Compared with the state-of-the-art CGAN, the visual evaluation and the
similarity measure show, respectively, that the proposed model's synthetic data represents the actual data more
accurately and exhibits greater variation within each synthetic sample. The experimental stages of the usability
evaluation, in turn, show performance gains of 2.5%, 2.5%, 3.1%, and 4.4% over the state-of-the-art CGAN when
using synthetic samples generated by the proposed architecture.
INDEX TERMS: Activity recognition, deep learning, generative adversarial network.
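
The page does not specify the architecture in detail, but a minimal sketch can illustrate the idea the abstract describes: combining convolutional layers with multiple fully connected networks at the generator's input and the discriminator's output of a conditional GAN. The sketch below uses PyTorch; the window length, channel count, number of activity classes, latent dimension, and all layer sizes are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of a conditional GAN for sensor windows, with dense
# (fully connected) branches feeding the generator's convolutional stage
# and a dense stack following the discriminator's convolutional stage.
# All hyperparameters below are assumptions for illustration only.

import torch
import torch.nn as nn

NUM_CLASSES = 6      # assumed number of activity classes
LATENT_DIM = 100     # assumed noise dimension
SEQ_LEN = 128        # assumed sensor window length (timesteps)
CHANNELS = 3         # assumed tri-axial accelerometer channels


class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # Multiple fully connected networks at the generator input:
        # one branch embeds the noise, another embeds the class label.
        self.noise_fc = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, 32 * (SEQ_LEN // 4)), nn.ReLU(),
        )
        self.label_fc = nn.Sequential(
            nn.Embedding(NUM_CLASSES, 50),
            nn.Linear(50, 32 * (SEQ_LEN // 4)), nn.ReLU(),
        )
        # Transposed convolutions upsample the fused features to a full window.
        self.conv = nn.Sequential(
            nn.ConvTranspose1d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(32, CHANNELS, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, z, labels):
        n = self.noise_fc(z).view(-1, 32, SEQ_LEN // 4)
        c = self.label_fc(labels).view(-1, 32, SEQ_LEN // 4)
        return self.conv(torch.cat([n, c], dim=1))   # (B, CHANNELS, SEQ_LEN)


class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional feature extractor over the (real or synthetic) window.
        self.conv = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv1d(32, 64, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Flatten(),
        )
        self.label_fc = nn.Sequential(
            nn.Embedding(NUM_CLASSES, 50),
            nn.Linear(50, 64), nn.LeakyReLU(0.2),
        )
        # Multiple fully connected layers at the discriminator output.
        self.out_fc = nn.Sequential(
            nn.Linear(64 * (SEQ_LEN // 4) + 64, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 64), nn.LeakyReLU(0.2),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, x, labels):
        features = self.conv(x)
        cond = self.label_fc(labels)
        return self.out_fc(torch.cat([features, cond], dim=1))


if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    z = torch.randn(8, LATENT_DIM)
    y = torch.randint(0, NUM_CLASSES, (8,))
    fake = G(z, y)        # (8, CHANNELS, SEQ_LEN)
    score = D(fake, y)    # (8, 1)
    print(fake.shape, score.shape)
```

In this sketch, separate dense branches embed the noise vector and the class label before the generator's convolutional stage, and a stack of dense layers follows the discriminator's convolutional feature extractor; the paper's actual branch counts and layer sizes may differ.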
| Field | Value |
|---|---|
| Item Type | Article |
| Subjects | A General Works > AC Collections. Series. Collected works |
| Divisions | Faculty of Computing |
| Depositing User | Unnamed user with email crd@smiad.edu.so |
| Date Deposited | 10 Sep 2025 12:36 |
| Last Modified | 10 Sep 2025 12:36 |
| URI | https://repository.simad.edu.so/id/eprint/49 |