FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Mao, Axiu; Huang, Endai; Gan, Haiming et al.
Detail(s)
Original language | English |
---|---|
Article number | 2142 |
Journal / Publication | Animals |
Volume | 12 |
Issue number | 16 |
Online published | 21 Aug 2022 |
Publication status | Published - Aug 2022 |
Link(s)
DOI | DOI |
---|---|
Attachment(s) | Documents |
Publisher's Copyright Statement | |
Link to Scopus | https://www.scopus.com/record/display.uri?eid=2-s2.0-85137240928&origin=recordpage |
Permanent Link | https://scholars.cityu.edu.hk/en/publications/publication(a0117a73-07e7-4207-b765-c6cd658c9039).html |
Abstract
Deep learning dominates automated animal activity recognition (AAR) tasks owing to its high performance on large-scale datasets. However, constructing centralised datasets from diverse farms raises data privacy concerns. Federated learning (FL) provides a distributed learning solution that trains a shared model by coordinating multiple farms (clients) without sharing their private data, but directly applying FL to AAR tasks often faces two challenges: client-drift during local training and local gradient conflicts during global aggregation. In this study, we develop a novel FL framework called FedAAR to achieve AAR with wearable sensors. Specifically, we devise a prototype-guided local update module to alleviate the client-drift issue, which introduces a global prototype as shared knowledge to force clients to learn consistent features. To reduce gradient conflicts between clients, we design a gradient-refinement-based aggregation module that eliminates conflicting components between local gradients during global aggregation, thereby improving agreement between clients. Experiments are conducted on a public dataset consisting of 87,621 two-second accelerometer and gyroscope samples to verify FedAAR's effectiveness. The results demonstrate that FedAAR outperforms the state-of-the-art in precision (75.23%), recall (75.17%), F1-score (74.70%), and accuracy (88.88%). Ablation experiments show FedAAR's robustness to various factors (i.e., data sizes, communication frequency, and client numbers).
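The two modules described in the abstract lend themselves to a compact illustration. The sketch below is not the authors' released code; it is a minimal NumPy mock-up written under stated assumptions, with hypothetical names (`prototype_loss`, `refine_gradients`) and a simple projection rule assumed for the conflict-removal step (in the spirit of gradient-projection methods such as PCGrad). The actual FedAAR formulation may differ in detail.

```python
import numpy as np

def prototype_loss(local_features, global_prototypes, labels):
    """Prototype-guided local update (sketch): pull each sample's feature
    towards the global class prototype shared by the server, so that clients
    learn consistent representations and client-drift is reduced."""
    diffs = local_features - global_prototypes[labels]   # (N, D) residuals per sample
    return np.mean(np.sum(diffs ** 2, axis=1))            # mean squared distance to prototype

def refine_gradients(client_grads):
    """Gradient-refinement-based aggregation (sketch): before averaging,
    remove from each client's gradient any component that conflicts with
    (i.e., has negative inner product with) another client's gradient.
    A PCGrad-style projection is assumed here purely for illustration."""
    refined = []
    for i, g in enumerate(client_grads):
        g = g.copy()
        for j, other in enumerate(client_grads):
            if i == j:
                continue
            dot = np.dot(g, other)
            if dot < 0:  # conflicting direction: project it out
                g -= dot / (np.dot(other, other) + 1e-12) * other
        refined.append(g)
    return np.mean(refined, axis=0)                        # aggregated global update

# Toy usage: three clients with partially conflicting gradients.
rng = np.random.default_rng(0)
grads = [rng.normal(size=8) for _ in range(3)]
print("aggregated update:", refine_gradients(grads))

feats = rng.normal(size=(5, 4))
protos = rng.normal(size=(3, 4))                           # 3 activity classes (hypothetical)
labels = np.array([0, 1, 2, 0, 1])
print("prototype loss:", prototype_loss(feats, protos, labels))
```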
Research Area(s)
- data privacy, animal behaviour, deep learning, distributed learning, client-drift, local gradient conflicts
Citation Format(s)
FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors. / Mao, Axiu; Huang, Endai; Gan, Haiming et al.
In: Animals, Vol. 12, No. 16, 2142, 08.2022.