Multimodal Multipart Learning for Action Recognition in Depth Videos
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Detail(s)
| Original language | English |
| --- | --- |
| Article number | 7346486 |
| Pages (from-to) | 2123-2129 |
| Journal / Publication | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 38 |
| Issue number | 10 |
| Online published | 2 Dec 2015 |
| Publication status | Published - Oct 2016 |
Abstract
The articulated and complex nature of human actions makes action recognition a difficult task. One approach to handling this complexity is to divide an action into the kinetics of individual body parts and to analyze it through these partial descriptors. We propose a joint sparse regression based learning method that utilizes structured sparsity to model each action as a combination of multimodal features drawn from a sparse set of body parts. To represent the dynamics and appearance of the parts, we employ a heterogeneous set of depth- and skeleton-based features. The structure of the multimodal multipart features is formulated into the learning framework via the proposed hierarchical mixed norm, which regularizes the structured features within each part and applies sparsity between parts, in favor of group feature selection. Our experimental results demonstrate the effectiveness of the proposed learning method: it outperforms competing methods on all three tested datasets and saturates one of them by achieving perfect accuracy.
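To make the regularization idea in the abstract concrete, the sketch below implements a simplified stand-in for the hierarchical mixed norm: a flat group (l2,1) mixed-norm regularized regression over body-part feature groups, solved by proximal gradient descent (ISTA). All names, the group boundaries, the step size, and the regularization weight are hypothetical illustrations, not the paper's actual formulation, which nests modalities within parts.

```python
import numpy as np

def group_prox(W, groups, lam, step):
    """Proximal operator of lam * sum_g ||W[g, :]||_2 (group soft-thresholding)."""
    W = W.copy()
    for g in groups:                      # g: row indices of one body part's features
        norm = np.linalg.norm(W[g, :])
        scale = max(0.0, 1.0 - step * lam / (norm + 1e-12))
        W[g, :] *= scale                  # shrink the whole group; zeroed groups are deselected parts
    return W

def fit(X, Y, groups, lam=0.1, step=None, iters=500):
    """Minimize 0.5 * ||X W - Y||_F^2 + lam * sum_g ||W_g||_2 via proximal gradient."""
    d, k = X.shape[1], Y.shape[1]
    if step is None:                      # 1 / Lipschitz constant of the smooth gradient
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    W = np.zeros((d, k))
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)          # gradient of the least-squares loss
        W = group_prox(W - step * grad, groups, lam, step)
    return W

# Toy usage: 3 hypothetical "parts" of 5 features each, 4 action classes (one-hot Y).
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 15))
Y = np.eye(4)[rng.integers(0, 4, size=60)]
groups = [list(range(0, 5)), list(range(5, 10)), list(range(10, 15))]
W = fit(X, Y, groups)
print([np.linalg.norm(W[g, :]) for g in groups])  # some group norms shrink toward zero
```

The group-wise soft-thresholding is what produces sparsity *between* parts while keeping features *within* a selected part jointly active, which is the behavior the abstract attributes to the mixed norm.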
Research Area(s)
- Action recognition, group feature selection, joint sparse regression, Kinect, mixed norms, structured sparsity
Citation Format(s)
Multimodal Multipart Learning for Action Recognition in Depth Videos. / Shahroudy, Amir; Ng, Tian-Tsong; Yang, Qingxiong et al.
In: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 38, No. 10, 7346486, 10.2016, p. 2123-2129.