Exploring Gradient Flow Based Saliency for DNN Model Compression

Research output: Chapters, Conference Papers, Creative and Literary Works (RGC: 12, 32, 41, 45) › Refereed conference paper (with ISBN/ISSN) › peer-review


Detail(s)

Original language: English
Title of host publication: MM '21
Subtitle of host publication: Proceedings of the 29th ACM International Conference on Multimedia
Place of publication: New York, NY
Publisher: Association for Computing Machinery
Pages: 3238–3246
ISBN (Print): 9781450386517
Publication status: Published - Oct 2021

Publication series

Name: MM - Proceedings of the ACM International Conference on Multimedia

Conference

Title: 29th ACM International Conference on Multimedia (MM 2021)
Location: Hybrid (Onsite and Virtual)
Place: China
City: Chengdu
Period: 20–24 October 2021

Abstract

Model pruning aims to reduce the size or computational overhead of a deep neural network (DNN). Traditional pruning methods such as ℓ1 pruning, which evaluate channel significance, focus too heavily on the local analysis of each channel: they rely on the magnitude of the entire feature while ignoring its relevance to the batch normalization (BN) and ReLU layers that follow each convolutional operation. To overcome these problems, we propose a new model pruning method from the perspective of gradient flow. Specifically, we first theoretically analyze a channel's influence via a Taylor expansion that integrates the effects of the BN layer and the ReLU activation function. We then show that incorporating the first-order Taylor polynomial of the scaling and shifting parameters in the BN layer effectively indicates the significance of a channel in a DNN. Comprehensive experiments on both image classification and image denoising tasks demonstrate the superiority of the proposed theory and scheme. Code is available at https://github.com/CityU-AIM-Group/GFBS.
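To make the core idea concrete, below is a minimal NumPy sketch of a first-order-Taylor channel saliency built from the BN scaling (γ) and shifting (β) parameters, as described in the abstract. This is an illustrative reconstruction, not the authors' implementation (see the linked GFBS repository for that): the function names are hypothetical, and it assumes the gradients of the loss with respect to γ and β have already been computed by backpropagation.

```python
import numpy as np

def bn_channel_saliency(gamma, beta, grad_gamma, grad_beta):
    """First-order Taylor estimate of each channel's influence on the loss,
    combining BN's scaling (gamma) and shifting (beta) parameters:
    |dL/dgamma * gamma + dL/dbeta * beta| per channel."""
    return np.abs(grad_gamma * gamma + grad_beta * beta)

def prune_mask(saliency, keep_ratio):
    """Return a boolean mask keeping the top `keep_ratio` fraction of
    channels ranked by saliency."""
    k = max(1, int(round(keep_ratio * saliency.size)))
    threshold = np.sort(saliency)[::-1][k - 1]
    return saliency >= threshold

# Toy example: one BN layer with 6 channels and made-up gradients.
gamma      = np.array([1.20, 0.05,  0.80, 0.01, 0.90,  0.30])
beta       = np.array([0.10, 0.00, -0.20, 0.02, 0.05, -0.10])
grad_gamma = np.array([0.40, 0.50, -0.30, 0.20, 0.10, -0.60])
grad_beta  = np.array([0.05, 0.10,  0.20, -0.10, 0.00, 0.30])

saliency = bn_channel_saliency(gamma, beta, grad_gamma, grad_beta)
mask = prune_mask(saliency, keep_ratio=0.5)  # keep the 3 most salient channels
```

Note how a channel with a small γ but a large gradient (channel 1 above) scores low, while magnitude-only criteria would also discard channel 5, whose gradient makes it influential; this is the sense in which the gradient-flow view differs from purely local magnitude analysis.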

Research Area(s)

  • model pruning, convolutional neural networks, channel saliency

Bibliographic Note

Research Unit(s) information for this publication is provided by the author(s) concerned.

Citation Format(s)

Exploring Gradient Flow Based Saliency for DNN Model Compression. / Liu, Xinyu; Li, Baopu; Chen, Zhen; Yuan, Yixuan.

MM '21: Proceedings of the 29th ACM International Conference on Multimedia. New York, NY : Association for Computing Machinery, 2021. p. 3238–3246 (MM - Proceedings of the ACM International Conference on Multimedia).
