On-demand deep model compression for mobile devices : A usage-driven model selection framework

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

161 Scopus Citations

Author(s)

  • Sicong Liu
  • Kaiming Nan
  • Yingyan Lin
  • Hui Liu
  • Junzhao Du

Detail(s)

Original language: English
Title of host publication: MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services
Publisher: Association for Computing Machinery, Inc
Pages: 389-400
ISBN (print): 9781450357203
Publication status: Published - 10 Jun 2018
Externally published: Yes

Publication series

Name: MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services

Conference

Title: 16th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2018
Place: Germany
City: Munich
Period: 10 - 15 June 2018

Abstract

Recent research has demonstrated the potential of deploying deep neural networks (DNNs) on resource-constrained mobile platforms by trimming down the network complexity using different compression techniques. However, current practice investigates only stand-alone compression schemes, even though each compression technique may be well suited only for certain types of DNN layers. Moreover, these compression techniques are optimized merely for the inference accuracy of DNNs, without explicitly considering other application-driven system performance metrics (e.g., latency and energy cost) or the varying resource availability across platforms (e.g., storage and processing capability). In this paper, we explore the desirable tradeoff between performance and resource constraints, guided by user-specified needs, from a holistic system-level viewpoint. Specifically, we develop a usage-driven selection framework, referred to as AdaDeep, that automatically selects a combination of compression techniques for a given DNN, leading to an optimal balance between user-specified performance goals and resource constraints. In an extensive evaluation on five public datasets and across twelve mobile devices, AdaDeep achieves up to 9.8× latency reduction, 4.3× energy efficiency improvement, and 38× storage reduction in DNNs while incurring negligible accuracy loss. AdaDeep also uncovers multiple effective combinations of compression techniques unexplored in the existing literature. © 2018 Association for Computing Machinery.
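The selection problem the abstract describes can be pictured as a constrained search over combinations of compression techniques. The sketch below is a hypothetical illustration only, not the paper's method: AdaDeep's keywords indicate a deep-reinforcement-learning optimizer, whereas this sketch uses exhaustive enumeration, and the per-technique accuracy/latency/storage factors are invented for illustration.

```python
from itertools import combinations

# Hypothetical multiplicative effects of each compression technique on a DNN's
# accuracy, latency, and storage. These numbers are NOT from the paper.
TECHNIQUES = {
    "weight_pruning":     {"accuracy": 0.990, "latency": 0.70, "storage": 0.40},
    "quantization_8bit":  {"accuracy": 0.995, "latency": 0.80, "storage": 0.25},
    "low_rank_factorize": {"accuracy": 0.980, "latency": 0.60, "storage": 0.50},
}

def evaluate(combo, base):
    """Apply each selected technique's factors to the base model's metrics."""
    metrics = dict(base)
    for name in combo:
        for key, factor in TECHNIQUES[name].items():
            metrics[key] *= factor
    return metrics

def select(base, constraints, min_accuracy):
    """Enumerate technique combinations, discard those violating the accuracy
    floor or any resource constraint, and return the lowest-latency survivor."""
    best, best_metrics = None, None
    names = list(TECHNIQUES)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            m = evaluate(combo, base)
            if m["accuracy"] < min_accuracy:
                continue
            if any(m[key] > limit for key, limit in constraints.items()):
                continue
            if best_metrics is None or m["latency"] < best_metrics["latency"]:
                best, best_metrics = combo, m
    return best, best_metrics

# Example: a base model at 92% accuracy, 120 ms latency, 50 MB storage,
# targeting a device with at most 10 MB free storage.
base = {"accuracy": 0.92, "latency": 120.0, "storage": 50.0}
combo, metrics = select(base, constraints={"storage": 10.0}, min_accuracy=0.88)
```

In this toy setting, no single technique fits the 10 MB budget alone, so the search is forced to combine techniques, which mirrors the paper's observation that effective compression combinations can outperform stand-alone schemes.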

Research Area(s)

  • Deep learning, Deep reinforcement learning, Model compression

Bibliographic Note

Publication details (e.g. title, author(s), publication statuses and dates) are captured on an “AS IS” and “AS AVAILABLE” basis at the time of record harvesting from the data source. Suggestions for further amendments or supplementary information can be sent to [email protected].

Citation Format(s)

On-demand deep model compression for mobile devices: A usage-driven model selection framework. / Liu, Sicong; Nan, Kaiming; Lin, Yingyan et al.
MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services. Association for Computing Machinery, Inc, 2018. p. 389-400 (MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services).
