On the information of feature maps and pruning of deep neural networks

Mohammadreza Soltani, Suya Wu, Jie Ding, Robert Ravier, Vahid Tarokh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

A technique for compressing deep neural models that achieves performance competitive with state-of-the-art methods is proposed. The approach uses the mutual information between the feature maps and the output of the model to prune redundant layers of the network. Extensive numerical experiments on the CIFAR-10, CIFAR-100, and Tiny ImageNet data sets demonstrate that the proposed method is effective in compressing deep models, in terms of both the number of parameters and the number of operations. For instance, applying the proposed approach to a DenseNet model with 0.77 million parameters and 293 million operations for classification of the CIFAR-10 data set yields reductions of 62.66% and 41.00% in the number of parameters and the number of operations, respectively, while increasing the test error by less than 1%.
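The pruning criterion sketched in the abstract, scoring layers by the mutual information between their feature maps and the model output, can be illustrated as follows. This is a minimal, hypothetical PyTorch sketch and not the authors' implementation: the histogram-based MI estimator, the scalar summaries of feature maps and outputs, the `layers` dictionary of candidate modules, and the hook placement are all illustrative assumptions.

```python
# Hypothetical sketch of mutual-information-guided layer scoring; the MI
# estimator and pooling choices below are illustrative, not the paper's method.
import numpy as np
import torch


def histogram_mi(x, y, bins=16):
    """Crude mutual-information estimate between two 1-D arrays via a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    joint = joint / (joint.sum() + 1e-12)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())


def layer_scores(model, layers, loader, device="cpu"):
    """Score each candidate layer by the MI between its (pooled) feature map
    and the model output, averaged over a small calibration set.

    `layers` is assumed to be a dict mapping names to candidate modules."""
    feats = {}
    hooks = [module.register_forward_hook(
                 lambda _, __, out, name=name: feats.__setitem__(name, out.detach()))
             for name, module in layers.items()]
    scores = {name: [] for name in layers}
    model.eval()
    with torch.no_grad():
        for x, _ in loader:
            out = model(x.to(device))
            y = out.max(dim=1).values.cpu().numpy()   # crude scalar summary of the output
            for name in layers:
                f = feats[name].flatten(1).mean(dim=1).cpu().numpy()  # pooled feature map
                scores[name].append(histogram_mi(f, y))
    for h in hooks:
        h.remove()
    return {name: float(np.mean(v)) for name, v in scores.items()}


# Layers whose feature maps carry little information about the output are
# candidates for removal, e.g. by replacing the corresponding block with an
# identity mapping and fine-tuning the compressed model.
```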

Original language: English (US)
Title of host publication: Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 6988-6995
Number of pages: 8
ISBN (Electronic): 9781728188089
DOIs
State: Published - 2020
Event: 25th International Conference on Pattern Recognition, ICPR 2020 - Virtual, Milan, Italy
Duration: Jan 10, 2021 - Jan 15, 2021

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 25th International Conference on Pattern Recognition, ICPR 2020
Country/Territory: Italy
City: Virtual, Milan
Period: 1/10/21 - 1/15/21

Bibliographical note

Publisher Copyright:
© 2020 IEEE

Keywords

  • Deep neural compression
  • Feature maps
  • Mutual information
