On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections

Mohammadreza Soltani, Suya Wu, Yuerong Li, Jie Ding, Vahid Tarokh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip-connections, based on measuring the statistical dependency between hidden layers and the predicted outputs. The dependence measure, defined via the energy statistics of the hidden layers, serves as a model-free measure of information between the feature maps and the output of the network. The estimated dependence measure is then used to prune a collection of redundant and uninformative layers. Extensive numerical experiments on various architectures show the efficacy of the proposed pruning approach, with performance competitive with state-of-the-art methods.
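The abstract describes scoring each layer's feature maps by their statistical dependence on the network output via energy statistics, then pruning the least informative layers. Below is a minimal sketch of this idea using the sample distance covariance (a standard energy-statistics dependence measure); the function names, the plain V-statistic estimator, and the `keep_ratio` pruning rule are illustrative assumptions, not the paper's exact estimator or procedure.

```python
import numpy as np

def distance_covariance(x, y):
    """Sample distance covariance (V-statistic estimator) between
    paired samples x of shape (n, p) and y of shape (n, q)."""
    # Pairwise Euclidean distance matrices of each sample.
    a = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    b = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1)
    # Double-center each distance matrix.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    # dCov^2 is the mean elementwise product; clamp tiny negative
    # values caused by floating-point error before the square root.
    return np.sqrt(max((A * B).mean(), 0.0))

def layers_to_prune(feature_maps, outputs, keep_ratio=0.5):
    """Hypothetical pruning step: score each layer's flattened feature
    maps against the network outputs; return indices of the
    lowest-scoring (least informative) layers."""
    scores = [distance_covariance(f, outputs) for f in feature_maps]
    order = np.argsort(scores)  # ascending: least informative first
    n_prune = int(len(feature_maps) * (1 - keep_ratio))
    return sorted(order[:n_prune].tolist())
```

In this sketch, a layer whose feature maps are nearly independent of the output receives a score close to zero and is pruned first, matching the model-free spirit of the dependence measure described above.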

Original language: English (US)
Title of host publication: Proceedings - DCC 2022
Subtitle of host publication: 2022 Data Compression Conference
Editors: Ali Bilgin, Michael W. Marcellin, Joan Serra-Sagrista, James A. Storer
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 1
ISBN (Electronic): 9781665478939
State: Published - 2022
Event: 2022 Data Compression Conference, DCC 2022 - Snowbird, United States
Duration: Mar 22 2022 - Mar 25 2022

Publication series

Name: 2022 Data Compression Conference (DCC)


Conference: 2022 Data Compression Conference, DCC 2022
Country/Territory: United States

Bibliographical note

Funding Information:
This work has been supported in part by the Army Research Office grant No. W911NF-15-1-0479.

Publisher Copyright:
© 2022 IEEE.


