Abstract
We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip connections, based on measuring the statistical dependence between hidden layers and the predicted outputs. A dependence measure defined through the energy statistics of the hidden layers serves as a model-free measure of the information shared between the feature maps and the output of the network. The estimated dependence measure is then used to prune a collection of redundant and uninformative layers. Extensive numerical experiments on various architectures demonstrate the efficacy of the proposed pruning approach, which achieves performance competitive with state-of-the-art methods.
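The abstract does not spell out the estimator, but the energy-statistics dependence it refers to is commonly instantiated as Székely and Rizzo's distance covariance/correlation. The snippet below is a minimal NumPy sketch of that idea, not the authors' implementation: it computes the sample distance correlation between a layer's flattened feature maps and the network's outputs over a batch. The function names and the pruning loop in the comments are hypothetical illustrations.

```python
import numpy as np

def _double_centered_distances(x):
    """Pairwise Euclidean distance matrix for samples x of shape (n, d),
    double-centered in the Szekely-Rizzo sense (row means, column means,
    and the grand mean removed)."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_correlation(x, y):
    """Sample distance correlation between x (n, p) and y (n, q).
    Lies in [0, 1] and, in the population limit, equals 0 iff x and y
    are statistically independent."""
    a = _double_centered_distances(x)
    b = _double_centered_distances(y)
    dcov2 = (a * b).mean()                      # biased dCov^2 estimator
    dvar = np.sqrt((a * a).mean() * (b * b).mean())
    return np.sqrt(max(dcov2, 0.0) / dvar) if dvar > 0 else 0.0

# Hypothetical usage: score each candidate layer by the dependence between
# its flattened feature maps and the network's predicted outputs, then mark
# the lowest-scoring (least informative) layers for pruning.
#   feats: dict mapping layer name -> (n, p) activations; preds: (n, q)
#   scores = {name: distance_correlation(f, preds) for name, f in feats.items()}
#   to_prune = sorted(scores, key=scores.get)[:k]   # k least informative layers
```

Under this sketch, a layer whose feature maps carry little information about the output receives a score near zero and becomes a candidate for removal.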
Original language | English (US) |
---|---|
Title of host publication | Proceedings - DCC 2022 |
Subtitle of host publication | 2022 Data Compression Conference |
Editors | Ali Bilgin, Michael W. Marcellin, Joan Serra-Sagrista, James A. Storer |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 482 |
Number of pages | 1 |
ISBN (Electronic) | 9781665478939 |
DOIs | |
State | Published - 2022 |
Event | 2022 Data Compression Conference, DCC 2022 - Snowbird, United States |
Duration | Mar 22 2022 → Mar 25 2022 |
Publication series
Name | 2022 Data Compression Conference (DCC) |
---|---|
Conference
Conference | 2022 Data Compression Conference, DCC 2022 |
---|---|
Country/Territory | United States |
City | Snowbird |
Period | 3/22/22 → 3/25/22 |
Bibliographical note
Funding Information: This work has been supported in part by the Army Research Office under grant No. W911NF-15-1-0479.
Publisher Copyright: © 2022 IEEE.