SEFACT: Selective feature activation and early classification for CNNs

Farhana Sharmin Snigdha, Ibrahim Ahmed, Susmita Dey Manasi, Meghna G. Mankalale, Jiang Hu, Sachin S. Sapatnekar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This work presents a dynamic energy reduction approach for hardware accelerators for convolutional neural networks (CNNs). Two methods are used: (1) an adaptive, data-dependent scheme that selectively activates a subset of neurons by narrowing down the set of candidate output classes, applied in the late layers of the CNN; and (2) static bitwidth reduction, which is more effective in the early layers. Even after accounting for implementation overheads, the results show 20%-25% energy savings with 5-10% accuracy loss.
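The sketch below illustrates the two ideas from the abstract in NumPy: reduced-precision arithmetic for early layers and evaluating only the output neurons belonging to surviving candidate classes in a late layer. The function names (reduce_bitwidth, selective_fc_layer), the uniform quantization scheme, and the example candidate-class list are illustrative assumptions, not the authors' implementation.

import numpy as np

def reduce_bitwidth(x, bits=8):
    # Uniform symmetric quantization to `bits` bits.
    # Stand-in for the static bitwidth reduction applied in early layers;
    # the abstract does not specify the exact quantization scheme.
    max_abs = np.max(np.abs(x))
    if max_abs == 0:
        return x
    scale = max_abs / (2 ** (bits - 1) - 1)
    return np.round(x / scale) * scale

def selective_fc_layer(features, weights, biases, candidate_classes):
    # Evaluate only the output neurons whose classes survive the
    # data-dependent pruning step; all other outputs stay inactive (zero).
    out = np.zeros(weights.shape[0], dtype=features.dtype)
    out[candidate_classes] = weights[candidate_classes] @ features + biases[candidate_classes]
    return out

# Hypothetical usage: early-layer features at reduced precision, and a
# final layer evaluated only for a pruned set of candidate classes.
features = reduce_bitwidth(np.random.randn(256).astype(np.float32), bits=6)
W = np.random.randn(10, 256).astype(np.float32)
b = np.zeros(10, dtype=np.float32)
logits = selective_fc_layer(features, W, b, candidate_classes=[2, 5, 7])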

Original language: English (US)
Title of host publication: ASP-DAC 2019 - 24th Asia and South Pacific Design Automation Conference
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 526-531
Number of pages: 6
ISBN (Electronic): 9781450360074
DOIs
State: Published - Jan 21 2019
Event: 24th Asia and South Pacific Design Automation Conference, ASPDAC 2019 - Tokyo, Japan
Duration: Jan 21 2019 - Jan 24 2019

Publication series

Name: Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC

Other

Other: 24th Asia and South Pacific Design Automation Conference, ASPDAC 2019
Country/Territory: Japan
City: Tokyo
Period: 1/21/19 - 1/24/19

Bibliographical note

Publisher Copyright:
© 2019 Copyright is held by the owner/author(s). Publication rights licensed to ACM.
