Abstract
Recently, Deep Convolutional Neural Networks (DCNNs) have achieved tremendous success in many machine learning applications. Nevertheless, their deep structure has brought a significant increase in computational complexity. Large-scale deep learning systems mainly operate in high-performance server clusters, which restricts their extension to personal or mobile devices. Previous works on GPU and/or FPGA acceleration for DCNNs show increasing speedups but ignore other constraints, such as area, power, and energy. Stochastic Computing (SC), as a unique data representation and processing technique, has the potential to enable the design of fully parallel and scalable hardware implementations of large-scale deep learning systems. This paper proposes an automatic design allocation algorithm driven by budget requirements while accounting for overall accuracy performance. This systematic method enables the automatic design of a DCNN in which all design parameters are jointly optimized. Experimental results demonstrate that the proposed algorithm achieves a joint optimization of all design parameters under a comprehensive budget for a DCNN.
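For readers unfamiliar with Stochastic Computing, the sketch below illustrates why SC can enable compact, fully parallel arithmetic as claimed in the abstract: values in [0, 1] are encoded as random bitstreams whose fraction of 1s equals the value, so multiplication reduces to a bitwise AND and scaled addition to a multiplexer. This is a generic illustration of standard unipolar SC, not the paper's budget-driven design allocation algorithm; all function names and the stream length are illustrative choices.

```python
import numpy as np

def to_stream(value, length=1024, rng=None):
    """Encode a value in [0, 1] as a unipolar stochastic bitstream:
    each bit is 1 with probability equal to the value."""
    if rng is None:
        rng = np.random.default_rng()
    return (rng.random(length) < value).astype(np.uint8)

def from_stream(stream):
    """Decode a bitstream back to a value: the fraction of 1s."""
    return stream.mean()

def sc_multiply(a, b):
    """In unipolar SC, multiplication is a bitwise AND of two
    independent bitstreams, since P(a AND b) = P(a) * P(b)."""
    return a & b

def sc_scaled_add(a, b, rng=None):
    """Scaled addition via a 2-to-1 multiplexer whose select line
    is 1 with probability 0.5, yielding (a + b) / 2."""
    if rng is None:
        rng = np.random.default_rng()
    select = rng.random(len(a)) < 0.5
    return np.where(select, a, b).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, y = 0.8, 0.5
    sx, sy = to_stream(x, rng=rng), to_stream(y, rng=rng)
    print("x * y       ≈", from_stream(sc_multiply(sx, sy)))        # ~0.40
    print("(x + y) / 2 ≈", from_stream(sc_scaled_add(sx, sy, rng)))  # ~0.65
```

The accuracy of SC arithmetic grows with bitstream length, which is one source of the accuracy-versus-budget trade-off the paper's co-optimization addresses.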
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 2018 IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2018 |
Publisher | IEEE Computer Society |
Pages | 28-33 |
Number of pages | 6 |
ISBN (Print) | 9781538670996 |
DOIs | |
State | Published - Aug 7 2018 |
Externally published | Yes |
Event | 17th IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2018 - Hong Kong, Hong Kong; Duration: Jul 9 2018 → Jul 11 2018 |
Publication series
Name | Proceedings of IEEE Computer Society Annual Symposium on VLSI, ISVLSI |
---|---|
Volume | 2018-July |
ISSN (Print) | 2159-3469 |
ISSN (Electronic) | 2159-3477 |
Other
Other | 17th IEEE Computer Society Annual Symposium on VLSI, ISVLSI 2018 |
---|---|
Country/Territory | Hong Kong |
City | Hong Kong |
Period | 7/9/18 → 7/11/18 |
Bibliographical note
Publisher Copyright: © 2018 IEEE.
Keywords
- Deep Convolutional Neural Networks
- Deep Learning
- Design Parameter Co-optimization
- Stochastic Computing