TY - JOUR
T1 - Neural network classifiers using a hardware-based approximate activation function with a hybrid stochastic multiplier
AU - Li, Bingzhe
AU - Qin, Yaobin
AU - Yuan, Bo
AU - Lilja, David J.
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/1
Y1 - 2019/1
N2 - Neural networks are becoming prevalent in many areas, such as pattern recognition and medical diagnosis. Stochastic computing is one potential solution for neural networks implemented in low-power back-end devices such as solar-powered devices and Internet of Things (IoT) devices. In this article, we investigate a new architecture for stochastic neural networks with a hardware-oriented approximate activation function. The newly proposed approximate activation function can be hidden within the proposed architecture, reducing the overall hardware cost. Additionally, to further reduce the hardware cost of the stochastic implementation, a new hybrid stochastic multiplier is proposed. It combines OR gates with a binary parallel counter, using the OR gates to reduce the number of inputs to the counter. The experimental results indicate that the newly proposed approximate architecture without hybrid stochastic multipliers achieves reductions of more than 25%, 60%, and 3x compared to previous stochastic neural networks, and more than 30x, 30x, and 52% compared to conventional binary neural networks, in terms of area, power, and energy, respectively, while maintaining error rates similar to those of conventional neural networks. Furthermore, the stochastic implementation with hybrid stochastic multipliers further reduces area by about 18% to 80%, power by 15% to 113.1%, and energy by about 15% to 131%.
AB - Neural networks are becoming prevalent in many areas, such as pattern recognition and medical diagnosis. Stochastic computing is one potential solution for neural networks implemented in low-power back-end devices such as solar-powered devices and Internet of Things (IoT) devices. In this article, we investigate a new architecture for stochastic neural networks with a hardware-oriented approximate activation function. The newly proposed approximate activation function can be hidden within the proposed architecture, reducing the overall hardware cost. Additionally, to further reduce the hardware cost of the stochastic implementation, a new hybrid stochastic multiplier is proposed. It combines OR gates with a binary parallel counter, using the OR gates to reduce the number of inputs to the counter. The experimental results indicate that the newly proposed approximate architecture without hybrid stochastic multipliers achieves reductions of more than 25%, 60%, and 3x compared to previous stochastic neural networks, and more than 30x, 30x, and 52% compared to conventional binary neural networks, in terms of area, power, and energy, respectively, while maintaining error rates similar to those of conventional neural networks. Furthermore, the stochastic implementation with hybrid stochastic multipliers further reduces area by about 18% to 80%, power by 15% to 113.1%, and energy by about 15% to 131%.
KW - Approximate activation function
KW - Neural networks
KW - Stochastic computing
UR - https://www.scopus.com/pages/publications/85061088716
U2 - 10.1145/3284933
DO - 10.1145/3284933
M3 - Article
AN - SCOPUS:85061088716
SN - 1550-4832
VL - 15
JO - ACM Journal on Emerging Technologies in Computing Systems
JF - ACM Journal on Emerging Technologies in Computing Systems
IS - 1
M1 - 12
ER -