TY - GEN
T1 - On-Device Event Filtering with Binary Neural Networks for Pedestrian Detection Using Neuromorphic Vision Sensors
AU - Ojeda, Fernando Cladera
AU - Bisulco, Anthony
AU - Kepple, Daniel
AU - Isler, Volkan
AU - Lee, Daniel D.
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10
Y1 - 2020/10
N2 - In this work, we present a hardware-efficient architecture for pedestrian detection with neuromorphic Dynamic Vision Sensors (DVSs), asynchronous camera sensors that report discrete changes in light intensity. These imaging sensors have many advantages over traditional frame-based cameras, such as increased dynamic range, lower bandwidth requirements, and higher sampling frequency with lower power consumption. Our architecture is composed of two main components: an event filtering stage that denoises the input image stream, followed by a low-complexity neural network. For the first stage, we use a novel point-process filter (PPF) with an adaptive temporal windowing scheme that enhances classification accuracy. The second stage implements a hardware-efficient Binary Neural Network (BNN) for classification. To demonstrate the reduction in complexity achieved by our architecture, we showcase a Field-Programmable Gate Array (FPGA) implementation of the entire system, which obtains an 86% reduction in latency compared to current floating-point neural network architectures.
AB - In this work, we present a hardware-efficient architecture for pedestrian detection with neuromorphic Dynamic Vision Sensors (DVSs), asynchronous camera sensors that report discrete changes in light intensity. These imaging sensors have many advantages over traditional frame-based cameras, such as increased dynamic range, lower bandwidth requirements, and higher sampling frequency with lower power consumption. Our architecture is composed of two main components: an event filtering stage that denoises the input image stream, followed by a low-complexity neural network. For the first stage, we use a novel point-process filter (PPF) with an adaptive temporal windowing scheme that enhances classification accuracy. The second stage implements a hardware-efficient Binary Neural Network (BNN) for classification. To demonstrate the reduction in complexity achieved by our architecture, we showcase a Field-Programmable Gate Array (FPGA) implementation of the entire system, which obtains an 86% reduction in latency compared to current floating-point neural network architectures.
KW - binary neural networks
KW - dynamic vision sensors
KW - embedded systems
KW - FPGA
KW - pedestrian detection
UR - http://www.scopus.com/inward/record.url?scp=85098664567&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098664567&partnerID=8YFLogxK
U2 - 10.1109/ICIP40778.2020.9191148
DO - 10.1109/ICIP40778.2020.9191148
M3 - Conference contribution
AN - SCOPUS:85098664567
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 3084
EP - 3088
BT - 2020 IEEE International Conference on Image Processing, ICIP 2020 - Proceedings
PB - IEEE Computer Society
T2 - 2020 IEEE International Conference on Image Processing, ICIP 2020
Y2 - 25 September 2020 through 28 September 2020
ER -