TY - GEN
T1 - Pedestrian detection by multi-spectral fusion
AU - Ma, Yunqian
AU - Wang, Zheng
AU - Bazakos, Mike
PY - 2006
Y1 - 2006
N2 - Security systems increasingly rely on the use of Automated Video Surveillance (AVS) technology. In particular, digital video lends itself to internet and local communications, remote monitoring, and computer processing. AVS systems can perform many tedious and repetitive tasks currently performed by trained security personnel. AVS technology has already made significant steps towards automating basic security functions such as motion detection, object tracking, and event-based video recording. However, many problems associated with these automated functions still need to be addressed, for example the high "false alarm rate" and the "loss of track" under total or partial occlusion when systems operate across a wide range of operational parameters (day, night, sunshine, cloud, fog, range, viewing angle, clutter, etc.). Current surveillance systems work well only under a narrow range of operational parameters; they therefore need to be hardened against a wide range of operational conditions. In this paper, we present a multi-spectral fusion approach that performs accurate pedestrian segmentation under varying operational parameters. Our fusion method combines the "best" detection results from the visible images with the "best" from the thermal images. Commonly, motion detection results in visible images are easily affected by noise and shadows. Objects in the thermal image are relatively stable, but parts of the objects may be missing because they thermally blend with the background. Our method makes use of the "best" object components and de-emphasizes the "not best".
KW - Level Set
KW - Pedestrian detection
KW - Sensor Fusion
KW - Surveillance
UR - http://www.scopus.com/inward/record.url?scp=33747352547&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33747352547&partnerID=8YFLogxK
U2 - 10.1117/12.668131
DO - 10.1117/12.668131
M3 - Conference contribution
AN - SCOPUS:33747352547
SN - 0819462985
SN - 9780819462985
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Multisensor, Multisource Information Fusion
T2 - Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2006
Y2 - 19 April 2006 through 20 April 2006
ER -