TY - GEN
T1 - Coverage optimized active learning for k-NN classifiers
AU - Joshi, Ajay J.
AU - Porikli, Fatih
AU - Papanikolopoulos, Nikolaos P.
PY - 2012
Y1 - 2012
N2 - Fast image recognition and classification are extremely important in various robotics applications such as exploration, rescue, and localization. k-nearest neighbor (kNN) classifiers are popular tools for classification since they involve no explicit training phase and are simple to implement. However, they often require large amounts of training data to work well in practice. In this paper, we propose a batch-mode active learning algorithm for efficient training of kNN classifiers that substantially reduces the amount of training required. As opposed to much previous work on iterative single-sample active selection, the proposed system selects samples in batches. We propose a coverage formulation that enforces selected samples to be distributed such that, given the training budget, every data point has a labeled sample within a bounded maximum distance, so that there are labeled neighbors in a small neighborhood of each point. Using submodular function optimization, the proposed algorithm provides a near-optimal selection strategy for an otherwise intractable problem. Further, we employ uncertainty sampling along with coverage to incorporate model information and improve classification. Finally, we use locality-sensitive hashing for fast retrieval of nearest neighbors during both active selection and classification, which provides 1-2 orders of magnitude speedups, thus allowing real-time classification with large datasets.
UR - http://www.scopus.com/inward/record.url?scp=84864428377&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84864428377&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2012.6225054
DO - 10.1109/ICRA.2012.6225054
M3 - Conference contribution
AN - SCOPUS:84864428377
SN - 9781467314039
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 5353
EP - 5358
BT - 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Y2 - 14 May 2012 through 18 May 2012
ER -