TY - GEN
T1 - Algorithms for parallel boosting
AU - Lozano, Fernando
AU - Rangel, Pedro
PY - 2005
Y1 - 2005
AB - We present several algorithms that combine many base learners trained on different distributions of the data, but allow some of the base learners to be trained simultaneously by separate processors. Our algorithms train batches of base classifiers using distributions that can be generated in advance of the training process. We propose several heuristic methods that produce a group of useful distributions based on the performance of the classifiers in the previous batch. We present experimental evidence that suggests that two of our algorithms are able to produce classifiers as accurate as the corresponding AdaBoost classifier with the same number of base learners, but with a greatly reduced computation time.
UR - http://www.scopus.com/inward/record.url?scp=33847285070&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33847285070&partnerID=8YFLogxK
DO - 10.1109/ICMLA.2005.8
M3 - Conference contribution
AN - SCOPUS:33847285070
SN - 0769524958
SN - 9780769524955
T3 - Proceedings - ICMLA 2005: Fourth International Conference on Machine Learning and Applications
SP - 368
EP - 373
BT - Proceedings - ICMLA 2005
T2 - ICMLA 2005: 4th International Conference on Machine Learning and Applications
Y2 - 15 December 2005 through 17 December 2005
ER -