Regularization effect of weight initialization in back propagation networks

Vladimir Cherkassky, Robert Shepherd

Research output: Contribution to conference › Paper › peer-review

6 Scopus citations

Abstract

Complexity control of a learning method is critical for obtaining good generalization with finite training data. We discuss complexity control in multilayer perceptron (MLP) networks trained via backpropagation. For such networks, the number of hidden units and/or network weights is usually used as the complexity parameter. However, the backpropagation training procedure itself introduces additional mechanisms for complexity control. These mechanisms are implicit in the implementation of the optimization procedure and cannot be easily quantified (in contrast to the number of weights or the number of hidden units). We suggest using the framework of Statistical Learning Theory to explain the effect of weight initialization, and within this framework we demonstrate how weight initialization provides complexity control in MLP networks.
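The following is a minimal sketch, not taken from the paper, of the kind of effect the abstract describes: for an MLP trained by backpropagation for a fixed number of epochs, the scale of the random initial weights acts as an implicit complexity control. All names, the toy data set, and the chosen hyperparameters (hidden size, learning rate, epochs) are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the authors' experiment): compare two
# initial-weight scales for a 1-hidden-layer tanh MLP trained by plain
# backpropagation/gradient descent for a fixed number of epochs.
import numpy as np

rng = np.random.default_rng(0)

# Noisy 1-D regression data (hypothetical toy problem)
x = np.linspace(-1, 1, 30).reshape(-1, 1)
y = np.sin(3 * x) + 0.2 * rng.standard_normal(x.shape)

def train_mlp(init_scale, hidden=20, epochs=2000, lr=0.05):
    """Train the MLP and return training MSE and final weight norm."""
    W1 = init_scale * rng.standard_normal((1, hidden))
    b1 = np.zeros(hidden)
    W2 = init_scale * rng.standard_normal((hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)      # hidden-layer activations
        pred = h @ W2 + b2            # network output
        err = pred - y
        # Backpropagation of the squared-error loss
        dW2 = h.T @ err / len(x)
        db2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h ** 2)
        dW1 = x.T @ dh / len(x)
        db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    mse = float(np.mean(err ** 2))
    norm = float(np.sqrt((W1 ** 2).sum() + (W2 ** 2).sum()))
    return mse, norm

for scale in (0.01, 1.0):
    mse, norm = train_mlp(scale)
    print(f"init scale {scale:>4}: training MSE {mse:.4f}, weight norm {norm:.2f}")
```

Under these assumptions, the smaller initialization typically ends training with smaller weights and a higher training error, i.e. a smoother, lower-complexity fit, even though the network architecture (number of hidden units and weights) is identical in both runs.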

Original language: English (US)
Pages: 2258-2261
Number of pages: 4
State: Published - Jan 1 1998
Event: Proceedings of the 1998 IEEE International Joint Conference on Neural Networks. Part 1 (of 3) - Anchorage, AK, USA
Duration: May 4 1998 - May 9 1998

