Abstract
The structure of most neural networks is based on node models with sigmoid-type activation functions. This paper presents a neural network structure whose nodes use a Gaussian-type activation function. Three learning algorithms are introduced and compared: the Gaussian perceptron learning (GPL) algorithm, which is based on the conventional perceptron convergence procedure; the least-squares error (LSE) algorithm, which follows the classical steepest descent approach; and the least-log-squares error (LLSE) algorithm, which is a gradient method on a log objective function. In particular, the convergence of the GPL algorithm is proved. The performance of each algorithm is demonstrated through benchmark problems.
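As a rough illustration of the ideas in the abstract, the sketch below implements a single node with a Gaussian-type activation applied to its weighted sum, plus one steepest-descent step on a squared-error objective in the spirit of the LSE algorithm. The specific activation form `g(u) = exp(-u^2)`, the learning rate, and the update rule are assumptions for illustration; the paper's actual node model and algorithms may differ.

```python
import math

def gaussian_node(w, x):
    # Weighted sum followed by an assumed Gaussian-type activation
    # g(u) = exp(-u^2); the paper's exact node model may differ.
    u = sum(wi * xi for wi, xi in zip(w, x))
    return math.exp(-u * u)

def lse_step(w, x, target, lr=0.1):
    # One steepest-descent step on the squared error E = (t - y)^2,
    # analogous in spirit to the LSE algorithm described in the abstract.
    u = sum(wi * xi for wi, xi in zip(w, x))
    y = math.exp(-u * u)
    # Chain rule: dE/dw_i = -2 (t - y) * dy/du * x_i, with dy/du = -2 u y.
    g = -2.0 * (target - y) * (-2.0 * u * y)
    return [wi - lr * g * xi for wi, xi in zip(w, x)]
```

For example, starting from `w = [0.5]`, `x = [1.0]`, `target = 1.0`, repeated `lse_step` calls drive the weighted sum toward zero, where the assumed Gaussian activation peaks and the squared error shrinks.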
Original language | English (US) |
---|---|
Title of host publication | 1992 IEEE International Conference on Systems, Man, and Cybernetics |
Subtitle of host publication | Emergent Innovations in Information Transfer Processing and Decision Making, SMC 1992 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 105-110 |
Number of pages | 6 |
ISBN (Electronic) | 0780307208, 9780780307209 |
DOIs | |
State | Published - 1992 |
Event | IEEE International Conference on Systems, Man, and Cybernetics, SMC 1992 - Chicago, United States. Duration: Oct 18, 1992 → Oct 21, 1992 |
Publication series
Name | Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics |
---|---|
Volume | 1992-January |
ISSN (Print) | 1062-922X |
Other
Other | IEEE International Conference on Systems, Man, and Cybernetics, SMC 1992 |
---|---|
Country/Territory | United States |
City | Chicago |
Period | 10/18/92 → 10/21/92 |
Bibliographical note
Publisher Copyright: © 1992 IEEE.