Abstract
In this paper, we study the incorporation of Bayesian regularization into constructive neural networks. The degree of regularization is controlled automatically within the Bayesian inference framework and hence requires no manual setting. Simulations show that regularization, with input training under a full Bayesian approach, produces networks with better generalization performance and lower susceptibility to over-fitting as the network size increases. Regularization with input training under MacKay's evidence framework, however, does not produce significant improvement on the problems tested.
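To make the "automatically controlled" regularization concrete, the sketch below illustrates MacKay's evidence-framework re-estimation of the weight-decay strength α and the inverse noise variance β for a linear-in-parameters model. This is an illustrative assumption, not the paper's constructive-network procedure: α = γ / 2E_W and β = (N − γ) / 2E_D, where γ is the effective number of well-determined parameters, so neither hyperparameter needs to be set by hand.

```python
import numpy as np

# Hedged sketch of evidence-framework hyperparameter re-estimation for a
# linear model y = Xw + noise (the paper applies this idea to constructive
# networks; the linear case is used here only because it is closed-form).
rng = np.random.default_rng(0)
N, D = 50, 5
X = rng.normal(size=(N, D))
w_true = rng.normal(size=D)
y = X @ w_true + 0.1 * rng.normal(size=N)

alpha, beta = 1.0, 1.0          # initial guesses; both adapt automatically
for _ in range(20):
    # Most probable weights under quadratic data term + weight decay
    A = beta * X.T @ X + alpha * np.eye(D)      # Hessian of the objective
    w_mp = beta * np.linalg.solve(A, X.T @ y)
    # gamma = effective number of well-determined parameters
    lam = beta * np.linalg.eigvalsh(X.T @ X)    # data-term eigenvalues
    gamma = np.sum(lam / (lam + alpha))
    # MacKay's re-estimation formulae: alpha = gamma / 2E_W, beta = (N - gamma) / 2E_D
    alpha = gamma / (w_mp @ w_mp)
    beta = (N - gamma) / np.sum((y - X @ w_mp) ** 2)
```

At convergence γ lies between 0 and D, and α, β settle at values implied by the data alone, which is the property the abstract contrasts with manually tuned weight decay.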
References
W.L. Buntine and A.S. Weigend. Bayesian back-propagation. Complex Systems, 5:603–643, 1991.
G.P. Drago and S. Ridella. Convergence properties of cascade correlation in function approximation. Neural Computing & Applications, 2:142–147, 1994.
S.E. Fahlman and C. Lebiere. The cascade-correlation learning architecture. In D.S. Touretzky, editor, Advances in Neural Information Processing Systems 2, pages 524–532. Morgan Kaufmann, Los Altos CA, 1990.
E. Fiesler. Comparative bibliography of ontogenic neural networks. In Proceedings of the International Conference on Artificial Neural Networks, volume 1, pages 793–796, Sorrento, Italy, May 1994.
L.K. Hansen and M.W. Pedersen. Controlled growth of cascade correlation nets. In Proceedings of the International Conference on Artificial Neural Networks, volume 1, pages 797–800, Sorrento, Italy, May 1994.
J.N. Hwang, S.R. Lay, M. Maechler, D. Martin, and J. Schimert. Regression modeling in back-propagation and projection pursuit learning. IEEE Transactions on Neural Networks, 5(3):342–353, May 1994.
V. Kurková and B. Beliczynski. Incremental approximation by one-hidden-layer neural networks. In Proceedings of the International Conference on Artificial Neural Networks, volume 1, pages 505–510, Paris, France, October 1995.
T.Y. Kwok and D.Y. Yeung. Objective functions for training new hidden units in constructive neural networks, 1995. Submitted.
D.J.C. MacKay. Bayesian interpolation. Neural Computation, 4(3):415–447, May 1992.
D.J.C. MacKay. A practical Bayesian framework for backpropagation networks. Neural Computation, 4(3):448–472, May 1992.
Copyright information
© 1996 Springer-Verlag Berlin Heidelberg
Cite this paper
Kwok, T.Y., Yeung, D.Y. (1996). Bayesian regularization in constructive neural networks. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_95
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-61510-1
Online ISBN: 978-3-540-68684-2