Abstract
The typical automatic way to search for an optimal neural network combines structure evolution by evolutionary computation with weight adaptation by backpropagation. Because structure and weight optimization are carried out by two different algorithms, each operating in its own search space, every change in network topology during structure evolution forces the entire set of weights to be relearned by backpropagation. To avoid this inefficiency, we propose that the evolution of network structure and weights be purely stochastic and tightly integrated, so that good weights and structures are not relearned but propagated from generation to generation. Since this model does not rely on gradient information, it allows greater flexibility both in implementing the evolution and in formulating the fitness function. This study demonstrates how invasive connectionist evolution can easily be implemented using particle swarm optimization (PSO), evolutionary programming (EP), and differential evolution (DE), with good performance on cancer and glass classification tasks.
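The paper itself gives the full formulation; purely to illustrate the idea of evolving structure and weights together in one gradient-free stochastic loop, the following is a minimal Python sketch. The single-hidden-layer topology, the DE/rand/1-style mutation of weights, the bit-flip mutation of a binary connectivity mask, and all constants are assumptions made for illustration, not the authors' implementation:

import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 4, 6, 3      # assumed single-hidden-layer topology
POP, GENS, F = 20, 200, 0.5       # DE-style population size, generations, scale

def init_individual():
    # Each individual carries weights AND a binary connectivity mask
    # (the structure), so both evolve together and survivors keep both.
    return {
        "w1": rng.normal(0, 1, (N_IN, N_HID)),
        "w2": rng.normal(0, 1, (N_HID, N_OUT)),
        "m1": rng.random((N_IN, N_HID)) < 0.8,   # True = connection exists
        "m2": rng.random((N_HID, N_OUT)) < 0.8,
    }

def forward(ind, X):
    # Masked weights: pruned connections contribute nothing.
    h = np.tanh(X @ (ind["w1"] * ind["m1"]))
    return h @ (ind["w2"] * ind["m2"])

def fitness(ind, X, y):
    # Gradient-free fitness: raw classification error on the training set.
    pred = forward(ind, X).argmax(axis=1)
    return np.mean(pred != y)

def de_step(pop, X, y):
    # One DE/rand/1 generation: differential mutation for weights,
    # bit-flip mutation for the structure mask, greedy selection.
    new_pop = []
    for i, ind in enumerate(pop):
        a, b, c = rng.choice([j for j in range(POP) if j != i], 3, replace=False)
        trial = {
            "w1": pop[a]["w1"] + F * (pop[b]["w1"] - pop[c]["w1"]),
            "w2": pop[a]["w2"] + F * (pop[b]["w2"] - pop[c]["w2"]),
            "m1": pop[a]["m1"] ^ (rng.random(pop[a]["m1"].shape) < 0.02),
            "m2": pop[a]["m2"] ^ (rng.random(pop[a]["m2"].shape) < 0.02),
        }
        new_pop.append(trial if fitness(trial, X, y) <= fitness(ind, X, y) else ind)
    return new_pop

# Toy random data standing in for a UCI task such as glass classification.
X = rng.normal(size=(60, N_IN))
y = rng.integers(0, N_OUT, size=60)

pop = [init_individual() for _ in range(POP)]
for _ in range(GENS):
    pop = de_step(pop, X, y)
best = min(pop, key=lambda ind: fitness(ind, X, y))
print("training error of best individual:", fitness(best, X, y))

Because each survivor carries both its connectivity mask and its weights, a topology change never forces the weights to be relearned from scratch, which is precisely the integration the abstract argues for.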
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Palmes, P.P., Usui, S. (2005). Invasive Connectionist Evolution. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3612. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539902_141
DOI: https://doi.org/10.1007/11539902_141
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28320-1
Online ISBN: 978-3-540-31863-7
eBook Packages: Computer Science (R0)