Abstract
The search for Artificial Neural Networks (ANNs) that effectively solve a particular task is a long, time-consuming trial-and-error process in which decisions must be made about the network topology, the learning algorithm, and numerical parameters. To ease this process, one can resort to methods that automatically optimise either the topology alone, or the topology together with the learning parameters of ANNs. The main drawback of such approaches is that they require large amounts of computational resources and take a long time to generate a solution that is acceptable for the problem at hand. This paper extends Deep Evolutionary Network Structured Representation (DENSER), a general-purpose NeuroEvolution (NE) approach that combines the principles of Genetic Algorithms with Grammatical Evolution: to adapt DENSER to optimise networks of different structures, or to solve different problems, the user only needs to change the grammar, which is specified in a human-readable text format. The new method, Fast DENSER (F-DENSER), speeds up DENSER and adds another representation level that allows the connectivity of the layers to be evolved. The results demonstrate that F-DENSER finds its best solutions roughly 20 times faster than DENSER. Concerning the effectiveness of the approach, the results are highly competitive with the state of the art: the best-performing network attains an average test accuracy of 91.46% on CIFAR-10. This is particularly remarkable because the reduction in running time does not compromise the performance of the generated solutions.
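The abstract notes that adapting DENSER to new network structures or problems only requires editing a human-readable grammar. As a rough illustration of that idea only, and not the actual DENSER grammar format or codebase, the minimal sketch below derives layer descriptions from a toy grammar; the rules, attribute names, and value ranges are invented for this example.

```python
# Hypothetical sketch of the grammar-based idea behind (F-)DENSER: the space of
# candidate networks is defined by a human-readable grammar, so changing the
# grammar (not the code) changes which topologies can be evolved.
# The grammar, attribute names, and ranges below are illustrative assumptions.
import random

GRAMMAR = {
    "<layer>": [["<conv>"], ["<pooling>"], ["<fully-connected>"]],
    "<conv>": [["layer:conv", "num-filters:[32,64,128]", "kernel-size:[2..5]"]],
    "<pooling>": [["layer:pool", "kernel-size:[2..5]"]],
    "<fully-connected>": [["layer:fc", "num-units:[64..512]"]],
}

def derive(symbol, rng):
    """Expand a non-terminal into a flat list of layer attributes."""
    production = rng.choice(GRAMMAR[symbol])
    out = []
    for token in production:
        out.extend(derive(token, rng) if token in GRAMMAR else [token])
    return out

rng = random.Random(0)
# Sample a small network: each position of the genotype expands "<layer>".
network = [derive("<layer>", rng) for _ in range(4)]
for layer in network:
    print(layer)
```

Editing only the GRAMMAR dictionary (for instance, adding a recurrent-layer rule) would change the search space without touching the derivation code, which is the convenience the grammar-based representation is meant to provide.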
Notes
- 1.
F-DENSER and DENSER are run on the same machines, which have the following specifications: 1080 Ti GPUs, 64 GB of RAM, and an Intel Core i7-6850K CPU.
References
Assunção, F., Lourenço, N., Machado, P., Ribeiro, B.: Evolving the topology of large scale deep neural networks. In: Castelli, M., Sekanina, L., Zhang, M., Cagnoni, S., García-Sánchez, P. (eds.) Genetic Programming, pp. 19–34. Springer International Publishing, Cham (2018). https://doi.org/10.1007/978-3-319-77553-1_2
Assunção, F., Lourenço, N., Machado, P., Ribeiro, B.: DENSER: deep evolutionary network structured representation. Genet. Program. Evolvable Mach. (2018). https://doi.org/10.1007/s10710-018-9339-y
Guyon, I., et al.: A brief review of the ChaLearn AutoML challenge: any-time any-dataset learning without human intervention. In: AutoML@ICML. JMLR Workshop and Conference Proceedings, vol. 64, pp. 21–30 (2016)
Duan, K.-B., Keerthi, S.S.: Which is the best multiclass SVM method? An empirical study. In: Oza, N.C., Polikar, R., Kittler, J., Roli, F. (eds.) MCS 2005. LNCS, vol. 3541, pp. 278–285. Springer, Heidelberg (2005). https://doi.org/10.1007/11494683_28
Bergstra, J., Bengio, Y.: Random search for hyper-parameter optimization. J. Mach. Learn. Res. 13, 281–305 (2012)
Bergstra, J., Yamins, D., Cox, D.D.: Making a science of model search: hyperparameter optimization in hundreds of dimensions for vision architectures. In: JMLR Workshop and Conference Proceedings, ICML (1), vol. 28, pp. 115–123 (2013)
Miikkulainen, R., et al.: Evolving deep neural networks. CoRR abs/1703.00548 (2017)
Floreano, D., Dürr, P., Mattiussi, C.: Neuroevolution: from architectures to learning. Evol. Intell. 1(1), 47–62 (2008)
Koutník, J., Cuccu, G., Schmidhuber, J., Gomez, F.J.: Evolving large-scale neural networks for vision-based TORCS. In: FDG, pp. 206–212. Society for the Advancement of the Science of Digital Games (2013)
Gomez, F.J., Schmidhuber, J., Miikkulainen, R.: Accelerated neural evolution through cooperatively coevolved synapses. J. Mach. Learn. Res. 9, 937–965 (2008)
Turner, A.J., Miller, J.F.: The importance of topology evolution in neuroevolution: a case study using cartesian genetic programming of artificial neural networks. In: Bramer, M., Petridis, M. (eds.) Research and Development in Intelligent Systems XXX, pp. 213–226. Springer, Cham (2013). https://doi.org/10.1007/978-3-319-02621-3_15
Stanley, K.O., Miikkulainen, R.: Evolving neural networks through augmenting topologies. Evol. Comput. 10(2), 99–127 (2002)
Suganuma, M., Shirakawa, S., Nagao, T.: A genetic programming approach to designing convolutional neural network architectures. In: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 497–504. ACM (2017)
Real, E., Aggarwal, A., Huang, Y., Le, Q.V.: Regularized evolution for image classifier architecture search. arXiv preprint arXiv:1802.01548 (2018)
Lourenço, N., Assunção, F., Pereira, F.B., Costa, E., Machado, P.: Structured grammatical evolution: a dynamic approach. In: Ryan, C., O’Neill, M., Collins, J. (eds.) Handbook of Grammatical Evolution, pp. 137–161. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-78717-6_6
Krizhevsky, A., Hinton, G.: Learning multiple layers of features from tiny images (2009)
Acknowledgments
This work is partially funded by: Fundação para a Ciência e Tecnologia (FCT), Portugal, under the PhD grant SFRH/BD/114865/2016, and the project grant DSAIPA/DS/0022/2018 (GADgET), and is based upon work from COST Action CA15140: ImAppNIO, supported by COST (European Cooperation in Science and Technology): www.cost.eu.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Assunção, F., Lourenço, N., Machado, P., Ribeiro, B. (2019). Fast DENSER: Efficient Deep NeuroEvolution. In: Sekanina, L., Hu, T., Lourenço, N., Richter, H., García-Sánchez, P. (eds) Genetic Programming. EuroGP 2019. Lecture Notes in Computer Science, vol. 11451. Springer, Cham. https://doi.org/10.1007/978-3-030-16670-0_13
DOI: https://doi.org/10.1007/978-3-030-16670-0_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-16669-4
Online ISBN: 978-3-030-16670-0
eBook Packages: Computer Science, Computer Science (R0)