
Fast DENSER: Efficient Deep NeuroEvolution

  • Conference paper
  • Genetic Programming (EuroGP 2019)

Abstract

The search for Artificial Neural Networks (ANNs) that are effective in solving a particular task is a long and time-consuming trial-and-error process in which we have to make decisions about the topology of the network, the learning algorithm, and the numerical parameters. To ease this process, we can resort to methods that automatically optimise either the topology alone or the topology and learning parameters simultaneously. The main issue with such approaches is that they require large amounts of computational resources and take a long time to generate a solution that is acceptable for the problem at hand. The current paper extends Deep Evolutionary Network Structured Representation (DENSER), a general-purpose NeuroEvolution (NE) approach that combines the principles of Genetic Algorithms with Grammatical Evolution. To adapt DENSER to optimise networks of different structures, or to solve other problems, the user only needs to change the grammar, which is specified in a human-readable text format. The new method, Fast DENSER (F-DENSER), speeds up DENSER and adds another representation level that allows the connectivity of the layers to be evolved. The results demonstrate that F-DENSER achieves a 20-fold speedup over the time DENSER takes to find the best solutions. Concerning the effectiveness of the approach, the results are highly competitive with the state of the art: the best performing network reports an average test accuracy of 91.46% on CIFAR-10. This is particularly remarkable because the reduction in running time does not compromise the performance of the generated solutions.
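For readers unfamiliar with grammar-based NeuroEvolution, the sketch below illustrates the flavour of the approach: layer specifications are derived from a grammar by stochastically choosing productions, which is what Grammatical Evolution does via its genotype. The grammar and all production names here are toy, illustrative assumptions, not the grammars shipped with DENSER or F-DENSER.

    import random

    # Toy grammar in the spirit of DENSER's human-readable format.
    # All productions below are illustrative; the real grammars are
    # distributed with the authors' code.
    GRAMMAR = {
        "<layer>": [["<convolution>"], ["<pooling>"], ["<fully-connected>"]],
        "<convolution>": [["layer:conv", "num-filters:<filters>", "act:relu"]],
        "<pooling>": [["layer:pooling", "kernel-size:2"]],
        "<fully-connected>": [["layer:fc", "num-units:<units>", "act:relu"]],
        "<filters>": [["32"], ["64"], ["128"]],
        "<units>": [["128"], ["256"], ["512"]],
    }

    def derive(symbol, rng):
        """Expand a symbol into terminal tokens by recursively picking productions."""
        if symbol in GRAMMAR:
            tokens = []
            for s in rng.choice(GRAMMAR[symbol]):  # GE-style stochastic choice
                tokens.extend(derive(s, rng))
            return tokens
        if "<" in symbol:  # terminal embedding a non-terminal, e.g. "num-filters:<filters>"
            prefix, _, rest = symbol.partition("<")
            return [prefix + " ".join(derive("<" + rest, rng))]
        return [symbol]  # plain terminal

    rng = random.Random(0)
    print(" ".join(derive("<layer>", rng)))
    # e.g.: layer:conv num-filters:64 act:relu

In F-DENSER, a grammatical layer description of this kind is complemented by the extra representation level mentioned in the abstract, which encodes which previous layers feed into each layer, allowing connectivity (e.g., skip connections) to be evolved alongside the layer parameters.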


Notes

  1. F-DENSER and DENSER are run on the same machines, which have the following specifications: 1080 Ti GPUs, 64 GB of RAM, and an Intel Core i7-6850K CPU.


Acknowledgments

This work is partially funded by: Fundação para a Ciência e Tecnologia (FCT), Portugal, under the PhD grant SFRH/BD/114865/2016, and the project grant DSAIPA/DS/0022/2018 (GADgET), and is based upon work from COST Action CA15140: ImAppNIO, supported by COST (European Cooperation in Science and Technology): www.cost.eu.

Author information

Correspondence to Filipe Assunção.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Assunção, F., Lourenço, N., Machado, P., Ribeiro, B. (2019). Fast DENSER: Efficient Deep NeuroEvolution. In: Sekanina, L., Hu, T., Lourenço, N., Richter, H., García-Sánchez, P. (eds) Genetic Programming. EuroGP 2019. Lecture Notes in Computer Science, vol 11451. Springer, Cham. https://doi.org/10.1007/978-3-030-16670-0_13

  • DOI: https://doi.org/10.1007/978-3-030-16670-0_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-16669-4

  • Online ISBN: 978-3-030-16670-0

  • eBook Packages: Computer Science (R0)
