Advances in evolutionary feature selection neural networks with co-evolution learning

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Training neural networks is a complex task, since many algorithms must be combined to find good solutions to a classification problem. In this work, we apply evolutionary computation to minimize a neural network's configuration. For this purpose, an estimation-of-distribution framework is used to select relevant features, yielding high classification accuracy at a lower computational cost. First, a score function based on a pruning strategy is applied to assess network relevance within the genetic population. Since network complexity (connections, weights, and biases) is of primary importance, the cooling state of the system is strongly tied to entropy, which serves as the minimization function driving the search toward the desired solution. The framework also proposes coevolution learning (with discrete and continuous representations) to improve the behavior of evolutionary neural learning. Simulation results show that the proposed approach is promising and could be extended to other classes of neural networks.
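The estimation-of-distribution idea in the abstract can be illustrated with a minimal sketch. Everything below is a hypothetical placeholder, not the paper's actual method: a univariate marginal distribution algorithm (UMDA) evolves binary feature masks, and a toy fitness function stands in for classification accuracy minus a complexity penalty, loosely mirroring the accuracy-versus-complexity trade-off described above.

```python
import random

random.seed(0)

N_FEATURES = 10
RELEVANT = {0, 3, 7}   # hypothetical ground-truth relevant features
POP_SIZE = 40
N_SELECT = 20          # elite individuals used to re-estimate the distribution
GENERATIONS = 30

def fitness(mask):
    """Toy stand-in for classification accuracy minus a complexity penalty."""
    hits = sum(1 for i in RELEVANT if mask[i])
    return hits - 0.1 * sum(mask)   # penalize every selected input

def sample(probs):
    """Draw one binary feature mask from the current marginal distribution."""
    return [1 if random.random() < p else 0 for p in probs]

probs = [0.5] * N_FEATURES          # start with an uninformative distribution
best_mask, best_score = None, float("-inf")

for _ in range(GENERATIONS):
    population = [sample(probs) for _ in range(POP_SIZE)]
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) > best_score:
        best_mask, best_score = population[0], fitness(population[0])
    elite = population[:N_SELECT]
    # UMDA step: re-estimate each feature's marginal probability from the elite.
    probs = [sum(ind[i] for ind in elite) / N_SELECT for i in range(N_FEATURES)]

selected = [i for i, bit in enumerate(best_mask) if bit]
print("selected features:", selected)
```

In this sketch the probability vector converges toward the relevant features while the complexity penalty discourages superfluous inputs; the paper's framework additionally couples such a discrete representation with a continuous one through coevolution.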

Fig. 1
Fig. 2



Author information

Correspondence to Yamina Mohamed Ben Ali.


About this article

Cite this article

Mohamed Ben Ali, Y. Advances in evolutionary feature selection neural networks with co-evolution learning. Neural Comput & Applic 17, 217–226 (2008). https://doi.org/10.1007/s00521-007-0114-x

