
Evolutionary Based Weight Decaying Method for Neural Network Training

Published in: Neural Processing Letters

Abstract

A new weight decaying technique for neural network training is introduced. The proposed technique uses a genetic algorithm in conjunction with a local optimization method to restrict the weights of the neural network to a range that preserves the desired generalization capabilities. Because the genetic algorithm performs a global search, the method avoids most of the problems associated with purely local optimization procedures; moreover, the technique can be combined with any global optimization procedure from the relevant literature. It has been evaluated on a series of well-known classification and regression benchmark datasets. The evaluation results are very promising, indicating improvements in classification error of 25–80% for the genetic algorithm.
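To make the scheme concrete, the following is a minimal Python sketch (not the authors' implementation) of the general recipe the abstract describes: a genetic algorithm searches globally over the network weights using a fitness that adds a weight-decay penalty to the training error, and the best individual is then refined with a local optimizer. The toy network, the squared-weight penalty, the GA operators, and all hyperparameters below are illustrative assumptions.

    # Minimal sketch of a GA + local-search weight-decay scheme.
    # NOT the paper's implementation: network size, penalty form,
    # GA operators and hyperparameters are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(x)
    X = np.linspace(-3, 3, 40).reshape(-1, 1)
    y = np.sin(X).ravel()

    H = 5            # hidden units (assumption)
    DIM = H * 3      # input->hidden weights, hidden biases, hidden->output weights

    def forward(w, X):
        W1, b1, W2 = w[:H], w[H:2 * H], w[2 * H:]
        return np.tanh(X @ W1[None, :] + b1) @ W2

    def fitness(w, lam=1e-3):
        # Training error plus a weight-decay penalty that discourages
        # the weights from drifting out of a small range around zero.
        err = np.mean((forward(w, X) - y) ** 2)
        return err + lam * np.sum(w ** 2)

    # --- simple generational GA (global search) ---
    POP, GENS = 30, 100
    pop = rng.uniform(-1, 1, size=(POP, DIM))
    for _ in range(GENS):
        scores = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(scores)[:POP // 2]]         # truncation selection
        pa = elite[rng.integers(len(elite), size=POP)]
        pb = elite[rng.integers(len(elite), size=POP)]
        mask = rng.random((POP, DIM)) < 0.5
        children = np.where(mask, pa, pb)                  # uniform crossover
        children += rng.normal(0, 0.05, size=children.shape)  # Gaussian mutation
        pop = children
        pop[0] = elite[0]                                  # elitism

    # --- local refinement of the best individual ---
    scores = np.array([fitness(ind) for ind in pop])
    best = pop[np.argmin(scores)]
    res = minimize(fitness, best, method="BFGS")
    print("penalized error after refinement:", res.fun)

The design point the sketch illustrates is the division of labour claimed in the abstract: the penalized fitness keeps every candidate's weights small (the weight-decay effect), the population-based search explores globally, and the local optimizer only polishes the solution the global stage has already placed in a good region.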



Author information


Corresponding author

Correspondence to Ioannis G. Tsoulos.


About this article


Cite this article

Tsoulos, I.G., Tzallas, A. & Tsalikakis, D. Evolutionary Based Weight Decaying Method for Neural Network Training. Neural Process Lett 47, 463–473 (2018). https://doi.org/10.1007/s11063-017-9660-0
