Abstract
This article presents several refinements of the structure and training algorithm of the standard autoassociative neural network. The objectives, each achieved independently, are to improve the training algorithm, to determine a near-optimal number of nonlinear principal components (NLPCs), to extract meaningful NLPCs, and to increase the network's nonlinear, dynamic, and selective processing capability. In addition, three different network topologies are presented that make it possible to perform local nonlinear principal component analysis. The performance of all methods is evaluated on a stock price database, which demonstrates their efficiency in different situations. Finally, as illustrated in the last section, the proposed structures can easily be combined, which makes them efficient tools in a wide range of signal processing applications.
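To make the underlying architecture concrete, the sketch below shows a standard autoassociative (bottleneck) network of the kind the article refines: a symmetric multilayer perceptron trained to reproduce its own input, whose bottleneck activations are read out as nonlinear principal components. This is a minimal illustration under stated assumptions, not the authors' implementation; the layer sizes, tanh activations, learning rate, and synthetic curved data are all illustrative choices.

```python
# Minimal sketch of a standard autoassociative (bottleneck) network for NLPC
# extraction. All hyperparameters and the synthetic data are assumptions made
# for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def tanh(x):
    return np.tanh(x)

def dtanh(y):
    # derivative of tanh expressed in terms of the activation y = tanh(x)
    return 1.0 - y ** 2

# Topology: input(3) -> mapping(8) -> bottleneck(1) -> demapping(8) -> output(3)
sizes = [3, 8, 1, 8, 3]
W = [rng.normal(0, 0.3, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    acts = [x]
    for i, (Wi, bi) in enumerate(zip(W, b)):
        z = acts[-1] @ Wi + bi
        # tanh in hidden layers, linear output layer
        acts.append(z if i == len(W) - 1 else tanh(z))
    return acts

# Synthetic data lying near a one-dimensional curve embedded in 3-D, standardized.
t = rng.uniform(-1, 1, (500, 1))
X = np.hstack([t, t ** 2, np.sin(np.pi * t)]) + 0.01 * rng.normal(size=(500, 3))
X = (X - X.mean(0)) / X.std(0)

lr = 0.01
for epoch in range(2000):
    acts = forward(X)
    delta = acts[-1] - X                      # reconstruction error (linear output)
    for i in reversed(range(len(W))):
        gW = acts[i].T @ delta / len(X)       # gradient of layer i weights
        gb = delta.mean(0)
        if i > 0:
            delta = (delta @ W[i].T) * dtanh(acts[i])  # backpropagate to layer below
        W[i] -= lr * gW
        b[i] -= lr * gb

nlpc = forward(X)[2]                          # bottleneck activations = the single NLPC
print("reconstruction MSE:", np.mean((forward(X)[-1] - X) ** 2))
```

The refinements discussed in the article operate on this baseline, for example by modifying the training procedure or by replicating and gating parts of the topology to obtain local NLPC analysis.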
Cite this article
Makki, B., Hosseini, M.N. Some refinements of the standard autoassociative neural network. Neural Comput & Applic 22, 1461–1475 (2013). https://doi.org/10.1007/s00521-012-0825-5