Abstract
An associative neural network (ASNN) is a combination of an ensemble of feed-forward neural networks and the K-nearest neighbor technique. The introduced network uses the correlation between ensemble responses as a measure of distance among the analyzed cases for the nearest neighbor technique, and provides an improved prediction by correcting the bias of the neural network ensemble, both for function approximation and for classification. In effect, the proposed method corrects the bias of a global model for a given data case by analyzing the biases of its nearest neighbors, determined in the space of calculated models. An associative neural network has a memory that can coincide with the training set. If new data become available, the network can provide a reasonable approximation of such data without the need to retrain the neural network ensemble. Applications of ASNN to the prediction of lipophilicity of chemical compounds and to the classification of the UCI letter and satellite data sets are presented. The developed algorithm is available on-line at http://www.virtuallaboratory.org/lab/asnn.
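The correction described above can be illustrated with a minimal sketch: an ensemble's mean prediction for a query case is adjusted by the average residual of the query's nearest neighbors, where nearness is measured by the correlation between ensemble response vectors rather than in the input space. This is only an illustration of the idea, not the published implementation; the function name `asnn_predict`, the use of Pearson correlation as the similarity measure, the unweighted mean of neighbor residuals, and the parameter `k` are all assumptions made for the sketch.

```python
import numpy as np

def asnn_predict(ensemble_train, y_train, ensemble_query, k=3):
    """Sketch of an ASNN-style bias correction (illustrative, not the
    published implementation).

    ensemble_train : (n_train, n_models) predictions of each ensemble
                     member for each training case
    y_train        : (n_train,) true target values of the training cases
    ensemble_query : (n_models,) ensemble member predictions for one query
    """
    # Global model: the plain ensemble average for the query case
    global_pred = ensemble_query.mean()

    # Distance in the space of calculated models: 1 - Pearson correlation
    # between the query's ensemble response vector and each training case's
    corr = np.array([np.corrcoef(ensemble_query, row)[0, 1]
                     for row in ensemble_train])
    dist = 1.0 - corr

    # k nearest neighbours according to that model-space distance
    nn = np.argsort(dist)[:k]

    # Bias of the global model on those neighbours (true value minus
    # their own ensemble averages)
    bias = np.mean(y_train[nn] - ensemble_train[nn].mean(axis=1))

    # Corrected prediction for the query case
    return global_pred + bias
```

With a toy ensemble that systematically underestimates by 0.4 in one region, the two neighbors from that region (perfectly correlated response vectors) supply the residual, and the query's ensemble mean is shifted accordingly.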
Tetko, I.V. Associative Neural Network. Neural Processing Letters 16, 187–199 (2002). https://doi.org/10.1023/A:1019903710291