Abstract
For reasoning with uncertain knowledge, the use of probability theory has been broadly investigated. This paper proposes a novel probabilistic network named the Bayesian-Neural Network (BNN). BNN reduces computational complexity by dividing the input attribute set into two parts, one modelled by a Bayesian network and the other by a neural network. The outputs produced by the two classifiers are then combined in the output space by estimating the class-conditional structural mixtures. Empirical studies on a set of natural domains show that BNN has a clear advantage in generalization ability.
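As a rough illustration of the hybrid idea sketched in the abstract (not the paper's exact algorithm), the snippet below routes discrete attributes to a Bayesian classifier and continuous attributes to a neural network, then mixes the two posteriors per class. The class `HybridBNNSketch`, the fixed mixing weight `alpha`, and the scikit-learn estimators are illustrative assumptions; the paper's class-conditional structural mixture estimation is simplified here to a convex combination of posteriors.

```python
# Minimal sketch, assuming discrete attributes are integer-encoded categories
# and continuous attributes are real-valued. The fixed weight `alpha` stands in
# for the paper's estimated class-conditional mixture weights.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier


class HybridBNNSketch:
    def __init__(self, alpha=0.5):
        self.alpha = alpha                        # assumed fixed mixing weight
        self.bayes = CategoricalNB()              # models the discrete attribute subset
        self.net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)

    def fit(self, X_discrete, X_continuous, y):
        # Each component classifier is trained only on its own attribute subset.
        self.bayes.fit(X_discrete, y)
        self.net.fit(X_continuous, y)
        return self

    def predict_proba(self, X_discrete, X_continuous):
        # Combine the two posteriors as a convex mixture per class.
        p_bayes = self.bayes.predict_proba(X_discrete)
        p_net = self.net.predict_proba(X_continuous)
        return self.alpha * p_bayes + (1.0 - self.alpha) * p_net

    def predict(self, X_discrete, X_continuous):
        proba = self.predict_proba(X_discrete, X_continuous)
        return self.bayes.classes_[np.argmax(proba, axis=1)]
```

In this sketch each component only ever sees its own attribute subset, which is what reduces the modelling burden relative to fitting a single network over the full mixed-mode attribute space.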
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Limin, W. (2006). Learning Bayesian-Neural Network from Mixed-Mode Data. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_76
DOI: https://doi.org/10.1007/11893028_76
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46479-2
Online ISBN: 978-3-540-46480-8
eBook Packages: Computer Science, Computer Science (R0)