Abstract
The kernel minimum squared error estimation (KMSE) model can be viewed as a general framework that includes kernel Fisher discriminant analysis (KFDA), least squares support vector machine (LS-SVM), and kernel ridge regression (KRR) as particular cases. For continuous real-valued outputs, the equivalence of KMSE and LS-SVM is shown in this paper. We apply standard methods for computing prediction intervals in nonlinear regression to the KMSE model. The simulation results show that LS-SVM has better performance in terms of the prediction intervals and mean squared error (MSE). The experiment on a real data set indicates that KMSE compares favorably with other methods.
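Since for real-valued outputs KMSE coincides with kernel ridge regression, the standard nonlinear-regression prediction interval can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact procedure: the RBF kernel, the regularization value, the effective-degrees-of-freedom variance estimate, and the normal z-quantile are all assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B (illustrative choice)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmse_predict_interval(X, y, Xnew, lam=1e-2, gamma=1.0, z=1.96):
    """KMSE / kernel ridge fit with an approximate prediction interval.

    Model: f(x) = k(x)^T alpha with alpha = (K + lam*I)^{-1} y.
    Interval: linearization-based, assuming i.i.d. Gaussian noise.
    """
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    G = K + lam * np.eye(n)
    alpha = np.linalg.solve(G, y)

    # Smoother ("hat") matrix S = K (K + lam*I)^{-1}; here computed as G^{-1} K,
    # which equals S because K and G^{-1} commute.
    S = np.linalg.solve(G, K)
    resid = y - S @ y
    dof = n - np.trace(S)                  # effective residual degrees of freedom
    sigma2 = (resid @ resid) / dof         # noise variance estimate

    Knew = rbf_kernel(Xnew, X, gamma)
    pred = Knew @ alpha
    # Var[f(x*)] = sigma^2 * k*^T (K + lam*I)^{-2} k* = sigma^2 * ||G^{-1} k*||^2;
    # add sigma^2 for the noise of the new observation.
    B = np.linalg.solve(G, Knew.T)
    v = (B ** 2).sum(axis=0)
    se = np.sqrt(sigma2 * (1.0 + v))
    return pred, pred - z * se, pred + z * se
```

For example, fitting noisy samples of a sine function and querying a few new inputs returns point predictions bracketed by lower and upper interval bounds; the z = 1.96 quantile gives a nominal 95% interval under the Gaussian-noise assumption.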
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Hwang, C., Seok, K.H., Cho, D. (2005). A Prediction Interval Estimation Method for KMSE. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_69
DOI: https://doi.org/10.1007/11539087_69
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28323-2
Online ISBN: 978-3-540-31853-8
eBook Packages: Computer Science (R0)