Abstract
Support Vector Regression (SVR) is usually approached via the ε-insensitive loss function; alternatively, the original regression problem can be reduced to a suitably defined classification problem. In either case, slack variables must be introduced in practically interesting problems, and the usual choice is to penalize them linearly. In this work we discuss the solution of an SVR problem by first recasting it as a classification problem and working with square penalties. Besides a general theoretical discussion, we derive consequences for regression problems of the coefficient structure of the resulting SVMs, and we illustrate the procedure on standard benchmark problems as well as on a wind energy forecasting problem.
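To make the contrast in the abstract concrete, the following is a minimal sketch (not the paper's formulation) of the standard ε-insensitive loss next to a square-penalty variant, in which slack outside the ε-tube is penalized quadratically rather than linearly. The function names and the example residuals are illustrative choices, not taken from the paper.

```python
def eps_insensitive(residual, eps=0.1):
    # Standard SVR loss: zero inside the eps-tube, linear slack penalty outside it.
    return max(abs(residual) - eps, 0.0)

def square_penalty(residual, eps=0.1):
    # Square-penalty variant: the same slack, but penalized quadratically.
    return eps_insensitive(residual, eps) ** 2

# Residuals inside the tube (|r| <= eps) incur zero loss under both penalties.
residuals = [-0.3, -0.05, 0.0, 0.08, 0.5]
print([round(eps_insensitive(r), 4) for r in residuals])  # [0.2, 0.0, 0.0, 0.0, 0.4]
print([round(square_penalty(r), 4) for r in residuals])   # [0.04, 0.0, 0.0, 0.0, 0.16]
```

The quadratic penalty grows faster for large violations and yields a differentiable loss at the tube boundary, which is one reason square-penalty formulations lead to a different coefficient structure in the resulting SVM dual.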
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Barbero, Á., López, J., Dorronsoro, J.R. (2007). Square Penalty Support Vector Regression. In: Yin, H., Tino, P., Corchado, E., Byrne, W., Yao, X. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2007. IDEAL 2007. Lecture Notes in Computer Science, vol 4881. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77226-2_55
Print ISBN: 978-3-540-77225-5
Online ISBN: 978-3-540-77226-2