Abstract
In this paper, we propose a novel regularization criterion for robust classifiers. The criterion can produce many types of regularization terms through the choice of an appropriate weighting function. In particular, the L2 regularization term used in support vector machines (SVMs) is obtained from this criterion when the norms of the patterns are normalized. Building on the new criterion, we propose two novel regularization terms suited to a variety of applications, and we derive new classifiers by applying these terms to conventional SVMs. Finally, we report an experiment demonstrating the advantages of these novel classifiers.
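The abstract does not reproduce the criterion itself, so the sketch below only illustrates the general idea of attaching a weighted quadratic regularizer to an SVM objective. Everything in it is an illustrative assumption rather than the authors' formulation: the function name train_weighted_svm, the per-feature weight vector, and the hyperparameters are hypothetical, and the paper's actual weighting functions are not shown here. With uniform weights the penalty reduces to the standard L2 term mentioned above.

```python
import numpy as np

def train_weighted_svm(X, y, weights, lam=1.0, lr=0.01, epochs=200):
    """Linear SVM trained by subgradient descent on
        (1/n) * sum_i max(0, 1 - y_i w^T x_i) + lam * sum_j weights_j * w_j^2
    (hypothetical sketch). With weights_j = 1 for all j, the penalty is the
    ordinary L2 (ridge) regularizer used in standard SVMs.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                      # samples violating the margin
        grad_loss = -(X[active].T @ y[active]) / n  # subgradient of averaged hinge loss
        grad_reg = 2.0 * lam * weights * w          # gradient of weighted quadratic penalty
        w -= lr * (grad_loss + grad_reg)
    return w

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
weights = np.array([1.0, 1.0])   # uniform weights -> plain L2 regularization
w = train_weighted_svm(X, y, weights)
print("learned weight vector:", w)
```

With non-uniform weights, individual components of w are penalized unequally, which is one simple way a weighting function can reshape the regularization term.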
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Yokota, T., Yamashita, Y. (2011). Support Vector Machines with Weighted Regularization. In: Lu, BL., Zhang, L., Kwok, J. (eds) Neural Information Processing. ICONIP 2011. Lecture Notes in Computer Science, vol 7063. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24958-7_55
DOI: https://doi.org/10.1007/978-3-642-24958-7_55
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-24957-0
Online ISBN: 978-3-642-24958-7