Abstract
In this paper we construct the linear support vector machine (SVM) based on the nonlinear rescaling (NR) methodology (see [Polyak in Math Program 54:177–222, 1992; Polyak in Math Program Ser A 92:197–235, 2002; Polyak and Teboulle in Math Program 76:265–284, 1997] and references therein). Formulating the linear SVM via the NR method leads to an algorithm that reduces the number of support vectors without compromising classification performance relative to the linear soft-margin SVM formulation. The NR algorithm computes both a primal and a dual approximation at each step. The dual variables associated with the given data set carry important information about each data point and play the key role in selecting the set of support vectors. Experimental results on ten benchmark classification problems show that the NR formulation is feasible. In most instances the quality of discrimination is comparable to that of the linear soft-margin SVM, while in several instances the number of support vectors was substantially reduced.
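The NR-specific multiplier updates are developed in the paper itself; as a rough illustration of the role the dual variables play in identifying support vectors, the sketch below trains the standard linear soft-margin SVM by dual coordinate descent on the hinge-loss dual. This is an assumption-laden stand-in, not the authors' NR algorithm: all function and variable names are illustrative, and at convergence the data points with nonzero dual variables `alpha[i]` are exactly the support vectors.

```python
import random


def train_linear_svm(X, y, C=1.0, epochs=50, seed=0):
    """Dual coordinate descent for the linear soft-margin SVM (hinge loss).

    NOT the paper's NR method -- a standard baseline sketch.  Returns the
    weight vector w and the dual variables alpha; points with alpha[i] > 0
    are the support vectors, mirroring the selection role that the dual
    variables play in the NR formulation.
    """
    n, d = len(X), len(X[0])
    w = [0.0] * d
    alpha = [0.0] * n
    # Diagonal of the Gram-like matrix Q, Q_ii = ||x_i||^2.
    qii = [sum(v * v for v in X[i]) for i in range(n)]
    rng = random.Random(seed)
    order = list(range(n))
    for _ in range(epochs):
        rng.shuffle(order)
        for i in order:
            if qii[i] == 0.0:
                continue
            # Gradient of the dual objective in coordinate i.
            g = y[i] * sum(w[j] * X[i][j] for j in range(d)) - 1.0
            # Project the unconstrained step back onto the box [0, C].
            a_new = min(max(alpha[i] - g / qii[i], 0.0), C)
            delta = a_new - alpha[i]
            if delta != 0.0:
                # Keep w = sum_i alpha_i y_i x_i in sync with alpha.
                for j in range(d):
                    w[j] += delta * y[i] * X[i][j]
                alpha[i] = a_new
    return w, alpha


if __name__ == "__main__":
    # Toy separable data: the inner points are the natural support vectors.
    X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]]
    y = [1, 1, -1, -1]
    w, alpha = train_linear_svm(X, y, C=1.0)
    sv = [i for i, a in enumerate(alpha) if a > 1e-8]
    print("w =", w, "support vector indices:", sv)
```

In the same spirit as the paper's use of dual information, one would keep only the points with large `alpha[i]` as support vectors; the NR method obtains such dual estimates at every step rather than only at convergence.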
References
Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases. http://www.ics.uci.edu/~mlearn/MLRepository.html. Department of Information and Computer Sciences, University of California, Irvine (1998)
Boser, B.E., Guyon, I., Vapnik, V.: A training algorithm for optimal margin classifiers. In: COLT, pp. 144–152. ACM, Pittsburgh (1992)
Burges, C.J.C.: Simplified support vector decision rules. In: ICML, San Francisco, pp. 71–77 (1996)
Chang, C.-C., Lin, C.-J.: LIBSVM: a library for support vector machines (2001)
Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)
Guyon, I., Matic, N., Vapnik, V.: Discovering informative patterns and data cleaning. In: Advances in Knowledge Discovery and Data Mining, pp. 181–203 (1996)
Jensen, D., Polyak, R.: The convergence of a modified barrier method for convex programming. IBM J. Res. Dev. 38, 307–321 (1994)
Melman, A., Polyak, R.: The Newton modified barrier method for QP problems. Ann. Oper. Res. 62, 465–519 (1996)
Nesterov, Y., Nemirovskii, A.: Interior-point Polynomial Algorithms in Convex Programming. Society for Industrial and Applied Mathematics, Philadelphia (1994)
Nguyen, D., Ho, T.B.: An efficient method for simplifying support vector machine. In: Proceedings of the 22nd International Conference on Machine Learning, pp. 617–624 (2005)
Polyak, R.: Modified barrier functions (theory and methods). Math. Program. 54, 177–222 (1992)
Polyak, R.: Nonlinear rescaling vs smoothing technique in convex optimization. Math. Program. Ser. A 92, 197–235 (2002)
Polyak, R., Griva, I.: Primal-dual nonlinear rescaling method for convex optimization. J. Optim. Theory Appl. 122(1), 111–156 (2004)
Polyak, R., Teboulle, M.: Nonlinear rescaling and proximal-like methods in convex optimization. Math. Program. 76, 265–284 (1997)
Rätsch, G., Onoda, T., Müller, K.-R.: Soft margins for AdaBoost. Mach. Learn. 42, 287–320 (2001)
Vapnik, V.N.: The Nature of Statistical Learning Theory, 2nd edn. Springer, Berlin Heidelberg New York (2000)
Wu, M., Schölkopf, B., Bakir, G.: A direct method for building sparse kernel learning algorithms. J. Mach. Learn. Res. 7, 603–624 (2006)
Polyak, R., Ho, SS. & Griva, I. Support vector machine via nonlinear rescaling method. Optimization Letters 1, 367–378 (2007). https://doi.org/10.1007/s11590-006-0033-2