
Support vector machine via nonlinear rescaling method

  • Original Paper
  • Published in: Optimization Letters

Abstract

In this paper we construct the linear support vector machine (SVM) based on the nonlinear rescaling (NR) methodology (see [Polyak in Math Program 54:177–222, 1992; Polyak in Math Program Ser A 92:197–235, 2002; Polyak and Teboulle in Math Program 76:265–284, 1997] and references therein). The formulation of the linear SVM based on the NR method leads to an algorithm which reduces the number of support vectors without compromising classification performance compared to the linear soft-margin SVM formulation. The NR algorithm computes both the primal and the dual approximation at each step. The dual variables associated with the given data set provide important information about each data point and play the key role in selecting the set of support vectors. Experimental results on ten benchmark classification problems show that the NR formulation is feasible. The quality of discrimination is, in most instances, comparable to that of the linear soft-margin SVM, while the number of support vectors is in several instances substantially reduced.
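The construction can be made concrete with a small sketch. The code below is an illustrative reconstruction of an NR-type primal-dual iteration for the linear soft-margin SVM, not the authors' exact algorithm: the shifted-log transformation ψ(t) = ln(1 + t), the use of SciPy's L-BFGS-B solver for the inner minimization, the fixed scaling parameter k, and the threshold used to read support vectors off the dual variables are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize


def nr_linear_svm(X, y, C=1.0, k=10.0, outer_iters=20):
    """NR-style primal-dual iteration for the linear soft-margin SVM (sketch).

    The margin constraints y_i (w.x_i + b) >= 1 - xi_i are rescaled by a
    smooth, monotone transformation psi with psi(0) = 0 and psi'(0) = 1;
    after each inner minimization the dual variables are updated
    multiplicatively.
    """
    n, d = X.shape
    lam = np.ones(n)  # dual estimates for the margin constraints

    def psi(t):
        # Shifted-log transformation (an assumption for this sketch);
        # clamped so the barrier stays finite for the numerical solver.
        return np.log1p(np.maximum(t, -1.0 + 1e-12))

    def dpsi(t):
        return 1.0 / np.maximum(1.0 + t, 1e-12)

    def rescaled_lagrangian(z):
        w, b, xi = z[:d], z[d], z[d + 1:]
        margins = y * (X @ w + b) - 1.0 + xi          # c_i(w, b, xi) >= 0
        return 0.5 * w @ w + C * xi.sum() - (lam @ psi(k * margins)) / k

    z = np.zeros(d + 1 + n)
    # xi >= 0 is kept as simple bounds rather than a second rescaled constraint.
    bounds = [(None, None)] * (d + 1) + [(0.0, None)] * n
    for _ in range(outer_iters):
        z = minimize(rescaled_lagrangian, z, method="L-BFGS-B", bounds=bounds).x
        w, b, xi = z[:d], z[d], z[d + 1:]
        margins = y * (X @ w + b) - 1.0 + xi
        lam = lam * dpsi(k * margins)                  # dual (multiplier) update
    return w, b, lam


if __name__ == "__main__":
    # Two Gaussian clouds as toy data.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, (40, 2)), rng.normal(2.0, 1.0, (40, 2))])
    y = np.hstack([-np.ones(40), np.ones(40)])
    w, b, lam = nr_linear_svm(X, y)
    # Points whose dual variable stays well above zero are kept as support vectors.
    support = np.flatnonzero(lam > 1e-3 * lam.max())
    print("w =", w, "b =", b, "support vectors:", support.size)
```

In this sketch, the dual variables of points that end up well outside the margin decay toward zero across the outer iterations and drop out of the support set, which mirrors the role the abstract assigns to the dual variables in selecting support vectors.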

References

  1. Blake, C.L., Merz, C.J.: UCI Repository of machine learning databases. http://www.ics.uci.edu/~mlearn/MLRepository.html. Department of Information and Computer Sciences, University of California, Irvine (1998)

  2. Boser, B.E., Guyon, I., Vapnik, V.: A training algorithm for optimal margin classifiers. In: COLT, pp. 144–152. ACM, Pittsburgh (1992)

  3. Burges, C.J.C.: Simplified support vector decision rules. In: ICML, San Francisco, pp. 71–77 (1996)

  4. Chang, C.-C., Lin, C.-J.: LIBSVM: a library for support vector machines (2001)

  5. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)

  6. Guyon, I., Matic, N., Vapnik, V.: Discovering informative patterns and data cleaning. In: Advances in Knowledge Discovery and Data Mining, pp. 181–203 (1996)

  7. Jensen, D., Polyak, R.: The convergence of a modified barrier method for convex programming. IBM J. Res. Dev. 38, 307–321 (1994)

  8. Melman, A., Polyak, R.: The Newton modified barrier method for QP problems. Ann. Oper. Res. 62, 465–519 (1996)

  9. Nesterov, Y., Nemirovskii, A.: Interior-Point Polynomial Algorithms in Convex Programming. Society for Industrial and Applied Mathematics, Philadelphia (1994)

  10. Nguyen, D., Ho, T.B.: An efficient method for simplifying support vector machine. In: Proceedings of the 22nd International Conference on Machine Learning, pp. 617–624 (2005)

  11. Polyak, R.: Modified barrier functions (theory and methods). Math. Program. 54, 177–222 (1992)

  12. Polyak, R.: Nonlinear rescaling vs. smoothing technique in convex optimization. Math. Program. Ser. A 92, 197–235 (2002)

  13. Polyak, R., Griva, I.: Primal-dual nonlinear rescaling method for convex optimization. J. Optim. Theory Appl. 122(1), 111–156 (2004)

  14. Polyak, R., Teboulle, M.: Nonlinear rescaling and proximal-like methods in convex optimization. Math. Program. 76, 265–284 (1997)

  15. Rätsch, G., Onoda, T., Müller, K.-R.: Soft margins for AdaBoost. Mach. Learn. 42, 287–320 (2001)

  16. Vapnik, V.N.: The Nature of Statistical Learning Theory, 2nd edn. Springer, Berlin Heidelberg New York (2000)

  17. Wu, M., Schölkopf, B., Bakir, G.: A direct method for building sparse kernel learning algorithms. J. Mach. Learn. Res. 7, 603–624 (2006)


Author information

Corresponding author

Correspondence to Shen-Shyang Ho.

About this article

Cite this article

Polyak, R., Ho, SS. & Griva, I. Support vector machine via nonlinear rescaling method. Optimization Letters 1, 367–378 (2007). https://doi.org/10.1007/s11590-006-0033-2
