An enhanced KNN-based twin support vector machine with stable learning rules

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Among the extensions of the twin support vector machine (TSVM), several works have used a K-nearest neighbor (KNN) graph to improve TSVM's classification accuracy. However, these KNN-based TSVM classifiers suffer from two major issues: high computational cost and overfitting. To address these issues, this paper presents an enhanced regularized K-nearest neighbor-based twin support vector machine (RKNN-TSVM). It offers three additional advantages: (1) each sample is weighted according to its distance from its nearest neighbors, which further reduces the effect of noise and outliers on the output model; (2) an extra stabilizer term is added to each objective function, which makes the learning rules of the proposed method stable; and (3) to reduce the computational cost of finding the KNNs of all samples, the location difference of multiple distances-based K-nearest neighbors algorithm (LDMDBA) is embedded into the learning process. Extensive experimental results on several synthetic and benchmark datasets show the effectiveness of the proposed RKNN-TSVM in both classification accuracy and computational time. Moreover, the largest speedup achieved by the proposed method is 14 times.
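
As an illustration of point (1), the sketch below assigns each training sample a weight that decays with its average distance to its K nearest neighbors, so that isolated points (likely noise or outliers) contribute less to the learned hyperplanes. This is a minimal sketch only: it assumes a Gaussian-style decay and uses scikit-learn's NearestNeighbors as a stand-in for the LDMDBA search; the exact weighting scheme and the stabilized objective functions used by RKNN-TSVM are given in the full text.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_sample_weights(X, k=5, gamma=1.0):
    """Weight each sample by its proximity to its K nearest neighbors.

    Samples far from their neighbors (likely noise or outliers) receive
    weights close to 0, while samples inside dense regions receive weights
    close to 1. Illustrative stand-in only, not the authors' exact formulation.
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dists, _ = nn.kneighbors(X)             # first neighbor is the sample itself
    avg_dist = dists[:, 1:].mean(axis=1)    # mean distance to the K true neighbors
    return np.exp(-gamma * avg_dist ** 2)   # Gaussian decay with distance

# Toy example: the last point is an obvious outlier and gets the smallest weight
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1], [5.0, 5.0]])
print(knn_sample_weights(X, k=2))
```

The stabilizer term in point (2) plays the role of an additional regularizer on the hyperplane parameters (in the spirit of regularized TSVM variants), keeping each optimization problem well-conditioned; its exact form, and how weights such as the ones above enter the two objective functions, are specified in the paper.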

Notes

  1. https://github.com/mir-am/LightTwinSVM.

  2. https://www.python.org.

  3. https://gcc.gnu.org.

  4. https://pybind11.readthedocs.io/en/stable/intro.html.

  5. http://archive.ics.uci.edu/ml/datasets.html.

Acknowledgements

Amir M. Mir: This work was done while the author was a master's student at the Islamic Azad University (North Tehran Branch). Submitted with the approval of Jalal A. Nasiri.

Author information

Corresponding author

Correspondence to Jalal A. Nasiri.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Nasiri, J.A., Mir, A.M. An enhanced KNN-based twin support vector machine with stable learning rules. Neural Comput & Applic 32, 12949–12969 (2020). https://doi.org/10.1007/s00521-020-04740-x
