DCSVM: fast multi-class classification using support vector machines

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Using binary classification techniques to perform multi-class classification of data is still of great practical interest due to the robustness and simplicity of binary classifiers. These techniques produce a single multi-class classification decision based on many binary decisions. Our work relies on the simple observation that as dimensionality increases so does data sparsity and, consequently, a single binary classifier may separate multiple classes. Therefore, we claim that the number of binary decisions can be significantly reduced. We present Divide and Conquer Support Vector Machines (DCSVM), an efficient algorithm for multi-class classification using Support Vector Machines. DCSVM is a divide and conquer algorithm that relies on data sparsity in high-dimensional space and performs a smart partitioning of the whole training data set into disjoint subsets that are easily separable. A single prediction performed between two partitions eliminates one or more classes in one partition at once, leaving only a reduced number of candidate classes for subsequent steps. The algorithm continues recursively, reducing the number of classes at each step, until a final binary decision is made between the last two classes left in the competition. In the best case, our algorithm makes a final decision among \(k\) classes in \(O(\log k)\) decision steps; in the worst case, it makes a final decision in \(k-1\) steps, which is no worse than existing techniques.
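The sketch below illustrates the prediction strategy described in the abstract; it is not the authors' implementation. Pairwise SVMs are trained as in one-vs-one, and a precomputed table records which classes each pairwise separator sends almost entirely to one side, so a single binary decision can discard several candidate classes at once. The function names, the elimination threshold `theta`, and the use of scikit-learn's `SVC` are illustrative assumptions.

```python
# Minimal DCSVM-style sketch (illustrative, not the published implementation).
# `theta` is an assumed purity threshold for deciding when a class can be
# eliminated by a pairwise separator.
import numpy as np
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.svm import SVC


def train_dcsvm(X, y, theta=0.95):
    classes = np.unique(y)
    models, side_table = {}, {}
    for i, j in combinations(classes, 2):
        mask = np.isin(y, [i, j])
        clf = SVC(kernel="rbf", gamma="scale").fit(X[mask], y[mask])
        models[(i, j)] = clf
        # For every class, record which side of the (i, j) separator its
        # training points fall on; "both" means the class straddles it.
        sides = {}
        for c in classes:
            frac_i = np.mean(clf.predict(X[y == c]) == i)
            sides[c] = ("i" if frac_i >= theta
                        else "j" if frac_i <= 1 - theta
                        else "both")
        side_table[(i, j)] = sides
    return classes, models, side_table


def predict_dcsvm(x, classes, models, side_table):
    candidates = set(classes)
    while len(candidates) > 1:
        # Query the separator for any pair of remaining candidate classes.
        i, j = sorted(candidates)[:2]
        winner = models[(i, j)].predict(x.reshape(1, -1))[0]
        loser = j if winner == i else i
        side = "i" if winner == i else "j"
        sides = side_table[(i, j)]
        # Keep the winner and every class not confined to the losing side;
        # the loser and classes on its side are eliminated by this decision.
        candidates = {c for c in candidates
                      if c == winner or sides[c] in (side, "both")}
        candidates.discard(loser)
    return candidates.pop()


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    classes, models, table = train_dcsvm(X, y)
    print([predict_dcsvm(x, classes, models, table) for x in X[:5]])
```

Because each query removes at least the losing class, prediction takes at most \(k-1\) binary evaluations; when a separator isolates large groups of classes on each side, the candidate set shrinks geometrically, which corresponds to the \(O(\log k)\) best case described above.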





Acknowledgements

We would like to express our appreciation to the anonymous reviewers for their numerous suggestions, which improved the quality of this paper. We owe our reviewers not only corrections and improvements but also several interesting directions for future investigation.

Author information

Corresponding author

Correspondence to Ionut E. Iacob.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Don, D.R., Iacob, I.E. DCSVM: fast multi-class classification using support vector machines. Int. J. Mach. Learn. & Cyber. 11, 433–447 (2020). https://doi.org/10.1007/s13042-019-00984-9

