Abstract
A new one-class classification criterion, robust to anomalies in the training dataset, is proposed on the basis of support vector data description (SVDD). The original SVDD formulation is not geometrically consistent: in its optimization problem, the penalty for training objects allowed to fall outside the describing hypersphere is incommensurable with the distance to the hypersphere's center, so the presence of outliers can strongly distort the decision boundary. The proposed criterion eliminates this inconsistency. An equivalent unconstrained form of the criterion allows a kernel-based approach to be applied without passing to the dual problem, yielding a flexible description of the training dataset. Replacing the non-differentiable objective function with a smooth one makes it possible to solve the problem by an algorithm of sequential optimizations. The Jaccard measure is used for a quantitative assessment of the robustness of a decision rule to the presence of outliers. A comparative experimental study against existing one-class methods demonstrates the superiority of the proposed criterion in anomaly detection.
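The general scheme described in the abstract can be illustrated by a minimal sketch: an unconstrained SVDD-style criterion with a smoothed hinge penalty, minimized by plain gradient descent over the hypersphere center and squared radius, plus the Jaccard index for comparing acceptance regions. The smoothing function, step sizes, and stopping rule below are illustrative assumptions, not the exact surrogate or solver used in the paper.

```python
import numpy as np

def smooth_hinge(z, eps=1e-3):
    # Smooth, differentiable surrogate for max(0, z); eps controls the
    # sharpness of the approximation (hypothetical choice, not the paper's).
    return 0.5 * (z + np.sqrt(z * z + eps * eps))

def fit_svdd(X, C=1.0, lr=0.05, n_iter=2000, eps=1e-3):
    """Gradient descent on the unconstrained, smoothed SVDD criterion
        r2 + C * sum_i smooth_hinge(||x_i - a||^2 - r2)
    over the center a and the squared radius r2."""
    a = X.mean(axis=0).astype(float)                  # start at the sample mean
    r2 = float(np.mean(np.sum((X - a) ** 2, axis=1)))  # start at mean sq. distance
    n = len(X)
    for _ in range(n_iter):
        d2 = np.sum((X - a) ** 2, axis=1)   # squared distances to the center
        z = d2 - r2
        # derivative of smooth_hinge with respect to z
        g = 0.5 * (1.0 + z / np.sqrt(z * z + eps * eps))
        grad_a = C * np.sum(g[:, None] * (-2.0) * (X - a), axis=0)
        grad_r2 = 1.0 - C * np.sum(g)
        a -= lr * grad_a / n
        r2 -= lr * grad_r2 / n
        r2 = max(r2, 1e-8)                  # keep the squared radius positive
    return a, r2

def jaccard(mask_a, mask_b):
    # Jaccard index of two boolean acceptance masks: |A & B| / |A | B|.
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0
```

Training the model twice, with and without injected outliers, and computing the Jaccard index of the two resulting inlier masks gives the kind of quantitative robustness score the abstract refers to: a value close to 1 means the decision region barely moved.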
Acknowledgements
The work is supported by the Russian Foundation for Basic Research, grants no. 18-07-00942 and no. 20-07-00441.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Larin, A.O., Seredin, O.S., Kopylov, A.V. (2021). One-Class Classification Criterion Robust to Anomalies in Training Dataset. In: Del Bimbo, A., et al. Pattern Recognition. ICPR International Workshops and Challenges. ICPR 2021. Lecture Notes in Computer Science(), vol 12665. Springer, Cham. https://doi.org/10.1007/978-3-030-68821-9_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-68820-2
Online ISBN: 978-3-030-68821-9