Abstract
The theoretical framework of Statistical Learning Theory (SLT) for pattern recognition is extended to situations where the loss function takes infinite values, so that misclassifications in specified regions are prevented with high reliability.
Sufficient conditions for the consistency of the Empirical Risk Minimization (ERM) criterion are then established, and an explicit bound is derived in terms of the VC dimension of the class of decision functions used to solve the problem.
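The idea described in the abstract can be illustrated with a minimal sketch (not the authors' construction): misclassifications inside a designated "protected" region incur infinite loss, so the ERM criterion automatically discards any hypothesis that errs there. All names, thresholds, and the sample below are hypothetical.

```python
import math

# Hypothetical unbounded loss: an error on a point whose feature x lies in
# the protected region (here, x >= 0.8) costs +inf; elsewhere, 0-1 loss.
def loss(y_true, y_pred, x, protected=lambda x: x >= 0.8):
    if y_true == y_pred:
        return 0.0
    return math.inf if protected(x) else 1.0

def empirical_risk(h, sample):
    # Average loss of hypothesis h over the sample; +inf as soon as h
    # misclassifies a single protected point.
    return sum(loss(y, h(x), x) for x, y in sample) / len(sample)

def erm(hypotheses, sample):
    # ERM picks the hypothesis minimizing empirical risk; any hypothesis
    # with an error in the protected region has infinite risk and loses.
    return min(hypotheses, key=lambda h: empirical_risk(h, sample))

# A small class of threshold classifiers h_t(x) = 1 if x >= t else 0
thresholds = [0.2, 0.5, 0.9]
hypotheses = [lambda x, t=t: 1 if x >= t else 0 for t in thresholds]

sample = [(0.1, 0), (0.4, 1), (0.6, 1), (0.85, 1), (0.95, 1)]
best = erm(hypotheses, sample)
```

Here the threshold t = 0.9 is excluded outright because it misclassifies the protected point x = 0.85, regardless of how well it does elsewhere; the paper's contribution is to show when this ERM procedure remains consistent, with a bound governed by the VC dimension of the hypothesis class.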
Copyright information
© 2005 Springer
Cite this paper
Muselli, M., Ruffino, F. (2005). Consistency of Empirical Risk Minimization for Unbounded Loss Functions. In: Apolloni, B., Marinaro, M., Tagliaferri, R. (eds) Biological and Artificial Intelligence Environments. Springer, Dordrecht. https://doi.org/10.1007/1-4020-3432-6_31
Publisher Name: Springer, Dordrecht
Print ISBN: 978-1-4020-3431-2
Online ISBN: 978-1-4020-3432-9