Abstract
We follow the idea of decomposing a large data set into smaller groups, and present a novel distance-based method for selecting potential support vectors in each group by means of the kernel matrix. Potential support vectors selected in one group are passed to the next group for further selection. Quadratic programming is performed only once, on the potential support vectors still retained in the last group, to construct an optimal hyperplane. This avoids solving unnecessary quadratic programming problems at intermediate stages, and allows control over the number of selected potential support vectors so as to cope with limited memory capacity and the capabilities of existing optimizers. Since this distance-based method breaks down on data containing outliers and noise, we introduce the idea of separating the outliers/noise from the base, by means of the k-nearest neighbor algorithm, to improve generalization ability. Two optimal hyperplanes are constructed, one on the base part and one on the outlier/noise part, which are then synthesized to derive the optimal hyperplane on the overall data.
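The abstract does not spell out the selection rule, so the following is only a minimal sketch of the general scheme it describes, under two assumptions: distances are computed in feature space via the kernel trick, ‖φ(x)−φ(y)‖² = k(x,x) − 2k(x,y) + k(y,y), and in each group the points kept as potential support vectors are those closest in feature space to the nearest point of the opposite class (points near the class boundary). All function names (`select_candidates`, `chunked_selection`) and the fixed budget `m` are hypothetical, not from the paper.

```python
import numpy as np

def select_candidates(X, y, kernel, m):
    """Keep the m points whose feature-space distance to the nearest
    opposite-class point is smallest (hypothetical selection rule)."""
    K = kernel(X, X)
    # Kernel trick for distances: ||phi(x_i)-phi(x_j)||^2 = K_ii - 2 K_ij + K_jj
    d2 = np.diag(K)[:, None] - 2.0 * K + np.diag(K)[None, :]
    scores = np.empty(len(y))
    for i in range(len(y)):
        opp = np.where(y != y[i])[0]          # indices of the opposite class
        scores[i] = d2[i, opp].min()          # distance to nearest opponent
    return np.argsort(scores)[:m]

def chunked_selection(X, y, kernel, group_size, m):
    """Process the data group by group; candidates selected in one group
    are carried into the next, so only one QP is needed at the end."""
    carried_X, carried_y = X[:0], y[:0]
    for start in range(0, len(y), group_size):
        gX = np.vstack([carried_X, X[start:start + group_size]])
        gy = np.concatenate([carried_y, y[start:start + group_size]])
        keep = select_candidates(gX, gy, kernel, m)
        carried_X, carried_y = gX[keep], gy[keep]
    # A single quadratic program would now be solved on these survivors.
    return carried_X, carried_y

# Example kernel: Gaussian RBF with bandwidth 1.
rbf = lambda A, B: np.exp(-0.5 * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
```

Note the memory-control point from the abstract: the QP solver and the kernel matrix only ever see at most `m + group_size` points at a time, regardless of the total data size.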
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, B. (2004). Distance-Based Selection of Potential Support Vectors by Kernel Matrix. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_78
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22841-7
Online ISBN: 978-3-540-28647-9