
Kernel \(\ell ^1\)-norm principal component analysis for denoising

Original Paper · Optimization Letters

Abstract

In this paper we describe a method for denoising data using kernel principal component analysis (KPCA) that recovers preimages of the intrinsic variables in feature space with a single line search along the gradient descent direction of the squared projection error. The method combines a projection-free preimage estimation algorithm with an \(\ell ^1\)-norm KPCA. Together, these two stages offer distinct advantages over other KPCA preimage methods: they are insensitive to outliers and computationally efficient. The method can improve the results of a range of unsupervised learning tasks, such as denoising and clustering. Numerical experiments on the Amsterdam Library of Object Images and on synthetic data demonstrate that the proposed method achieves lower mean squared error than its \(\ell ^2\)-norm analogue.
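To sketch the two stages concretely, consider the following schematic formulation; it omits feature-space centering and uses generic symbols (the kernel \(k\), coefficients \(\alpha _q\), step size \(\eta\)) that are assumptions rather than the paper's exact notation. With training points \(x_1,\dots ,x_n\), kernel matrix \(K_{ij}=k(x_i,x_j)\), and principal directions \(v_q=\sum _j \alpha _{qj}\,\phi (x_j)\) in feature space, one common \(\ell ^1\)-norm KPCA formulation and the associated squared projection error are

\[
\max _{\alpha _q}\ \Vert K\alpha _q\Vert _1 \quad \text{s.t.}\quad \alpha _q^{\top }K\alpha _q = 1,
\qquad
E(z) = k(z,z) - \sum _{q=1}^{Q}\Big(\sum _{j=1}^{n}\alpha _{qj}\,k(x_j,z)\Big)^{2}.
\]

The denoised preimage of a noisy point \(x\) is then obtained from a single line search along the negative gradient of \(E\),

\[
\hat{z} = x - \eta ^{*}\,\nabla _z E(x),
\qquad
\eta ^{*} = \arg \min _{\eta \ge 0} E\big(x - \eta \,\nabla _z E(x)\big),
\]

so the preimage is recovered entirely from kernel evaluations, without an explicit inverse map from feature space.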

Data availability

The image datasets generated and/or analysed during the current study are available in the Amsterdam Library of Object Images repository at https://aloi.science.uva.nl.


Acknowledgements

The authors would like to express their sincere gratitude to the anonymous reviewers for their valuable insights. Algorithm 1 is implemented in C, using key functionality from the Intel oneAPI Math Kernel Library for performance on Intel CPU architectures. The code is openly available and can be forked from the repository at https://github.com/lingxpca/kl1pca.git.
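As a minimal sketch of how oneMKL's BLAS functionality can be used in such an implementation, the following C snippet builds a Gaussian (RBF) kernel matrix from a row-major data matrix with a single cblas_dgemm call; the function rbf_kernel and all variable names are hypothetical illustrations, not excerpts from the released code.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <mkl.h>   /* oneMKL CBLAS interface */

/* Build the n-by-n Gaussian kernel matrix K from an n-by-d row-major data
 * matrix X, i.e. K[i*n+j] = exp(-||x_i - x_j||^2 / (2*sigma^2)).
 * All pairwise inner products are computed with one dgemm call. */
static void rbf_kernel(const double *X, int n, int d, double sigma, double *K)
{
    double *G  = (double *)mkl_malloc((size_t)n * n * sizeof(double), 64);
    double *sq = (double *)mkl_malloc((size_t)n * sizeof(double), 64);

    /* G = X * X^T (Gram matrix of inner products) */
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasTrans,
                n, n, d, 1.0, X, d, X, d, 0.0, G, n);

    for (int i = 0; i < n; ++i)          /* squared norms ||x_i||^2 */
        sq[i] = G[i * n + i];

    double gamma = 1.0 / (2.0 * sigma * sigma);
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) {
            double d2 = sq[i] + sq[j] - 2.0 * G[i * n + j];
            K[i * n + j] = exp(-gamma * d2);
        }

    mkl_free(G);
    mkl_free(sq);
}

int main(void)
{
    int n = 4, d = 2;
    double X[] = { 0.0, 0.0,  1.0, 0.0,  0.0, 1.0,  1.0, 1.0 };
    double *K = (double *)malloc((size_t)n * n * sizeof(double));

    rbf_kernel(X, n, d, 1.0, K);

    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j)
            printf("%6.3f ", K[i * n + j]);
        printf("\n");
    }
    free(K);
    return 0;
}

Compiled and linked against oneMKL (for instance with icx kernel.c -qmkl), the same pattern of forming Gram matrices with level-3 BLAS carries over to the matrix products required by the \(\ell ^1\)-norm KPCA iterations.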

Author information

Corresponding author

Correspondence to Xiao Ling.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ling, X., Bui, A. & Brooks, P. Kernel \(\ell ^1\)-norm principal component analysis for denoising. Optim Lett 18, 2133–2148 (2024). https://doi.org/10.1007/s11590-023-02051-3
