DOI: 10.1145/1273496.1273552
Article

Kernelizing PLS, degrees of freedom, and efficient model selection

Published: 20 June 2007

Abstract

Kernelizing partial least squares (PLS), an algorithm that has been particularly popular in chemometrics, leads to kernel PLS, which has several interesting properties, including a sub-cubic runtime for learning and an iterative construction of the directions that are relevant for predicting the outputs. We show that the kernelization of PLS introduces interesting properties not found in ordinary PLS, giving novel insight into the workings of kernel PLS and its connections to kernel ridge regression and conjugate gradient descent methods. Furthermore, we show how to correctly define the degrees of freedom for kernel PLS and how to compute an unbiased estimate efficiently. Finally, we address the practical problem of model selection: we demonstrate how to use the degrees-of-freedom estimate to perform effective model selection, and discuss how to implement cross-validation schemes efficiently.
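The iterative construction of predictive directions mentioned in the abstract can be sketched as a NIPALS-style kernel PLS loop for a single response, in the spirit of Rosipal and Trejo's formulation. This is a minimal illustration, not the authors' implementation: all names are chosen for exposition, and the degrees-of-freedom estimator the paper develops is not reproduced here. Each component costs O(n²) thanks to rank-one deflation, which is where the sub-cubic runtime for m ≪ n components comes from.

```python
import numpy as np

def kernel_pls_fit(K, y, m):
    """NIPALS-style kernel PLS for a single centered response (sketch).

    K : (n, n) centered kernel matrix
    y : (n,)   centered response vector
    m : number of latent components

    Returns the orthonormal score vectors T (n, m) and the fitted values.
    """
    n = K.shape[0]
    Kd, yd = K.copy(), y.copy()
    T = np.zeros((n, m))
    for i in range(m):
        t = Kd @ yd                       # direction built from the current residual
        t /= np.linalg.norm(t)
        T[:, i] = t
        # Rank-one deflation, (I - t t')Kd(I - t t') expanded:
        # O(n^2) per component, hence sub-cubic overall for m << n.
        Kt = Kd @ t
        Kd = Kd - np.outer(t, Kt) - np.outer(Kt, t) + (t @ Kt) * np.outer(t, t)
        yd = yd - t * (t @ yd)
    y_hat = T @ (T.T @ y)                 # fit = projection of y onto the score space
    return T, y_hat

# Toy usage with an RBF kernel (data and parameters purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-0.5 * sq_dists)
C = np.eye(30) - np.ones((30, 30)) / 30   # centering matrix
K, y = C @ K @ C, y - y.mean()
T, y_hat = kernel_pls_fit(K, y, m=5)
```

The score vectors come out orthonormal because each deflation removes the current direction from both the kernel and the residual, so the fitted values reduce to an orthogonal projection of y onto the span of the scores.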




Published In

ICML '07: Proceedings of the 24th International Conference on Machine Learning
June 2007, 1233 pages
ISBN: 9781595937933
DOI: 10.1145/1273496

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Sponsor: Machine Learning Journal

Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers: Article

Conference: ICML '07 & ILP '07

Acceptance Rate: 140 of 548 submissions, 26% (overall)


Cited By

• (2024) Dynamic multilayer functional connectivity detects preclinical and clinical Alzheimer's disease. Cerebral Cortex, 34(2). DOI: 10.1093/cercor/bhad542
• (2021) Monte Carlo methods for estimating Mallows's Cp and AIC criteria for PLSR models. Illustration on agronomic spectroscopic NIR data. Journal of Chemometrics, 35(10). DOI: 10.1002/cem.3369
• (2020) Approximate kernel partial least squares. Annals of Mathematics and Artificial Intelligence. DOI: 10.1007/s10472-020-09694-3
• (2017) Kernel partial least squares for stationary data. The Journal of Machine Learning Research, 18(1), 4447-4487. DOI: 10.5555/3122009.3176867
• (2016) Discriminant sparse label-sensitive embedding. Engineering Applications of Artificial Intelligence, 50, 168-176. DOI: 10.1016/j.engappai.2016.01.035
• (2012) Near Optimal Prediction from Relevant Components. Scandinavian Journal of Statistics, 39(4), 695-713. DOI: 10.1111/j.1467-9469.2011.00770.x
• (2009) Regularized estimation of large-scale gene association networks using graphical Gaussian models. BMC Bioinformatics, 10(1). DOI: 10.1186/1471-2105-10-384
• (2009) Machine Learning Techniques for the Analysis of Magnetic Flux Leakage Images in Pipeline Inspection. IEEE Transactions on Magnetics, 45(8), 3073-3084. DOI: 10.1109/TMAG.2009.2020160
• (2009) Relief wrapper based Kernel Partial Least Squares subspace selection. 2nd IEEE International Conference on Computer Science and Information Technology, 44-48. DOI: 10.1109/ICCSIT.2009.5234751
• (2008) Partial least squares regression for graph mining. Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 578-586. DOI: 10.1145/1401890.1401961
