
A Unifying View of Sparse Approximate Gaussian Process Regression

Published: 01 December 2005

Abstract

We provide a new unifying view that includes all existing proper probabilistic sparse approximations for Gaussian process regression. Our approach relies on expressing the effective prior that each method uses. This yields new insights and highlights the relationships between existing methods. It also allows a clear, theoretically justified ranking of how closely the known approximations match the corresponding full GP. Finally, we point directly to designs of new, better sparse approximations that combine the best of the existing strategies within attractive computational constraints.
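
To make the abstract's notion of an effective prior concrete, here is a minimal NumPy sketch of the kind of inducing-input approximation the paper unifies. The sketch is not taken from the paper; all function and variable names are illustrative assumptions. A building block shared by these methods is the low-rank surrogate covariance Q_ab = K_au K_uu^{-1} K_ub defined through a set of m inducing inputs, which stands in for the exact prior covariance K_ab and reduces the O(n^3) cost of exact GP regression to roughly O(n m^2).

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_mean(X, y, Xstar, Xu, noise=0.1):
    """SoR/DTC-style predictive mean with inducing inputs Xu (illustrative sketch).

    The effective prior covariance is Q_ab = K_au K_uu^{-1} K_ub, a rank-m
    surrogate for the exact covariance K_ab.
    """
    m = len(Xu)
    Kuu = rbf_kernel(Xu, Xu) + 1e-8 * np.eye(m)   # jitter for numerical stability
    Kuf = rbf_kernel(Xu, X)                        # m x n inducing-to-training covariance
    Ksu = rbf_kernel(Xstar, Xu)                    # test-to-inducing covariance
    # Predictive mean: K_*u (K_uf K_fu + sigma^2 K_uu)^{-1} K_uf y
    alpha = np.linalg.solve(Kuf @ Kuf.T + noise**2 * Kuu, Kuf @ y)
    return Ksu @ alpha

# Toy usage: 200 noisy observations summarized by 15 inducing inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xu = np.linspace(-3.0, 3.0, 15)[:, None]
Xstar = np.linspace(-3.0, 3.0, 5)[:, None]
print(sparse_gp_mean(X, y, Xstar, Xu))
```

The approximations the paper compares differ in how they treat the residual K_ab - Q_ab (FITC-type corrections, for instance, restore the exact diagonal of the training covariance), but they share this computational structure.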

Published In

The Journal of Machine Learning Research, Volume 6 (December 2005)
2169 pages
ISSN: 1532-4435
EISSN: 1533-7928

Publisher

JMLR.org
