
Model guided adaptive design and analysis in computer experiment

Published: 01 October 2012

Abstract

Computer experiments have become increasingly important in several industries. These experiments save resources by exploring different designs without manufacturing real hardware. However, computer experiments usually require lengthy simulation times and powerful computational capacity, so it is often pragmatically impossible to run experiments on a complete design space. In this paper, we propose an adaptive sampling scheme that works interactively with predictive models to sequentially select design points for computer experiments. The selected samples are used to build predictive models, which in turn guide further sampling and predict the entire design space. For illustration, we use Bayesian additive regression trees (BART), multiple additive regression trees (MART), the treed Gaussian process, and the Gaussian process to guide the proposed sampling method. Both real data and simulation studies show that our sampling method is effective in that (i) it can be used with different predictive models; (ii) it can select multiple design points without repeatedly refitting the predictive models, which makes parallel simulations possible; and (iii) the predictive model built on its generated samples gives more accurate predictions on the unsampled points than models built on samples from other methods such as random sampling, space-filling designs, and some adaptive sampling methods. © 2012 Wiley Periodicals, Inc.
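The loop the abstract describes — fit a surrogate on the sampled points, use it to pick a batch of new design points, simulate those in parallel, refit — can be sketched as follows. This is a minimal illustration, not the paper's method: it uses a plain Gaussian process surrogate with a squared-exponential kernel and a maximum-predictive-uncertainty rule as the selection criterion, since the abstract does not specify the paper's acquisition rule; the toy function `expensive_sim` stands in for a long-running simulation.

```python
import numpy as np

def expensive_sim(x):
    """Placeholder for a long-running computer experiment."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0]

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between 1-D inputs (illustrative choice)."""
    d = a[:, None, 0] - b[None, :, 0]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean and std of a zero-mean GP at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)  # k(x,x)=1 here
    return mean, np.sqrt(np.maximum(var, 0.0))

rng = np.random.default_rng(0)
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)  # full design space
idx = list(rng.choice(len(candidates), size=5, replace=False))  # initial design

for _ in range(4):  # sequential design rounds
    X = candidates[idx]
    y = expensive_sim(X)
    _, std = gp_posterior(X, y, candidates)
    std[idx] = -np.inf  # never re-select already-sampled points
    # Batch selection: take several high-uncertainty points per refit,
    # so their simulations can run in parallel without refitting in between.
    idx.extend(np.argsort(std)[-3:].tolist())

# Final surrogate predicts the entire (unsampled) design space.
X = candidates[idx]
mean, _ = gp_posterior(X, expensive_sim(X), candidates)
print(len(idx), mean.shape)
```

Selecting a whole batch per refit mirrors point (ii) of the abstract: the surrogate is fit once, several points are chosen from its uncertainty surface, and the corresponding simulations can be dispatched concurrently.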



Published In

Statistical Analysis and Data Mining, Volume 5, Issue 5
October 2012, 114 pages
ISSN: 1932-1864
EISSN: 1932-1872

Publisher

John Wiley & Sons, Inc., United States


Author Tags

  1. Bayesian additive regression trees
  2. Gaussian process
  3. adaptive design
  4. computer experiment
  5. sequential sampling
