
Standard Error Estimation in Neural Network Regression Models: the AR-Sieve Bootstrap Approach

  • Conference paper
Neural Nets WIRN Vietri-01

Part of the book series: Perspectives in Neural Computing (PERSPECT.NEURAL)


Abstract

In this paper we investigate the use of the AR-Sieve bootstrap method to estimate the standard error of the sampling distribution of neural network predicted values in a regression model with dependent errors. The performance of the proposed approach is evaluated in a Monte Carlo experiment, where it is also compared with the classical residual bootstrap scheme.
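
Operationally, the AR-Sieve scheme amounts to fitting the network, approximating the residual process by an autoregression, and regenerating pseudo-samples by resampling the AR innovations. The sketch below is a minimal illustration of that idea in Python; the network architecture, the AR order p, the number of replicates B, and the simulated AR(1) error data are illustrative assumptions, not the authors' Monte Carlo design.

```python
# Minimal sketch of an AR-sieve bootstrap for the standard error of
# neural network predictions under dependent errors. Everything here
# (network size, AR order p, replicates B, simulated data) is an
# illustrative assumption, not the paper's exact experimental design.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Simulated regression data with AR(1) errors, for illustration only.
n = 300
x = np.linspace(0.0, 1.0, n).reshape(-1, 1)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal(scale=0.1)
y = np.sin(2.0 * np.pi * x.ravel()) + eps


def fit_net(x, y):
    net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
    net.fit(x, y)
    return net


# Step 1: fit the network and take the residuals.
net = fit_net(x, y)
fitted = net.predict(x)
resid = y - fitted

# Step 2: approximate the residual process by an AR(p) model (the "sieve")
# via least squares, and keep the centred innovations.
p = 2
Z = np.column_stack([resid[p - k - 1:n - k - 1] for k in range(p)])
phi, *_ = np.linalg.lstsq(Z, resid[p:], rcond=None)
innov = resid[p:] - Z @ phi
innov -= innov.mean()

# Step 3: rebuild bootstrap error series from resampled innovations,
# form pseudo-responses, refit the network, and store its predictions.
B = 50
boot_preds = np.empty((B, n))
for b in range(B):
    e_star = rng.choice(innov, size=n, replace=True)
    eps_star = np.zeros(n)
    for t in range(p, n):
        eps_star[t] = phi @ eps_star[t - p:t][::-1] + e_star[t]
    y_star = fitted + eps_star
    boot_preds[b] = fit_net(x, y_star).predict(x)

# Step 4: the pointwise standard deviation across replicates is the
# bootstrap estimate of the standard error of the predicted values.
se_hat = boot_preds.std(axis=0)
print(se_hat[:5])
```

In this reading, the classical residual bootstrap used as the benchmark in the paper's comparison roughly corresponds to resampling the residuals directly in step 3, instead of regenerating them through the fitted autoregression, so it ignores the dependence structure of the errors.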





Copyright information

© 2002 Springer-Verlag London Limited

About this paper

Cite this paper

Giordano, F., La Rocca, M., Perna, C. (2002). Standard Error Estimation in Neural Network Regression Models: the AR-Sieve Bootstrap Approach. In: Tagliaferri, R., Marinaro, M. (eds) Neural Nets WIRN Vietri-01. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0219-9_21

  • DOI: https://doi.org/10.1007/978-1-4471-0219-9_21

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-85233-505-2

  • Online ISBN: 978-1-4471-0219-9

  • eBook Packages: Springer Book Archive
