Some Issues About the Generalization of Neural Networks for Time Series Prediction

  • Conference paper
Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005 (ICANN 2005)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3697)

Abstract

Several issues in the generalization of ANN training are investigated through experiments with synthetic and real-world time series. One commonly accepted view is that overfitting will not occur when the ratio of training sample size to the number of weights exceeds 30. However, we find that overfitting can still occur even when this ratio is higher than 30. In cross-validated early stopping, the ratio of cross-validation data size to training data size has no significant impact on the testing error; for stationary time series, 10% may be a practical choice. Both the Bayesian regularization method and the cross-validated early stopping method are helpful when the ratio of training sample size to the number of weights is less than 20, but the performance of early stopping is highly variable. The Bayesian method outperforms early stopping in most cases, and in some cases even outperforms no-stop training when the training data set is large.
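The cross-validated early-stopping procedure discussed in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: a linear autoregressive predictor is trained by gradient descent on a synthetic stationary AR(2) series, the last 10% of the training data is held out as the cross-validation set (the practical choice suggested above), and training stops once the validation error stops improving. All function names, learning rates, and patience settings here are illustrative assumptions.

```python
import numpy as np

def make_lagged(series, p):
    """Build (X, y) pairs where each row of X holds the p previous values."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    return X, y

def fit_early_stopping(X, y, val_frac=0.1, lr=1.0, patience=20, max_epochs=5000):
    """Gradient descent on MSE, stopping when the held-out error stops improving."""
    n_val = max(1, int(len(y) * val_frac))
    # Chronological split: for time series, the validation block is the tail.
    X_tr, y_tr = X[:-n_val], y[:-n_val]
    X_va, y_va = X[-n_val:], y[-n_val:]
    w = np.zeros(X.shape[1])
    best_w, best_err, wait = w.copy(), np.inf, 0
    for _ in range(max_epochs):
        grad = X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
        w -= lr * grad
        val_err = np.mean((X_va @ w - y_va) ** 2)
        if val_err < best_err:
            best_w, best_err, wait = w.copy(), val_err, 0
        else:
            wait += 1
            if wait >= patience:  # validation error has stopped improving
                break
    return best_w, best_err

rng = np.random.default_rng(0)
# Stationary AR(2) series: x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + noise.
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + 0.1 * rng.standard_normal()

X, y = make_lagged(x, 2)
# With 2 weights and ~450 training pairs, the sample-to-weights ratio is
# far above 30 -- the regime where overfitting is conventionally ruled out.
w, err = fit_early_stopping(X, y)
```

Keeping and restoring the best weights seen so far, rather than the weights at the stopping epoch, is one common way to guard against the run-to-run variability of early stopping that the abstract reports.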

An erratum to this chapter can be found at http://dx.doi.org/10.1007/11550907_163 .




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, W., Van Gelder, P.H.A.J.M., Vrijling, J.K. (2005). Some Issues About the Generalization of Neural Networks for Time Series Prediction. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds) Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005. ICANN 2005. Lecture Notes in Computer Science, vol 3697. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550907_88

  • DOI: https://doi.org/10.1007/11550907_88

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28755-1

  • Online ISBN: 978-3-540-28756-8

  • eBook Packages: Computer Science, Computer Science (R0)
