Abstract
Trend estimation, i.e. estimating or smoothing a nonlinear function without any independent variables, is an important task in various applications in signal and image processing, engineering, biomedicine, the analysis of economic time series, etc. We are interested in estimating the trend in the presence of heteroscedastic errors in the model. So far, there seem to be no available studies of the performance of robust neural networks or the taut string (stretched string) algorithm under heteroscedasticity. We consider here an Aitken-type model, analogous to known models for linear regression that take heteroscedasticity into account. Numerical studies with heteroscedastic data, possibly contaminated by outliers, yield improved results if the Aitken-type model is used. The results of robust neural networks turn out to be especially favorable in our examples. On the other hand, the taut string (and especially its robust \(L_1\)-version) is prone to overfitting and suffers from heteroscedasticity.
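To make the Aitken-type idea concrete, the following R sketch (our illustration only, not the code or data used in the paper) compares an ordinary polynomial trend fit with a weighted fit whose weights are the reciprocals of the error variances; the variances are assumed known here for simplicity, and the simulated trend, the variance pattern and the polynomial degree are arbitrary choices.

```r
# Minimal sketch of the Aitken-type (weighted) idea for trend estimation
# under heteroscedasticity; the variance pattern sigma2 is assumed known.
set.seed(42)
n      <- 200
t      <- (1:n) / n                       # equidistant time points
trend  <- sin(2 * pi * t) + 0.5 * t       # true nonlinear trend
sigma2 <- (0.2 + t)^2                     # error variances growing over time
y      <- trend + rnorm(n, sd = sqrt(sigma2))

# Ordinary (unweighted) polynomial fit of the trend on time
fit_ols    <- lm(y ~ poly(t, 5))
# Aitken-type fit: weighted least squares with weights 1 / sigma2
fit_aitken <- lm(y ~ poly(t, 5), weights = 1 / sigma2)

# Compare mean squared errors of the fitted curves against the true trend
mse <- function(fit) mean((fitted(fit) - trend)^2)
c(ols = mse(fit_ols), aitken = mse(fit_aitken))
```

In practice the variances are unknown and must be estimated, e.g. from squared residuals of a preliminary fit, and the robust variants studied in the paper additionally protect the fitted trend against outliers in the data.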
Acknowledgements
The research was supported by projects GA19-05704S and GA18-23827S of the Czech Science Foundation. Jiří Tumpach and Patrik Janáček provided technical support.
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Kalina, J., Vidnerová, P., Tichavský, J. (2021). A Comparison of Trend Estimators Under Heteroscedasticity. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2021. Lecture Notes in Computer Science, vol. 12854. Springer, Cham. https://doi.org/10.1007/978-3-030-87986-0_8
Print ISBN: 978-3-030-87985-3
Online ISBN: 978-3-030-87986-0