Abstract
The problem of estimating a time-varying regression inevitably involves choosing an appropriate level of model volatility, ranging from full stationarity of the instant regression models to their complete independence of one another. In the stationary case the number of regression coefficients, which constitute the model parameter to be estimated, equals the number of regressors, whereas the absence of any smoothness assumptions multiplies the dimension of the unknown vector by the length of the time series. We consider here a class of data models matching the specificity of time-varying regression, in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions governed by a number of hyperparameters. The aim of this paper is threefold. First, in accordance with the specificity of time-varying regression, we modify three commonly adopted methods of estimating hyperparameters in data models, namely Leave-One-Out Cross-Validation, Evidence Maximization and Hypothetical Cross-Validation. Second, we experimentally compare these methods on both simulated and real-world data. Third, on the basis of the proposed technique we develop a new approach to the problem of detecting the hidden dynamics of an investment portfolio with respect to certain market or economic factors.
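To make the setting concrete, the sketch below is a minimal illustrative example, not the authors' algorithm: it fits a time-varying regression in the spirit of flexible least squares, where the coefficient vectors are penalized for changing between adjacent time points by a single smoothness hyperparameter lam, and lam is selected by leave-one-out cross-validation in which the data term of one observation is removed while its coefficient vector is still interpolated through the smoothness prior. All function names and the synthetic data are hypothetical.

# Minimal illustrative sketch (not the paper's algorithm): time-varying regression
# y_t = x_t' beta_t + noise, with a quadratic smoothness penalty on successive
# coefficient vectors and the single hyperparameter lam chosen by leave-one-out CV.
import numpy as np


def fit_tv_regression(X, y, lam):
    """Minimize sum_t (y_t - x_t' b_t)^2 + lam * sum_{t>0} ||b_t - b_{t-1}||^2."""
    T, p = X.shape
    A = np.zeros((T * p, T * p))
    r = np.zeros(T * p)
    for t in range(T):
        cur = slice(t * p, (t + 1) * p)
        A[cur, cur] += np.outer(X[t], X[t])      # data term for time t
        r[cur] += y[t] * X[t]
        if t > 0:                                # smoothness term coupling t-1 and t
            prev = slice((t - 1) * p, t * p)
            A[cur, cur] += lam * np.eye(p)
            A[prev, prev] += lam * np.eye(p)
            A[cur, prev] -= lam * np.eye(p)
            A[prev, cur] -= lam * np.eye(p)
    return np.linalg.solve(A, r).reshape(T, p)   # lam > 0 is assumed for a unique solution


def loo_score(X, y, lam):
    """Leave-one-out CV: drop the data term of observation t, keep the smoothness
    prior so beta_t is interpolated from its neighbours, then predict y_t."""
    T, _ = X.shape
    err = 0.0
    for t in range(T):
        Xt, yt = X.copy(), y.copy()
        Xt[t], yt[t] = 0.0, 0.0
        beta = fit_tv_regression(Xt, yt, lam)
        err += (y[t] - X[t] @ beta[t]) ** 2
    return err / T


if __name__ == "__main__":
    # Hypothetical synthetic data: two slowly drifting coefficients.
    rng = np.random.default_rng(0)
    T, p = 60, 2
    X = rng.normal(size=(T, p))
    true_beta = np.cumsum(0.1 * rng.normal(size=(T, p)), axis=0)
    y = (X * true_beta).sum(axis=1) + 0.1 * rng.normal(size=T)

    grid = [0.1, 1.0, 10.0, 100.0]
    best_lam = min(grid, key=lambda lam: loo_score(X, y, lam))
    print("selected lambda:", best_lam)

In this simplified formulation a large lam pushes the solution towards the stationary regression, while a small lam lets the instant models move almost independently, which mirrors the range of model volatility discussed in the abstract.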
Acknowledgement
We gratefully acknowledge support from the Russian Foundation for Basic Research, grants 17-07-00436 and 17-07-00993.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Krasotkina, O., Mottl, V., Markov, M., Chernousova, E., Malakhov, D. (2017). Methods of Hyperparameter Estimation in Time-Varying Regression Models with Application to Dynamic Style Analysis of Investment Portfolios. In: Perner, P. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2017. Lecture Notes in Computer Science, vol 10358. Springer, Cham. https://doi.org/10.1007/978-3-319-62416-7_31