nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒12‒13
thirteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. One-Sided Representations of Generalized Dynamic Factor Models By Mario Forni; Marc Hallin; Marco Lippi; Paolo Zaffaroni
  2. Some critical remarks on Zhang's gamma test for independence By Klein, Ingo; Tinkl, Fabian
  3. Comparison of Bayesian Model Selection Criteria and Conditional Kolmogorov Test as Applied to Spot Asset Pricing Models By Xiangjin Shen; Hiroki Tsurumi
  4. Methods for Computing Marginal Data Densities from the Gibbs Output By Cristina Fuentes-Albero; Leonardo Melosi
  5. Seeing Inside the Black Box: Using Diffusion Index Methodology to Construct Factor Proxies in Largescale Macroeconomic Time Series Environments By Norman R. Swanson; Nii Ayi Armah
  6. Predictive Inference for Integrated Volatility By Norman R. Swanson; Valentina Corradi; Walter Distaso
  7. In- and Out-of-Sample Specification Analysis of Spot Rate Models: Further Evidence for the Period 1982-2008 By Norman R. Swanson; Lili Cai
  8. Predictive Density Construction and Accuracy Testing with Multiple Possibly Misspecified Diffusion Models By Norman R. Swanson; Valentina Corradi
  9. Asymptotic theory for iterated one-step Huber-skip estimators By Søren Johansen; Bent Nielsen
  10. Spectral estimation of covolatility from noisy observations using local weights By Markus Bibinger; Markus Reiß
  11. Approximated maximum likelihood estimation in multifractal random walks By Ola Løvsletten; Martin Rypdal
  12. Markov Chains application to the financial-economic time series prediction By Vladimir Soloviev; Vladimir Saptsin; Dmitry Chabanenko
  13. Common persistence in conditional variance: A reconsideration By Chang-Shuai Li

  1. By: Mario Forni (Università di Modena e Reggio Emilia, CEPR and RECent); Marc Hallin (ECARES, Université Libre de Bruxelles and ORFE, Princeton University); Marco Lippi (Università di Roma "La Sapienza" and EIEF); Paolo Zaffaroni (Imperial College London and Università di Roma "La Sapienza")
    Abstract: Factor model methods recently have become extremely popular in the theory and practice of large panels of time series data. Those methods rely on various factor models which all are particular cases of the Generalized Dynamic Factor Model (GDFM) introduced in Forni, Hallin, Lippi and Reichlin (2000). In that paper, however, estimation relies on Brillinger’s concept of dynamic principal components, which produces filters that are in general two-sided and therefore yield poor performance at the end of the observation period and can hardly be used for forecasting purposes. In the present paper, we remedy this problem, and show how, based on recent results on singular stationary processes with rational spectra, one-sided estimators are possible for the parameters and the common shocks in the GDFM. Consistency is obtained, along with rates. An empirical section, based on US macroeconomic time series, compares estimates based on our model with those based on the usual static-representation restriction, and provides convincing evidence that the assumptions underlying the latter are not supported by the data.
    Keywords: generalized dynamic factor models, vector processes with singular spectral density, one-sided representations for dynamic factor models, consistency and rates for estimators of dynamic factor models
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:sas:wpaper:20115&r=ets
  2. By: Klein, Ingo; Tinkl, Fabian
    Abstract: Zhang (2008) defines the quotient correlation coefficient to test for dependence and tail dependence of bivariate random samples. He shows that the test statistics are asymptotically gamma distributed, and therefore calls the corresponding test the gamma test. We investigate the speed of convergence in a simulation study. Zhang discusses a rank-based version of this gamma test that depends on random numbers drawn from a standard Frechet distribution. We propose an alternative that does not depend on random numbers. We compare the size and the power of this alternative with the well-known t-test, the van der Waerden test and the Spearman rank test. Zhang also proposes his gamma test for situations where the dependence is neither strictly increasing nor strictly decreasing. In contrast, we show that the quotient correlation coefficient can only measure monotone patterns of dependence.
    Keywords: test on dependence, rank correlation test, Spearman's rho, copula, Lehmann ordering
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:faucse:872010&r=ets
  3. By: Xiangjin Shen (Rutgers University); Hiroki Tsurumi (Rutgers University)
    Abstract: We compare Bayesian and sample theory model specification criteria. For the Bayesian criteria we use the deviance information criterion (DIC) and the cumulative density of the mean squared errors of forecast. For the sample theory criterion we use the conditional Kolmogorov test (CKT). We use Markov chain Monte Carlo methods to obtain the Bayesian criteria and bootstrap sampling to obtain the conditional Kolmogorov test. The two non-nested models we consider are the CIR and Vasicek models for spot asset prices. Monte Carlo experiments show that the DIC performs better than the cumulative density of the mean squared errors of forecast and the CKT. According to the DIC and the mean squared errors of forecast, the CIR model explains the daily data on the uncollateralized Japanese call rate from January 1 1990 to April 18 1996; but according to the CKT, neither the CIR nor the Vasicek model explains the daily data.
    Keywords: Deviance information criterion, Markov chain Monte Carlo algorithms, Block bootstrap, Conditional Kolmogorov test, CIR and Vasicek models
    JEL: C1 C5 G0
    Date: 2011–06–07
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201126&r=ets
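
The deviance information criterion compared in this paper has a standard definition (DIC = posterior mean deviance plus the effective number of parameters) that is easy to compute from posterior draws. The following toy sketch, on an assumed conjugate normal model rather than the CIR/Vasicek setting of the paper, shows the mechanics; all variable names are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data and posterior draws for mu in the model y_i ~ N(mu, 1), prior mu ~ N(0, 1).
y = rng.normal(1.0, 1.0, size=40)
n = len(y)
v = 1.0 / (n + 1.0)                          # conjugate posterior variance
draws = rng.normal(v * y.sum(), np.sqrt(v), size=10_000)

def deviance(mu):
    """-2 log likelihood of the N(mu, 1) model, vectorized over posterior draws."""
    return n * np.log(2 * np.pi) + ((y[None, :] - np.atleast_1d(mu)[:, None]) ** 2).sum(axis=1)

d_bar = deviance(draws).mean()               # posterior mean deviance
d_hat = deviance(np.array([draws.mean()]))[0]  # deviance at the posterior mean
p_d = d_bar - d_hat                          # effective number of parameters
dic = d_bar + p_d                            # smaller DIC = preferred model
```

With a single free parameter, `p_d` should come out close to one, which is a quick sanity check on any DIC implementation.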
  4. By: Cristina Fuentes-Albero (Rutgers, The State University of New Jersey); Leonardo Melosi (London Business School)
    Abstract: We introduce two new methods for estimating the Marginal Data Density (MDD) from the Gibbs output, which are based on exploiting the analytical tractability condition. Such a condition requires that some parameter blocks can be analytically integrated out from the conditional posterior densities. Our estimators are applicable to densely parameterized time series models such as VARs or DFMs. An empirical application to six-variate VAR models shows that the bias of a fully computational estimator is sufficiently large to distort the implied model rankings. One estimator is fast enough to make multiple computations of MDDs in densely parameterized models feasible.
    Keywords: Marginal likelihood, Gibbs sampler, time series econometrics, Bayesian econometrics, reciprocal importance sampling
    JEL: C11 C15 C16
    Date: 2011–10–17
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201131&r=ets
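
The reciprocal importance sampling named in the keywords can be sketched on a toy conjugate model where the marginal data density is known in closed form. This Gelfand–Dey-style estimator is only a generic illustration, not the paper's tractability-exploiting estimators, and the model and names below are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(mu, 1) with prior mu ~ N(0, 1).  Conjugacy gives the exact
# posterior and the exact marginal data density, so the estimator can be checked.
y = rng.normal(0.5, 1.0, size=50)
n = len(y)

v_post = 1.0 / (n + 1.0)                      # posterior variance of mu
m_post = v_post * y.sum()                     # posterior mean of mu
draws = rng.normal(m_post, np.sqrt(v_post), size=20_000)  # stand-in "Gibbs output"

def log_lik(mu):
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * ((y[None, :] - mu[:, None]) ** 2).sum(axis=1)

def log_prior(mu):
    return -0.5 * np.log(2 * np.pi) - 0.5 * mu ** 2

def log_f(mu):
    # Weighting density with lighter tails than the posterior (half its variance),
    # which keeps the importance weights bounded.
    v = v_post / 2.0
    return -0.5 * np.log(2 * np.pi * v) - 0.5 * (mu - m_post) ** 2 / v

# Reciprocal importance sampling: 1/p(y) = E_post[ f(mu) / (L(mu) * pi(mu)) ].
log_w = log_f(draws) - log_lik(draws) - log_prior(draws)
log_mdd = -(np.logaddexp.reduce(log_w) - np.log(len(draws)))

# Exact log marginal density for comparison: y ~ N(0, I + 11') under this model.
Sigma = np.eye(n) + np.ones((n, n))
_, logdet = np.linalg.slogdet(Sigma)
log_mdd_exact = -0.5 * n * np.log(2 * np.pi) - 0.5 * logdet - 0.5 * y @ np.linalg.solve(Sigma, y)
```

The light-tailed weighting density is the point of the exercise: with `f` equal to the prior, this collapses to the notoriously unstable harmonic-mean estimator.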
  5. By: Norman R. Swanson (Rutgers University); Nii Ayi Armah (Bank of Canada)
    Abstract: In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, we begin by surveying the extant literature on diffusion indexes. We then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). Our approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting our proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of our main empirical findings is that our “smoothed” approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models which are generally difficult to beat in forecasting competitions. In some sense, by using our approach to predictive factor proxy selection, one is able to open up the “black box” often associated with factor analysis, and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. Our findings suggest that important observable variables include: various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates.
    Keywords: diffusion index, factor, forecast, macroeconometrics, parameter estimation error, proxy
    JEL: C22
    Date: 2011–05–14
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201105&r=ets
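
The core of factor-proxy selection — estimate a factor from a large panel, then find the observed series that best tracks it — can be caricatured in a few lines. This is not the Bai and Ng (2006a,b) procedure used in the paper, just an assumed correlation-based stand-in on simulated data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated panel: one latent factor drives N observed series.
T, N = 200, 30
f = rng.normal(size=T)            # latent factor (unobserved in practice)
lam = rng.normal(size=N)          # factor loadings
lam[7] = 8.0                      # series 7 loads very heavily on the factor
X = np.outer(f, lam) + rng.normal(size=(T, N))

# Estimate the factor as the first principal component of the standardized panel.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
f_hat = Z @ Vt[0]

# Naive proxy selection: pick the observed series most correlated with the
# estimated factor.
corr = np.array([abs(np.corrcoef(f_hat, X[:, j])[0, 1]) for j in range(N)])
proxy = int(corr.argmax())
```

The selected series then replaces the estimated factor in downstream prediction models, which is what makes the "black box" interpretable.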
  6. By: Norman R. Swanson (Rutgers University); Valentina Corradi (University of Warwick); Walter Distaso (Queen Mary)
    Abstract: In recent years, numerous volatility-based derivative products have been engineered. This has led to interest in constructing conditional predictive densities and confidence intervals for integrated volatility. In this paper, we propose nonparametric kernel estimators of the aforementioned quantities. The kernel functions used in our analysis are based on different realized volatility measures, which are constructed using the ex post variation of asset prices. A set of sufficient conditions under which the estimators are asymptotically equivalent to their unfeasible counterparts, based on the unobservable volatility process, is provided. Asymptotic normality is also established. The efficacy of the estimators is examined via Monte Carlo experimentation, and an empirical illustration based upon data from the New York Stock Exchange is provided.
    Keywords: Diffusions, integrated volatility, realized volatility measures, kernels, microstructure noise
    JEL: C22 C53 C14
    Date: 2011–05–14
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201108&r=ets
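
A flavor of kernel-based predictive density construction for volatility can be given with a plain Nadaraya–Watson conditional density estimator applied to a simulated realized-volatility series. The bandwidths and the log-AR(1) data-generating process below are arbitrary choices made for the sketch, not the paper's estimators or asymptotics:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated "realized volatility": an AR(1) in logs, so the series is
# persistent and positive, loosely mimicking daily realized variance.
T = 2000
lrv = np.zeros(T)
for t in range(1, T):
    lrv[t] = 0.9 * lrv[t - 1] + rng.normal(scale=0.3)
rv = np.exp(lrv)

def gauss(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def cond_density(y, x, h1=0.15, h2=0.15):
    """Nadaraya-Watson estimate of the predictive density p(RV_{t+1} = y | RV_t = x)."""
    wx = gauss((x - rv[:-1]) / h1)    # kernel weights on the conditioning value
    wy = gauss((y - rv[1:]) / h2)     # kernel in the outcome direction
    return (wx * wy).sum() / (h2 * wx.sum())

# Sanity check: the estimated conditional density integrates to about one.
grid = np.linspace(0.01, 6.0, 600)
vals = np.array([cond_density(g, 1.0) for g in grid])
mass = vals.sum() * (grid[1] - grid[0])
```

Confidence intervals for integrated volatility would then be read off the quantiles of such an estimated density; the paper's contribution is the asymptotic justification when feasible realized measures replace the unobservable volatility.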
  7. By: Norman R. Swanson (Rutgers University); Lili Cai (Shanghai Jiao Tong University)
    Abstract: We review and construct consistent in-sample specification and out-of-sample model selection tests on conditional distributions and predictive densities associated with continuous multifactor (possibly with jumps) and (non)linear discrete models of the short term interest rate. The results of our empirical analysis are used to carry out a “horserace” comparing discrete and continuous models across multiple sample periods, forecast horizons, and evaluation intervals. Our evaluation involves comparing models during two distinct historical periods, as well as across our entire weekly sample of Eurodollar deposit rates from 1982-2008. Interestingly, when our entire sample of data is used to estimate competing models, the “best” performer in terms of distributional “fit” as well as predictive density accuracy, both in-sample and out-of-sample, is the three factor Chen (CHEN: 1996) model examined by Andersen, Benzoni and Lund (2004). Just as interestingly, a logistic type discrete smooth transition autoregression (STAR) model is preferred to the “best” continuous model (i.e. the one factor Cox, Ingersoll, and Ross (CIR: 1985) model) when comparing predictive accuracy for the “Stable 1990s” period that we examine. Moreover, an analogous result holds for the “Post 1990s” period that we examine, where the STAR model is preferred to a two factor stochastic mean model. Thus, when the STAR model is parameterized using only data corresponding to a particular sub-sample, it outperforms the “best” continuous alternative during that period. However, when models are estimated using the entire dataset, the continuous CHEN model is preferred, regardless of the variety of model specification (selection) test that is carried out. Given that it is very difficult to ascertain the particular future regime that will ensue when constructing ex ante predictions, the CHEN model is our overall “winning” model, regardless of sample period.
    Keywords: interest rate, multi-factor diffusion process, specification test, out-of-sample forecasts, block bootstrap
    JEL: C1 C5 G0
    Date: 2011–05–13
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201102&r=ets
  8. By: Norman R. Swanson (Rutgers University); Valentina Corradi (University of Warwick)
    Abstract: This paper develops tests for comparing the accuracy of predictive densities derived from (possibly misspecified) diffusion models. In particular, we first outline a simple simulation-based framework for constructing predictive densities for one-factor and stochastic volatility models. Then, we construct accuracy assessment tests that are in the spirit of Diebold and Mariano (1995) and White (2000). In order to establish the asymptotic properties of our tests, we also develop a recursive variant of the nonparametric simulated maximum likelihood estimator of Fermanian and Salanié (2004). In an empirical illustration, the predictive densities from several models of the one-month federal funds rate are compared.
    Keywords: block bootstrap, diffusion processes, jumps, nonparametric simulated quasi maximum likelihood, parameter estimation error
    JEL: C22 C51
    Date: 2011–05–15
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201112&r=ets
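
The Diebold and Mariano (1995) idea that the paper's tests build on is simple to sketch: compute the loss differential between two competing forecasts and test whether its mean is zero. The simulation below is an assumed toy with serially uncorrelated losses (so no long-run variance correction is needed), not the paper's density-accuracy tests:

```python
import numpy as np

rng = np.random.default_rng(6)

# Forecast errors from two competing models of the same series; model A has
# genuinely smaller errors than model B in this simulation.
T = 500
e_a = rng.normal(scale=1.0, size=T)
e_b = rng.normal(scale=1.4, size=T)

# Squared-error loss differential: negative on average when A is better.
d = e_a ** 2 - e_b ** 2

# DM statistic: mean differential over its standard error.  With serially
# correlated losses, a HAC long-run variance would replace the plain variance.
dm = d.mean() / (d.std(ddof=1) / np.sqrt(T))
```

Under the null of equal accuracy the statistic is approximately standard normal, so a strongly negative value rejects in favor of model A.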
  9. By: Søren Johansen (Department of Economics, University of Copenhagen and CREATES, School of Economics and Management, Aarhus University); Bent Nielsen (Department of Economics, University of Oxford)
    Abstract: Iterated one-step Huber-skip M-estimators are considered for regression problems. Each one-step estimator is a reweighted least squares estimators with zero/one weights determined by the initial estimator and the data. The asymptotic theory is given for iteration of such estimators using a tightness argument. The results apply to stationary as well as non-stationary regression problems.
    Keywords: Huber-skip; iteration; one-step M-estimators; unit roots
    JEL: C32
    Date: 2011–11–16
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:1129&r=ets
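
The iterated one-step Huber-skip scheme described in the abstract — reweighted least squares with zero/one weights determined by the current residuals — is easy to sketch. The cutoff constant and the MAD scale estimate below are conventional choices assumed for the illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Regression y = 1 + 2x + noise, with gross outliers added to 5% of responses.
n = 300
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
y[:15] += 10.0
X = np.column_stack([np.ones(n), x])

def huber_skip(X, y, c=2.5, iters=5):
    """Iterated one-step Huber-skip: refit OLS on the observations whose
    current absolute residuals are within c * sigma_hat (zero/one weights)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]    # full-sample OLS start
    for _ in range(iters):
        r = y - X @ beta
        sigma = np.median(np.abs(r)) / 0.6745      # MAD-based robust scale
        keep = np.abs(r) <= c * sigma              # the zero/one weights
        beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
    return beta

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_rob = huber_skip(X, y)
```

In this contaminated sample, plain OLS drags the intercept toward the outliers while the skipped estimator recovers the true coefficients; the paper's contribution is the asymptotic theory for iterating such one-step estimators, including non-stationary regressors.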
  10. By: Markus Bibinger; Markus Reiß
    Abstract: We propose localized spectral estimators for the quadratic covariation and the spot covolatility of diffusion processes which are observed discretely with additive observation noise. The suitability of this approach for estimating time-varying volatilities stems from an asymptotic equivalence of the underlying statistical model to a white noise model whose correlation and volatility processes are constant over small intervals. The asymptotic equivalence of the continuous-time and discrete-time experiments is proved by a construction with linear interpolation in one direction and local means in the other. The new estimator outperforms earlier nonparametric approaches in the considered model. We investigate its finite-sample characteristics in simulations and draw a comparison between the various proposed methods.
    Keywords: asymptotic equivalence, covariation, integrated covolatility, microstructure noise, spectral adaptive estimation
    JEL: C14 C32 C58 G10
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2011-085&r=ets
  11. By: Ola Løvsletten; Martin Rypdal
    Abstract: We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R computer language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1112.0105&r=ets
  12. By: Vladimir Soloviev; Vladimir Saptsin; Dmitry Chabanenko
    Abstract: In this research, the technology of complex Markov chains is applied to predict financial time series. The main distinction between complex, or high-order, Markov chains and simple first-order ones is the presence of aftereffect, or memory. The technology proposes prediction with a hierarchy of time discretization intervals and a splicing procedure that combines the prediction results at the different frequency levels into a single output time series. The hierarchy of time discretizations makes it possible to use fractal properties of the given time series to make predictions at the different frequencies of the series. Prediction results for the world's stock market indices are presented.
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1111.5254&r=ets
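
The role of memory in a high-order chain can be illustrated on a binary toy series whose next move depends on the last two moves, so an order-2 chain captures structure a first-order chain cannot. The binary discretization and the 0.8/0.2 dynamics are invented for the sketch and do not reproduce the paper's hierarchy of time discretizations or splicing procedure:

```python
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(4)

# Toy series with second-order memory: after two equal moves the series tends
# to go up, after two unequal moves it tends to go down.
T = 5000
s = [0, 1]
for _ in range(T):
    p_up = 0.8 if s[-1] == s[-2] else 0.2
    s.append(int(rng.random() < p_up))
s = np.array(s)

def fit_markov(seq, order):
    """Empirical transition counts of an order-k chain over {0, 1}."""
    counts = defaultdict(Counter)
    for t in range(order, len(seq)):
        counts[tuple(seq[t - order:t])][seq[t]] += 1
    return counts

def predict(counts, history):
    c = counts[tuple(history)]
    return max(c, key=c.get)          # most likely next state given the history

m2 = fit_markov(s[:4000], order=2)

# One-step-ahead hit rate on held-out data.
hits = sum(predict(m2, s[t - 2:t]) == s[t] for t in range(4002, len(s)))
acc = hits / (len(s) - 4002)
```

By construction the best achievable hit rate here is 0.8, and the order-2 chain gets close to it, while a first-order chain on this series would be no better than a coin flip.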
  13. By: Chang-Shuai Li
    Abstract: This paper demonstrates flaws in the co-persistence theory proposed by Bollerslev and Engle (1993) which make the theory hard to apply. By introducing the half-life of the decay coefficient as a measure of persistence, together with weak definitions of persistence and co-persistence in variance, this study attempts to solve these problems, using an exhaustive search algorithm to obtain the co-persistence vector. The method is illustrated by studying the co-persistence of stock return volatility in 10 European countries.
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1112.1363&r=ets
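
The half-life measure of persistence mentioned in the abstract is presumably the usual one for GARCH-type processes: the horizon at which a shock's impact on the conditional variance has decayed by half. A minimal sketch, assuming the persistence parameter rho is alpha + beta of a GARCH(1,1):

```python
import math

def half_life(rho):
    """Horizon h solving rho**h = 0.5, i.e. the number of periods until a
    variance shock's impact has decayed to half (requires 0 < rho < 1)."""
    return math.log(0.5) / math.log(rho)

# A persistence of 0.98 implies a half-life of about 34 periods,
# while 0.90 implies roughly 6.6 periods.
```

The appeal of the half-life scale is that it turns near-unit persistence parameters, which are hard to compare directly, into interpretable decay horizons.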

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.