nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒09‒19
eight papers chosen by
Yong Yin
SUNY at Buffalo

  1. Local Whittle estimation of multivariate fractionally integrated processes By Frank S. Nielsen
  2. Dynamic Factor Models with Smooth Loadings for Analyzing the Term Structure of Interest Rates By Borus Jungbacker; Siem Jan Koopman; Michel van der Wel
  3. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis By Michael Jansson; Morten Ørregaard Nielsen
  4. Efficient likelihood evaluation of state-space representations By DeJong, David N.; Dharmarajan, Hariharan; Liesenfeld, Roman; Moura, Guilherme V.; Richard, Jean-François
  5. "Alternative Asymmetric Stochastic Volatility Models" By Manabu Asai; Michael McAleer
  6. "Testing the Box-Cox Parameter in an Integrated Process" By Jian Huang; Masahito Kobayashi; Michael McAleer
  7. "Dynamic Conditional Correlations for Asymmetric Processes" By Manabu Asai; Michael McAleer
  8. Forecasting economy with Bayesian autoregressive distributed lag model: choosing optimal prior in economic downturn By Bušs, Ginters

  1. By: Frank S. Nielsen (Aarhus University and CREATES)
    Abstract: This paper derives a semiparametric estimator of multivariate fractionally integrated processes covering both stationary and non-stationary values of d. We utilize the notion of the extended discrete Fourier transform and periodogram to extend the multivariate local Whittle estimator of Shimotsu (2007) to cover non-stationary values of d. We show consistency and asymptotic normality for d between -1/2 and infinity. A simulation study illustrates the performance of the proposed estimator for relevant sample sizes. The estimator is further justified through an empirical analysis of log spot exchange rates. We find that the log spot exchange rates of Germany, United Kingdom, Japan, Canada, France, Italy, and Switzerland against the US Dollar for the period January 1974 until December 2001 are well described as I(1) processes.
    Keywords: fractional integration, local Whittle, long memory, multivariate semiparametric estimation, exchange rates.
    JEL: C14 C32
    Date: 2009–09–08
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-38&r=ets
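    The core idea of local Whittle estimation can be illustrated with a minimal univariate sketch: minimize the concentrated local Whittle objective over the first m Fourier frequencies. This is the standard stationary-range estimator, not the extended multivariate version the paper develops; the bandwidth choice and grid are illustrative assumptions.

    ```python
    import numpy as np

    def local_whittle_d(x, m):
        """Univariate local Whittle estimate of the memory parameter d.

        Minimizes R(d) = log(mean_j lam_j^(2d) I_j) - 2d * mean_j log lam_j
        over the first m Fourier frequencies (grid search for transparency;
        valid for d in (-1/2, 1/2) in this basic form).
        """
        n = len(x)
        lam = 2 * np.pi * np.arange(1, m + 1) / n            # Fourier frequencies
        w = np.fft.fft(x)[1:m + 1] / np.sqrt(2 * np.pi * n)  # DFT at lam_j
        I = np.abs(w) ** 2                                   # periodogram ordinates
        grid = np.linspace(-0.49, 0.49, 981)
        R = [np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))
             for d in grid]
        return grid[int(np.argmin(R))]

    rng = np.random.default_rng(0)
    x = rng.standard_normal(2048)      # white noise: true d = 0
    m = int(2048 ** 0.65)              # a common illustrative bandwidth choice
    d_hat = local_whittle_d(x, m)
    ```

    For white noise the estimate should lie close to zero; for fractionally integrated data it tracks the true d within the stationary range.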
  2. By: Borus Jungbacker (Department of Econometrics, VU University Amsterdam); Siem Jan Koopman (Department of Econometrics, VU University Amsterdam and Tinbergen Institute); Michel van der Wel (Tinbergen Institute + Erasmus School of Economics, ERIM Rotterdam + CREATES, Aarhus University)
    Abstract: We propose a new approach to the modelling of the term structure of interest rates. We consider the general dynamic factor model and show how to impose smoothness restrictions on the factor loadings. We further present a statistical procedure based on Wald tests that can be used to find a suitable set of such restrictions. We present these developments in the context of term structure models, but they are also applicable in other settings. We perform an empirical study using a data set of unsmoothed Fama-Bliss zero yields for US treasuries of different maturities. The general dynamic factor model with and without smooth loadings is considered in this study together with models that are associated with Nelson-Siegel and arbitrage-free frameworks. These existing models can be regarded as special cases of the dynamic factor model with restrictions on the model parameters. For all model candidates, we consider both stationary and nonstationary autoregressive processes (with different numbers of lags) for the latent factors. Finally, we perform statistical hypothesis tests to verify whether the restrictions imposed by the models are supported by the data. Our main conclusion is that smoothness restrictions can be imposed on the loadings of dynamic factor models for the term structure of US interest rates but that the restrictions implied by a number of popular term structure models are rejected.
    Keywords: Fama-Bliss data set, Kalman filter, Maximum likelihood, Yield curve
    JEL: C32 C51 E43
    Date: 2009–09–08
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-39&r=ets
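    To see how a smooth-loading special case looks in practice, here is a sketch of the Nelson-Siegel loading matrix (level, slope, curvature) with a cross-sectional least-squares factor extraction. The decay value and maturities are illustrative assumptions, not taken from the paper's data set.

    ```python
    import numpy as np

    def nelson_siegel_loadings(maturities, lam=0.0609):
        """Nelson-Siegel factor loadings (level, slope, curvature).

        lam = 0.0609 is the decay value popularized by Diebold and Li for
        maturities in months; it is only an illustrative default here.
        """
        tau = np.asarray(maturities, dtype=float)
        level = np.ones_like(tau)
        slope = (1 - np.exp(-lam * tau)) / (lam * tau)
        curvature = slope - np.exp(-lam * tau)
        return np.column_stack([level, slope, curvature])

    # Cross-sectional OLS: recover the three factors from one day's yields.
    maturities = np.array([3, 6, 12, 24, 36, 60, 120])   # months (hypothetical)
    Lambda = nelson_siegel_loadings(maturities)
    rng = np.random.default_rng(1)
    true_factors = np.array([5.0, -1.0, 0.5])
    yields = Lambda @ true_factors + 0.01 * rng.standard_normal(len(maturities))
    factors_hat, *_ = np.linalg.lstsq(Lambda, yields, rcond=None)
    ```

    In the paper's framework, such parametric loadings are one restricted case of a general factor loading matrix estimated freely and then tested against smoothness restrictions.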
  3. By: Michael Jansson (UC Berkeley and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.
    Keywords: Likelihood Ratio Test, Unit Root Hypothesis
    JEL: C12 C22
    Date: 2009–08–31
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-37&r=ets
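    A rough feel for a (quasi-)likelihood ratio unit root statistic in a Gaussian AR(1) can be conveyed with the conditional-likelihood version below. This is an illustrative sketch, not the exact statistic or critical values derived in the paper.

    ```python
    import numpy as np

    def lr_unit_root(y):
        """Quasi-LR statistic for H0: rho = 1 in y_t = rho * y_{t-1} + eps_t,
        based on the Gaussian conditional likelihood (illustrative only)."""
        dy, ylag = np.diff(y), y[:-1]
        n = len(dy)
        rho_hat = (ylag @ y[1:]) / (ylag @ ylag)           # unrestricted OLS
        s2_unres = np.mean((y[1:] - rho_hat * ylag) ** 2)
        s2_res = np.mean(dy ** 2)                          # restricted: rho = 1
        return n * (np.log(s2_res) - np.log(s2_unres))

    rng = np.random.default_rng(2)
    eps = rng.standard_normal(500)
    rw = np.cumsum(eps)              # random walk: H0 true, statistic small
    ar = np.empty(500)               # stationary AR(1), rho = 0.5: H0 false
    ar[0] = eps[0]
    for t in range(1, 500):
        ar[t] = 0.5 * ar[t - 1] + eps[t]
    ```

    Under the null the statistic has a nonstandard limiting distribution; the paper's contribution is showing that a test of this likelihood ratio form attains the Gaussian local power envelope.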
  4. By: DeJong, David N.; Dharmarajan, Hariharan; Liesenfeld, Roman; Moura, Guilherme V.; Richard, Jean-François
    Abstract: We develop a numerical procedure that facilitates efficient likelihood evaluation in applications involving non-linear and non-Gaussian state-space models. The procedure approximates necessary integrals using continuous approximations of target densities. Construction is achieved via efficient importance sampling, and approximating densities are adapted to fully incorporate current information. We illustrate our procedure in applications to dynamic stochastic general equilibrium models.
    Keywords: particle filter, adaption, efficient importance sampling, kernel density approximation, dynamic stochastic general equilibrium model
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:zbw:cauewp:200902&r=ets
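    The likelihood-evaluation problem the paper addresses can be illustrated on a model where the answer is known exactly: in a linear Gaussian local level model, a plain bootstrap particle filter (the baseline the paper improves on) approximates the log-likelihood that the Kalman filter computes exactly. All settings below are illustrative assumptions, not the paper's procedure.

    ```python
    import numpy as np

    def kalman_loglik(y, q, r, p0=10.0):
        """Exact log-likelihood of the local level model
        x_t = x_{t-1} + eta_t (var q), y_t = x_t + eps_t (var r)."""
        ll, a, p = 0.0, 0.0, p0            # moments of x_0
        for yt in y:
            p = p + q                      # predict x_t given y_{1:t-1}
            f = p + r                      # one-step prediction variance of y_t
            v = yt - a                     # prediction error
            ll += -0.5 * (np.log(2 * np.pi * f) + v * v / f)
            k = p / f                      # Kalman gain
            a, p = a + k * v, p * (1 - k)  # filtering update
        return ll

    def particle_loglik(y, q, r, n_part=5000, seed=3, p0=10.0):
        """Bootstrap particle filter estimate of the same log-likelihood."""
        rng = np.random.default_rng(seed)
        x = rng.normal(0.0, np.sqrt(p0), n_part)            # draws of x_0
        ll = 0.0
        for yt in y:
            x = x + rng.normal(0.0, np.sqrt(q), n_part)     # propagate
            w = np.exp(-0.5 * (yt - x) ** 2 / r) / np.sqrt(2 * np.pi * r)
            ll += np.log(np.mean(w))                        # likelihood increment
            x = rng.choice(x, n_part, p=w / w.sum())        # resample
        return ll

    rng = np.random.default_rng(4)
    states = np.cumsum(rng.normal(0, 1.0, 100))     # q = 1
    y = states + rng.normal(0, 1.0, 100)            # r = 1
    ll_kf = kalman_loglik(y, 1.0, 1.0)
    ll_pf = particle_loglik(y, 1.0, 1.0)
    ```

    The paper's efficient importance sampling replaces the blind propagation step with proposal densities adapted to the current observation, which sharply reduces the variance of such likelihood estimates in non-linear and non-Gaussian settings.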
  5. By: Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo)
    Abstract: The stochastic volatility model usually incorporates asymmetric effects by introducing a negative correlation between the innovations in returns and volatility. In this paper, we propose a new asymmetric stochastic volatility model, based on the leverage and size effects. The model is a generalization of the exponential GARCH (EGARCH) model of Nelson (1991). We consider categories of asymmetric effects that describe the differences among the asymmetric effect of the EGARCH model, the threshold-effects indicator function of Glosten, Jagannathan and Runkle (1993), and the negative correlation between the innovations in returns and volatility. The new model is estimated by the efficient importance sampling method of Liesenfeld and Richard (2003), and the finite sample properties of the estimator are investigated using numerical simulations. Four financial time series are used to estimate the alternative asymmetric SV models, with empirical asymmetric effects found to be statistically significant in each case. The empirical results for S&P 500 and Yen/USD returns indicate that the leverage and size effects are significant, supporting the general model. For TOPIX and USD/AUD returns, the size effect is insignificant, favoring the negative correlation between the innovations in returns and volatility. We also consider the standardized t distribution for capturing the tail behavior. The results for Yen/USD returns show that the model is correctly specified, while the results for the other three data sets suggest there is scope for improvement.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf655&r=ets
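    The EGARCH asymmetry the paper generalizes can be sketched by simulating Nelson's (1991) recursion directly; a negative gamma produces the leverage effect, with negative return shocks raising next-period volatility more than positive ones. Parameter values below are illustrative assumptions.

    ```python
    import numpy as np

    def simulate_egarch(n, omega=-0.1, beta=0.95, alpha=0.2, gamma=-0.1, seed=5):
        """Simulate an EGARCH(1,1) of the Nelson (1991) form:

            log s2_t = omega + beta * log s2_{t-1}
                       + alpha * (|z_{t-1}| - E|z|) + gamma * z_{t-1}

        gamma < 0 generates the leverage effect. Values are illustrative.
        """
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n)
        e_abs = np.sqrt(2 / np.pi)             # E|z| for a standard normal
        logs2 = np.empty(n)
        logs2[0] = omega / (1 - beta)          # start at the unconditional mean
        for t in range(1, n):
            logs2[t] = (omega + beta * logs2[t - 1]
                        + alpha * (np.abs(z[t - 1]) - e_abs) + gamma * z[t - 1])
        sigma = np.exp(0.5 * logs2)
        return sigma * z, sigma                # returns and volatilities

    r, sigma = simulate_egarch(20000)
    leverage_corr = np.corrcoef(r[:-1], np.log(sigma[1:] ** 2))[0, 1]
    ```

    The paper's SV alternative replaces this observation-driven recursion with a latent volatility process, which is why simulation-based likelihood methods such as efficient importance sampling are needed for estimation.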
  6. By: Jian Huang (Guangdong University of Finance); Masahito Kobayashi (Faculty of Economics, Yokohama National University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo)
    Abstract: This paper analyses the constant elasticity of volatility (CEV) model suggested by [6]. The CEV model without mean reversion is shown to be the inverse Box-Cox transformation of integrated processes asymptotically. It is demonstrated that the maximum likelihood estimator of the power parameter has a nonstandard asymptotic distribution, which is expressed as an integral of Brownian motions, when the data generating process is not mean reverting. However, it is shown that the t-ratio follows a standard normal distribution asymptotically, so that the use of the conventional t-test in analyzing the power parameter of the CEV model is justified even if there is no mean reversion, as is often the case in empirical research. The model may be applied to ultra-high-frequency data.
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf661&r=ets
  7. By: Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo)
    Abstract: The paper develops two Dynamic Conditional Correlation (DCC) models, namely the Wishart DCC (WDCC) model and the Matrix-Exponential Conditional Correlation (MECC) model. The paper applies the WDCC approach to the exponential GARCH (EGARCH) and GJR models to propose asymmetric DCC models. We use the standardized multivariate t-distribution to accommodate heavy-tailed errors. The paper presents an empirical example using the trivariate data of the Nikkei 225, Hang Seng and Straits Times Indices for estimating and forecasting the WDCC-EGARCH and WDCC-GJR models, and compares the performance with the asymmetric BEKK model. The empirical results show that AIC and BIC favour the WDCC-EGARCH model over the WDCC-GJR and asymmetric BEKK models. Moreover, the empirical results indicate that the WDCC-EGARCH-t model produces reasonable VaR threshold forecasts, which are very close to the nominal 1% to 3% values.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf657&r=ets
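    As background for the DCC family, here is the scalar DCC recursion of Engle (2002) on standardized residuals; the paper's Wishart and matrix-exponential variants modify this baseline recursion rather than reproduce it, and the parameters below are illustrative.

    ```python
    import numpy as np

    def dcc_correlations(eps, a=0.05, b=0.93):
        """Scalar DCC recursion on standardized residuals eps (T x k):

            Q_t = (1 - a - b) * S + a * eps_{t-1} eps_{t-1}' + b * Q_{t-1}
            R_t = diag(Q_t)^(-1/2) Q_t diag(Q_t)^(-1/2)

        S is the unconditional correlation matrix; a + b < 1 is assumed.
        """
        T, k = eps.shape
        S = np.corrcoef(eps.T)                 # correlation targeting
        Q = S.copy()
        R = np.empty((T, k, k))
        for t in range(T):
            if t > 0:
                Q = ((1 - a - b) * S
                     + a * np.outer(eps[t - 1], eps[t - 1]) + b * Q)
            d = 1 / np.sqrt(np.diag(Q))
            R[t] = Q * np.outer(d, d)          # rescale Q to a correlation matrix
        return R

    rng = np.random.default_rng(6)
    eps = rng.standard_normal((500, 3))        # hypothetical trivariate residuals
    R = dcc_correlations(eps)
    ```

    Each R_t is a valid correlation matrix by construction (unit diagonal, entries bounded by one), which is the property the alternative WDCC and MECC parameterizations also guarantee.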
  8. By: Bušs, Ginters
    Abstract: Bayesian inference requires an analyst to set priors. Setting the right prior is crucial for precise forecasts. This paper analyzes how the optimal prior changes when an economy is hit by a recession. For this task, an autoregressive distributed lag (ADL) model is chosen. The results show that a sharp economic slowdown changes the optimal prior in two directions. First, it changes the structure of the optimal weight prior, placing less weight on the lagged dependent variable relative to variables containing more recent information. Second, the greater uncertainty brought by a rapid economic downturn requires more space for coefficient variation, which is set by the overall tightness parameter. It is shown that the optimal overall tightness parameter may increase to such an extent that the Bayesian ADL becomes equivalent to the frequentist ADL.
    Keywords: Forecasting; Bayesian inference; Bayesian autoregressive distributed lag model; optimal prior; Litterman prior; business cycle; mixed estimation; grid search
    JEL: C52 C11 N14 C32 C13 C53 E17 C15 C22
    Date: 2009–09–13
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:17273&r=ets
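    The tightness mechanism in the last abstract can be sketched with the mixed-estimation form of a Gaussian-prior regression: the posterior mean shrinks toward the prior when the prior is tight and collapses to OLS as tightness is relaxed, which is the Bayesian-to-frequentist equivalence the paper notes. The ADL(1,0) design, prior values, and data below are illustrative assumptions.

    ```python
    import numpy as np

    def bayes_regression_mean(X, y, prior_mean, prior_var, sigma2=1.0):
        """Posterior mean of regression coefficients under independent
        Gaussian priors N(prior_mean, diag(prior_var)) -- the mixed
        estimation form underlying Litterman-type priors (a sketch)."""
        Vinv = np.diag(1.0 / prior_var)
        A = X.T @ X / sigma2 + Vinv
        b = X.T @ y / sigma2 + Vinv @ prior_mean
        return np.linalg.solve(A, b)

    rng = np.random.default_rng(7)
    T = 200
    x = rng.standard_normal(T)
    y = np.empty(T)
    y[0] = x[0]
    for t in range(1, T):
        y[t] = 0.6 * y[t - 1] + 0.8 * x[t] + 0.3 * rng.standard_normal()

    # ADL(1,0) design: lagged dependent variable plus current regressor.
    X = np.column_stack([y[:-1], x[1:]])
    yy = y[1:]
    prior_mean = np.array([1.0, 0.0])    # random-walk-style prior on the own lag

    tight = bayes_regression_mean(X, yy, prior_mean, np.array([1e-6, 1e-6]))
    loose = bayes_regression_mean(X, yy, prior_mean, np.array([1e6, 1e6]))
    ols, *_ = np.linalg.lstsq(X, yy, rcond=None)
    ```

    With tiny prior variances the estimate pins to the prior mean; with huge ones it matches OLS, mirroring the paper's point that a large optimal overall tightness makes the Bayesian ADL equivalent to the frequentist ADL.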

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.