Unit Roots in Time Series with Changepoints


  •  Ed Herranz    
  •  James Gentle    
  •  George Wang    

Abstract

Many financial time series are nonstationary and are modeled as ARIMA processes; they are integrated processes (I(n)) that can be made stationary (I(0)) by differencing n times. I(1) processes have a unit root in the autoregressive polynomial. Using OLS with unit root processes often leads to spurious results; a cointegration analysis should be used instead. Unit root tests (URTs) decrease spurious cointegration. The Augmented Dickey-Fuller (ADF) URT fails to reject a false null hypothesis of a unit root in the presence of structural changes in intercept and/or linear trend. The Zivot and Andrews (ZA) (1992) URT was designed for unknown breaks, but does not allow breaks under the null hypothesis. Lee and Strazicich (2003) argued that the ZA URT is biased toward stationarity in the presence of breaks and proposed a new URT that allows breaks under the null. When an ARMA(p,q) process with trend and/or drift is to be tested for unit roots and has changepoints in trend and/or intercept, two approaches can be taken. The first is to use a unit root test that is robust to changepoints; in this paper we consider two such URTs, the Lee-Strazicich URT and the Hybrid Bai-Perron ZA URT (Herranz, 2016). The second approach is to remove the deterministic components with changepoints using the Bai-Perron breakpoint detection method (1998, 2003) and then apply a standard unit root test, such as the ADF, in each segment. This approach does not assume that the entire time series is I(1) or I(0), as standard unit root tests do. The performance of the tests was compared via simulation studies under various scenarios involving changepoints. Another type of model for breaks, the Self-Exciting Threshold Autoregressive (SETAR) model, is also discussed.


This work is licensed under a Creative Commons Attribution 4.0 License.