High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · peer-review

33 Scopus Citations


Detail(s)

Original language: English
Pages (from-to): 1338-1356
Journal / Publication: Journal of the American Statistical Association
Volume: 117
Issue number: 539
Online published: 27 Jan 2021
Publication status: Published - Sept 2022

Abstract

The classical vector autoregressive model is a fundamental tool for multivariate time series analysis. However, it involves too many parameters when the number of time series and lag order are even moderately large. This article proposes to rearrange the transition matrices of the model into a tensor form such that the parameter space can be restricted along three directions simultaneously via tensor decomposition. In contrast, the reduced-rank regression method can restrict the parameter space in only one direction. Besides achieving substantial dimension reduction, the proposed model is interpretable from the factor modeling perspective. Moreover, to handle high-dimensional time series, this article considers imposing sparsity on factor matrices to improve the model interpretability and estimation efficiency, which leads to a sparsity-inducing estimator. For the low-dimensional case, we derive asymptotic properties of the proposed least squares estimator and introduce an alternating least squares algorithm. For the high-dimensional case, we establish nonasymptotic properties of the sparsity-inducing estimator and propose an ADMM algorithm for regularized estimation. Simulation experiments and a real data example demonstrate the advantages of the proposed approach over various existing methods. Supplementary materials for this article are available online.
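The core idea in the abstract can be illustrated with a small sketch: stack the P transition matrices A_1, …, A_P of an N-dimensional VAR(P) into an N × N × P tensor and compress it with a Tucker decomposition, restricting the parameter space along all three directions at once. The snippet below is not the paper's estimator (the article uses alternating least squares and, for the sparse case, ADMM); it uses a plain truncated higher-order SVD as a stand-in Tucker fit, with NumPy only. All dimensions, ranks, and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(T, mode):
    # Mode-k unfolding: rows indexed by mode k, columns by the remaining modes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_reconstruct(core, factors):
    # Full tensor from core G and factor matrices: G x1 U1 x2 U2 x3 U3.
    # Contracting mode 0 and appending the new mode at the end, three times
    # in a row, restores the original mode order.
    T = core
    for U in factors:
        T = np.tensordot(T, U, axes=(0, 1))
    return T

def hosvd(T, ranks):
    # Truncated higher-order SVD: one simple way to fit a Tucker model
    # (a stand-in here for the alternating-least-squares fit in the paper).
    factors = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
               for k, r in enumerate(ranks)]
    core = T
    for U in factors:
        core = np.tensordot(core, U, axes=(0, 0))  # project mode onto U's columns
    return core, factors

# Illustrative sizes: N series, lag order P, multilinear ranks (2, 2, 2).
N, P, ranks = 6, 3, (2, 2, 2)

# Build a transition tensor with exactly these Tucker ranks, so the
# unconstrained N*N*P parameters collapse to a small core plus factors.
G = rng.standard_normal(ranks)
U1 = np.linalg.qr(rng.standard_normal((N, ranks[0])))[0]
U2 = np.linalg.qr(rng.standard_normal((N, ranks[1])))[0]
U3 = np.linalg.qr(rng.standard_normal((P, ranks[2])))[0]
A = tucker_reconstruct(G, [U1, U2, U3])   # A[:, :, j] plays the role of A_{j+1}

core, factors = hosvd(A, ranks)
A_hat = tucker_reconstruct(core, factors)

n_full = N * N * P
n_tucker = int(np.prod(ranks)) + sum(U.size for U in [U1, U2, U3])
print("full params:", n_full, "| Tucker params:", n_tucker)
print("exact recovery at the true ranks:", np.allclose(A, A_hat))
```

The factor matrices U1 and U2 admit the factor-model reading mentioned in the abstract (loadings on response and predictor directions), while U3 compresses the lag direction; a reduced-rank regression would constrain only one of these three modes.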

Research Area(s)

  • Factor model, High-dimensional time series, Reduced-rank regression, Tucker decomposition, Variable selection