Multivariate Timeseries Prediction with GNN


Transcript of Multivariate Timeseries Prediction with GNN

Page 1: Multivariate Timeseries Prediction with GNN

Multivariate Timeseries Prediction with GNN

Zhiming Xu, [email protected]

Department of Computing, The Hong Kong Polytechnic University

May 6, 2021


Page 2: Multivariate Timeseries Prediction with GNN

Contents

Time Series

Statistical Learning & Prediction

Proposed Model: MTGNN

Graph Learning

Temporal Convolution

Graph Convolution

Skip Connection & Output

Results & Conclusions


Page 3: Multivariate Timeseries Prediction with GNN

Time Series

- A series of data points ordered chronologically, such as daily precipitation or a stock's daily closing price.

- A multivariate time series has multiple values, instead of a single one, at each data point; for example, the closing prices of all S&P 500 components today.

Figure 1: Two examples of time series. (a) Univariate time series. (b) Multivariate time series.


Page 4: Multivariate Timeseries Prediction with GNN

Time Series Prediction

- Input: $\mathbf{R}^{T \times N} = \{r_1, r_2, \cdots, r_T\}$, where $r_i \in \mathbb{R}^N$ represents all variables' values at step $i$.

- Output: the next $Q$ steps, predicted from the past $P$ steps $\mathbf{R}^{P \times N}$.

- When $N = 1$, it is a univariate time series; otherwise, it is a multivariate time series.
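As a concrete illustration of this input/output setup (my own sketch, not from the slides; the function name and shapes are assumptions), a minimal NumPy example that slices a series of shape $(T, N)$ into (past $P$, next $Q$) pairs:

```python
import numpy as np

def make_windows(series: np.ndarray, P: int, Q: int):
    """Slice a (T, N) multivariate series into (P-step input, Q-step target) pairs."""
    T = series.shape[0]
    inputs, targets = [], []
    for t in range(T - P - Q + 1):
        inputs.append(series[t : t + P])           # past P steps
        targets.append(series[t + P : t + P + Q])  # next Q steps
    return np.stack(inputs), np.stack(targets)

# Example: T = 100 steps, N = 3 variables; predict Q = 2 steps from the past P = 12.
series = np.random.randn(100, 3)
X, Y = make_windows(series, P=12, Q=2)
print(X.shape, Y.shape)  # (87, 12, 3) (87, 2, 3)
```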


Page 5: Multivariate Timeseries Prediction with GNN

Statistical Methods

We have several statistical methods to model time series based on very simple assumptions.

- Autoregressive (AR)

- Moving Average (MA)

- Autoregressive Moving Average (ARMA)

- Autoregressive Integrated Moving Average (ARIMA)


Page 6: Multivariate Timeseries Prediction with GNN

AR and MA

Autoregressive (AR)

- Output depends linearly on its own previous values and on a stochastic term.

- An AR model of order $p$, AR($p$) (depending on the previous $p$ steps), can be written as $X_t = c + \sum_{i=1}^{p} \phi_i X_{t-i} + \varepsilon_t$.

Moving-Average (MA)

- Output depends linearly on the current and past values of an error term, $\varepsilon_t = X_t - \hat{X}_t$.

- An MA model of order $q$, MA($q$) (depending on the previous $q$ stochastic terms), can be written as $X_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}$.
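To make the AR recursion above concrete, a small NumPy sketch (mine, not the slides'; names and coefficient values are illustrative) that simulates an AR($p$) process directly from the formula:

```python
import numpy as np

def simulate_ar(phi, c=0.0, sigma=1.0, T=200, seed=0):
    """Simulate X_t = c + sum_i phi_i * X_{t-i} + eps_t with Gaussian noise."""
    rng = np.random.default_rng(seed)
    p = len(phi)
    x = np.zeros(T)
    for t in range(p, T):
        x[t] = c + sum(phi[i] * x[t - 1 - i] for i in range(p)) \
               + rng.normal(scale=sigma)
    return x

# AR(2) with phi_1 = 0.6, phi_2 = -0.3 (a stationary choice of coefficients).
x = simulate_ar(phi=[0.6, -0.3])
print(x[:5])
```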


Page 7: Multivariate Timeseries Prediction with GNN

ARMA and ARIMA

Autoregressive Moving-Average (ARMA)

- AR: regress on the series' own past values. MA: model the error term as a combination of present and past stochastic terms.

- An ARMA model of order $p$ in AR and $q$ in MA, ARMA($p, q$), can be written as $X_t = c + \varepsilon_t + \sum_{i=1}^{p} \phi_i X_{t-i} + \sum_{i=1}^{q} \theta_i \varepsilon_{t-i}$.

Autoregressive Integrated Moving-Average (ARIMA)

- The change of the series over a fixed number of differencing steps $d$ can be modeled by ARMA.

- Suppose $Z_t = X_{t+1} - X_t$ (the case $d = 1$); then ARIMA($p, d, q$) has $Z_t = c + \varepsilon_t + \sum_{i=1}^{p} \phi_i Z_{t-i} + \sum_{i=1}^{q} \theta_i \varepsilon_{t-i}$. Integrate over the $Z_i$'s to recover the corresponding $X_t$.
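In practice these models are rarely implemented by hand. As a hedged usage sketch (assuming the statsmodels library, whose ARIMA class takes an order=(p, d, q) argument), fitting and forecasting looks like this:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# A toy random walk: differencing once (d = 1) makes it roughly stationary.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=300))

model = ARIMA(series, order=(2, 1, 1))  # (p, d, q)
result = model.fit()
print(result.forecast(steps=5))          # predict the next 5 steps
```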


Page 8: Multivariate Timeseries Prediction with GNN

Neural Network Methods

Apart from statistical models, neural networks are also useful in modeling time series.

- Recurrent Neural Networks: autoregressive models conditioned on all past observations (see the sketch after this list).

- Convolutional Neural Networks: apply filters of different sizes to capture temporal dependencies.
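The sketch below (my own, assuming PyTorch; class name and sizes are illustrative) shows the recurrent idea: encode the past $P$ steps and map the final hidden state to the next $Q$ steps:

```python
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    """Encode the past P steps with a GRU, then predict the next Q steps."""
    def __init__(self, n_vars: int, hidden: int, Q: int):
        super().__init__()
        self.gru = nn.GRU(n_vars, hidden, batch_first=True)
        self.head = nn.Linear(hidden, Q * n_vars)
        self.Q, self.n_vars = Q, n_vars

    def forward(self, x):                 # x: (batch, P, n_vars)
        _, h = self.gru(x)                # h: (1, batch, hidden)
        out = self.head(h[-1])            # (batch, Q * n_vars)
        return out.view(-1, self.Q, self.n_vars)

model = GRUForecaster(n_vars=3, hidden=32, Q=2)
y_hat = model(torch.randn(8, 12, 3))      # -> (8, 2, 3)
```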


Page 9: Multivariate Timeseries Prediction with GNN

Opportunities & Challenges

Opportunities:

- Current multivariate time series models study temporal dependencies while mostly neglecting latent dependencies between pairs of variables.

- GNNs are good at capturing relational dependencies in complicated structures.

Challenges:

- GNNs need an explicit graph structure, which is usually not available for multivariate time series.

- Even when a structure is given, GNNs focus on message passing while ignoring that dependencies change over time.


Page 10: Multivariate Timeseries Prediction with GNN

MTGNN

To address these challenges, [1] proposes MTGNN, which includes the following three key components:

- A graph learning layer

- $m$ graph convolution (GC) modules

- $m$ temporal convolution (TC) modules


Page 11: Multivariate Timeseries Prediction with GNN

MTGNN (cont’d)

Figure 2: MTGNN. Key components from top to bottom are the graph learning layer, the temporal convolution modules, and the graph convolution modules.


Page 12: Multivariate Timeseries Prediction with GNN

Table of Contents


Page 13: Multivariate Timeseries Prediction with GNN

Graph Learning Layer

To learn directed relations and dependencies efficiently, the graph structure is modeled as follows:

$M_1 = \tanh(\alpha E_1 \Theta_1)$

$M_2 = \tanh(\alpha E_2 \Theta_2)$

$A = \mathrm{ReLU}\left(\tanh\left(\alpha \left(M_1 M_2^\top - M_2 M_1^\top\right)\right)\right)$

$\forall j \notin \mathrm{argtopk}(A[i, :]):\ A[i, j] = 0$

$E_1, E_2$ are the original node embeddings and $\Theta_1, \Theta_2$ are learnable parameters. argtopk returns the indices of the $k$ largest entries of its argument, and $\alpha$ is a hyperparameter controlling the saturation rate of tanh.
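A PyTorch sketch of this layer as I read the equations (class and variable names are my assumptions; see [1] for the reference implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphLearner(nn.Module):
    """Learn a sparse directed adjacency matrix from node embeddings."""
    def __init__(self, n_nodes: int, dim: int, k: int, alpha: float = 3.0):
        super().__init__()
        self.E1 = nn.Parameter(torch.randn(n_nodes, dim))  # node embeddings
        self.E2 = nn.Parameter(torch.randn(n_nodes, dim))
        self.theta1 = nn.Linear(dim, dim, bias=False)      # Theta_1
        self.theta2 = nn.Linear(dim, dim, bias=False)      # Theta_2
        self.k, self.alpha = k, alpha

    def forward(self):
        m1 = torch.tanh(self.alpha * self.theta1(self.E1))
        m2 = torch.tanh(self.alpha * self.theta2(self.E2))
        # The asymmetric difference makes the learned graph uni-directional.
        a = F.relu(torch.tanh(self.alpha * (m1 @ m2.T - m2 @ m1.T)))
        # argtopk: keep only each row's k largest entries, zero out the rest.
        mask = torch.zeros_like(a)
        mask.scatter_(1, a.topk(self.k, dim=1).indices, 1.0)
        return a * mask

adj = GraphLearner(n_nodes=10, dim=16, k=3)()  # 10x10 sparse adjacency
```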


Page 14: Multivariate Timeseries Prediction with GNN

Table of Contents


Page 15: Multivariate Timeseries Prediction with GNN

Temporal Convolution Module

- 1D convolution is applied to capture sequential patterns in time series.

- To better represent such patterns, filters of different sizes are used and their outputs are concatenated.

- To further increase the receptive field size, dilated convolution is used.

Figure 3: Different convolution filters. (a) 1D convolution. (b) Inception. (c) Dilated convolution.


Page 16: Multivariate Timeseries Prediction with GNN

Temporal Convolution Module (cont’d)

- Considering typical periods in time series, the inception convolution has four filters of different sizes: $f_{1 \times 2}$, $f_{1 \times 3}$, $f_{1 \times 6}$, $f_{1 \times 7}$.

- To cover as large a receptive field as possible, the dilation factor grows exponentially from 1 with each successive layer. With dilation base $q$, filter size $c$, and $m$ layers, the dilated convolution covers a receptive field (a small helper evaluating this follows)

$R = 1 + \sum_{i=1}^{m} q^{i-1}(c-1) = 1 + (c-1)\,\frac{q^m - 1}{q - 1}.$
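A tiny helper (my own) that evaluates this formula, with the $q = 1$ case handled separately since the geometric sum degenerates there:

```python
def receptive_field(c: int, q: int, m: int) -> int:
    """Receptive field of m dilated layers with kernel size c and dilation base q."""
    if q == 1:
        return 1 + m * (c - 1)                    # dilations are all 1
    return 1 + (c - 1) * (q**m - 1) // (q - 1)    # dilations 1, q, ..., q**(m-1)

# E.g., largest kernel c = 7, dilation base q = 2, m = 5 layers:
print(receptive_field(7, 2, 5))  # 187
```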


Page 17: Multivariate Timeseries Prediction with GNN

Temporal Convolution Module (cont’d)

Taking a 1D input sequence $z \in \mathbb{R}^T$ as an example, the inception part is

$z = \mathrm{concat}\,(z \star f_{1 \times 2},\ z \star f_{1 \times 3},\ z \star f_{1 \times 6},\ z \star f_{1 \times 7}) \quad (1)$

The concatenation along the channel dimension is performed on outputs truncated to the same length.

Combined with dilated convolution with dilation factor $d$, we have:

$(z \star f_{1 \times k})(t) = \sum_{s=0}^{k-1} f_{1 \times k}(s)\, z(t - d \times s) \quad (2)$
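A PyTorch sketch of a dilated inception layer following Eqs. (1) and (2) (my own reading; the equal channel split and names are assumptions):

```python
import torch
import torch.nn as nn

class DilatedInception(nn.Module):
    """Convolve with kernel sizes 2, 3, 6, 7 at a shared dilation, truncate
    all outputs to the shortest length, and concatenate along channels."""
    def __init__(self, c_in: int, c_out: int, dilation: int):
        super().__init__()
        self.kernels = [2, 3, 6, 7]
        assert c_out % len(self.kernels) == 0
        self.convs = nn.ModuleList(
            nn.Conv1d(c_in, c_out // len(self.kernels), k, dilation=dilation)
            for k in self.kernels
        )

    def forward(self, x):                        # x: (batch, c_in, T)
        outs = [conv(x) for conv in self.convs]
        t_min = min(o.size(-1) for o in outs)    # truncate to shortest output
        return torch.cat([o[..., -t_min:] for o in outs], dim=1)

y = DilatedInception(c_in=16, c_out=32, dilation=2)(torch.randn(4, 16, 64))
print(y.shape)  # torch.Size([4, 32, 52])
```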


Page 18: Multivariate Timeseries Prediction with GNN

Temporal Convolution Module (cont’d)

Two temporal convolution modules of the same structure are used, one extracting features and the other gating the information flow (a sketch of the gating follows Figure 4).

Figure 4: Temporal convolution module and its dilated inception layer.
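Reusing the DilatedInception sketch above, the gating in [1] multiplies a tanh-activated filter branch with a sigmoid-activated gate branch (a hedged sketch, following the paper's gated design):

```python
import torch

# Two dilated-inception branches of identical structure: the filter branch is
# squashed by tanh, the gate branch by sigmoid, and the results are multiplied.
filter_branch = DilatedInception(c_in=16, c_out=32, dilation=2)
gate_branch = DilatedInception(c_in=16, c_out=32, dilation=2)

x = torch.randn(4, 16, 64)
h = torch.tanh(filter_branch(x)) * torch.sigmoid(gate_branch(x))
```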


Page 19: Multivariate Timeseries Prediction with GNN

Table of Contents


Page 20: Multivariate Timeseries Prediction with GNN

Graph Convolution Module

Mix-hop Propagation Layer

- Information propagation: $H^{(k)} = \beta H_{\mathrm{in}} + (1 - \beta)\,\tilde{A} H^{(k-1)}$.

- Information selection: $H_{\mathrm{out}} = \sum_{k=0}^{K} H^{(k)} W^{(k)}$.

- $H^{(0)} = H_{\mathrm{in}}$; $\tilde{A} = \tilde{D}^{-1}(A + I)$ is the normalized adjacency matrix with self-loops (per [1]); $\beta$ controls how much of the original input is retained at each step.
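A PyTorch sketch of one mix-hop propagation layer as written above (my own; names and hyperparameter values are illustrative):

```python
import torch
import torch.nn as nn

class MixHop(nn.Module):
    """K propagation steps that each mix the layer input back in (beta),
    followed by a learned per-hop selection summed over hops 0..K."""
    def __init__(self, dim: int, K: int, beta: float = 0.05):
        super().__init__()
        self.W = nn.ModuleList(nn.Linear(dim, dim, bias=False) for _ in range(K + 1))
        self.K, self.beta = K, beta

    def forward(self, h_in, a_norm):      # h_in: (n, dim), a_norm: (n, n)
        h = h_in                          # H^(0) = H_in
        out = self.W[0](h)
        for k in range(1, self.K + 1):
            h = self.beta * h_in + (1 - self.beta) * a_norm @ h  # propagation
            out = out + self.W[k](h)                             # selection
        return out

layer = MixHop(dim=16, K=2)
h_out = layer(torch.randn(10, 16), torch.eye(10))
```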


Page 21: Multivariate Timeseries Prediction with GNN

Graph Convolution Module (cont’d)

Figure 5: Illustration of graph convolution module.

(a) The GC module consists of two mix-hop propagation layers; one takes the learned adjacency matrix $A$ while the other takes $A^\top$.

(b) In each mix-hop propagation layer, the input $H_{\mathrm{in}}$ from the last layer (residual) and the output $H^{(k-1)}$ from the previous propagation depth are used.


Page 22: Multivariate Timeseries Prediction with GNN

Illustration of Temporal and Graph Convolutions

Figure 7: Temporal and graph convolution on data. $N$ is the total number of variables, $T$ is the temporal dimension, and $D$ is the feature dimension. The red filter is the temporal convolution and the blue filter is the graph convolution.


Page 23: Multivariate Timeseries Prediction with GNN

Table of Contents


Page 24: Multivariate Timeseries Prediction with GNN

Skip Connection & Output

- Skip connection: a $1 \times L_i$ 1D convolution layer, where $L_i$ is the length of the input.

- Output: two $1 \times 1$ convolution layers transform the input to the target output dimension.


Page 25: Multivariate Timeseries Prediction with GNN

Learning Strategy

- Save memory in graph learning: split the nodes into several groups and learn a sub-graph structure for each group.

- Improve short-term prediction: use curriculum learning, gradually increasing the number of future steps to predict.


Page 26: Multivariate Timeseries Prediction with GNN

Performances & Ablation Study

- MTGNN achieves similar or slightly better performance compared with state-of-the-art time series prediction models on various datasets.

- The ablation study shows that each component contributes to the model's performance.

Table 1: Ablation study. GC means graph convolution and CL means curriculum learning.


Page 27: Multivariate Timeseries Prediction with GNN

Compare Graph Learning Methods

Multiple graph construction methods are compared, and the experiments demonstrate the advantage of using a uni-directional adjacency matrix.

Table 2: Performances of different graph learning methods.


Page 28: Multivariate Timeseries Prediction with GNN

Thank You for Your Attention

Q & A


Page 29: Multivariate Timeseries Prediction with GNN

References

[1] Z. Wu, S. Pan, G. Long, J. Jiang, X. Chang, and C. Zhang, “Connecting the dots: Multivariate time series forecasting with graph neural networks,” in KDD ’20: The 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Virtual Event, CA, USA, August 23-27, 2020, R. Gupta, Y. Liu, J. Tang, and B. A. Prakash, Eds., ACM, 2020, pp. 753–763. doi: 10.1145/3394486.3403118. [Online]. Available: https://doi.org/10.1145/3394486.3403118.
