Linear Generative Models for Time Series

36-467/36-667

16 October 2018

\[ \newcommand{\Expect}[1]{\mathbb{E}\left[ #1 \right]} \newcommand{\Var}[1]{\mathrm{Var}\left[ #1 \right]} \newcommand{\SampleVar}[1]{\widehat{\mathrm{Var}}\left[ #1 \right]} \newcommand{\Cov}[1]{\mathrm{Cov}\left[ #1 \right]} \]

Previously

Linear autoregressions

\[\begin{eqnarray} X(t) & = & a + b X(t-1) + \epsilon(t)\\ X(0) & = & \text{some random variable or other} \end{eqnarray}\]

Generating a new time series

Unroll the AR(1) a little

\[\begin{eqnarray} X(t) & = & a + b X(t-1) + \epsilon(t)\\ & = & a + \epsilon(t) + b(a+b X(t-2) + \epsilon(t-1))\\ & = & a + ba + b^2 a + \ldots + b^{t-1} a + \epsilon(t) + b\epsilon(t-1) + b^2 \epsilon(t-2) + \ldots + b^{t-1} \epsilon(1) + b^t X(0)\\ & = & a\sum_{k=0}^{t-1}{b^k} + \sum_{k=0}^{t-1}{\epsilon(t-k) b^k} + b^t X(0) \end{eqnarray}\]
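As a sanity check, here is a minimal R sketch (the values of \(a\), \(b\), the horizon, and the Gaussian innovations are illustrative assumptions) which runs the recursion and compares the result to the unrolled closed form:

set.seed(467)
a <- 2; b <- 0.8; t.max <- 50
eps <- rnorm(t.max)              # innovations epsilon(1), ..., epsilon(t.max)
x <- numeric(t.max + 1)          # x[t + 1] holds X(t)
x[1] <- rnorm(1)                 # X(0), "some random variable or other"
for (t in 1:t.max) {
  x[t + 1] <- a + b * x[t] + eps[t]
}
k <- 0:(t.max - 1)
closed.form <- a * sum(b^k) + sum(b^k * eps[t.max - k]) + b^t.max * x[1]
all.equal(x[t.max + 1], closed.form)   # TRUE, up to floating-point error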

Think about the deterministic version

Set \(a=0\) to simplify book-keeping

\[\begin{eqnarray} x(t) & = & b x(t-1)\\ & = & \text{???} \end{eqnarray}\]

In-class exercise 1: Write \(x(t)\) in terms of \(b\), \(t\), and \(x(0)\)

Think about the deterministic version

Set \(a=0\) to simplify book-keeping

\[\begin{eqnarray} x(t) & = & b x(t-1)\\ & = & b^t x(0) \end{eqnarray}\]

If \(|b|<1\) then \(b^t \rightarrow 0\) as \(t\rightarrow \infty\)

So if \(|b| < 1\) then \(x(t) \rightarrow 0\)

First-order dynamics are exponential: decay to 0 when \(|b| < 1\), growth to \(\pm \infty\) when \(|b| > 1\)
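A quick R illustration (the two values of \(b\) are arbitrary):

x0 <- 1
t <- 0:20
signif(0.8^t * x0, 3)   # |b| < 1: exponential decay toward 0
signif(1.2^t * x0, 3)   # |b| > 1: exponential growth without bound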

Adding the noise back in

Constantly being perturbed away from the deterministic path

How would we predict?

Intuitively:

\[ \hat{X}(t+k) = b^{k} x(t) \]

Rigorously:

\[\begin{eqnarray} \Expect{X(t+k)|X(t)=x} & = & \Expect{b^k X(t) + \epsilon(t+k) + b \epsilon(t+k-1) + \ldots + b^{k-1}\epsilon(t+1)|X(t) = x}\\ & = & b^k x + 0 \end{eqnarray}\]

because the innovations after time \(t\) have expectation 0 and are uncorrelated with \(X(t)\)
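A Monte Carlo check of this conditional expectation, under assumed values of \(b\), \(x\), and \(k\), with standard-Gaussian innovations and \(a = 0\) as on the previous slides:

set.seed(467)
b <- 0.9; x <- 2; k <- 5
sims <- replicate(1e5, {
  xt <- x                          # condition on X(t) = x
  for (j in 1:k) xt <- b * xt + rnorm(1)
  xt                               # one draw of X(t + k)
})
c(monte.carlo = mean(sims), theory = b^k * x)   # should nearly agree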

What about covariances?

\[\begin{eqnarray} \Cov{X(t+h), X(t)} & = & \Cov{b^h X(t) + \epsilon(t+h) + b\epsilon(t+h-1) + \ldots + b^{h-1}\epsilon(t+1), X(t)}\\ & = & b^h \Cov{X(t), X(t)} + 0\\ & = & b^h \Var{X(t)} \end{eqnarray}\]

\(\Rightarrow\) if \(\Var{X(t)}\) and \(\Expect{X(t)}\) are constant in \(t\), the process is weakly stationary
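Since \(\Cov{X(t+h), X(t)} = b^h \Var{X(t)}\), the autocorrelation at lag \(h\) is \(b^h\), which we can check against the sample ACF of a long simulated AR(1) (the value of \(b\) and the sample size are assumptions):

set.seed(467)
b <- 0.7
x <- arima.sim(model = list(ar = b), n = 1e5)   # stationary AR(1)
h <- 0:5
rbind(sample = acf(x, lag.max = 5, plot = FALSE)$acf[, 1, 1],
      theory = b^h)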

Higher-order autoregressions

\[ X(t) = a + b_1 X(t-1) + b_2 X(t-2) + \ldots + b_p X(t-p) + \epsilon(t) \]
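Simulating, and then re-fitting, a higher-order autoregression in R; the AR(2) coefficients below are illustrative and chosen to give a stationary model:

set.seed(467)
x <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 1000)   # AR(2), a = 0
ar(x, order.max = 2, aic = FALSE)$ar   # estimates close to 0.5 and 0.3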

What about multiple variables?

Vector autoregression of order 1, or VAR(1)

\[ \vec{X}(t) = \vec{a} + \mathbf{b} \vec{X}(t-1) + \vec{\epsilon}(t) \]

\(\vec{X}(t) =\) random vector of dimension \(p\), the state at time \(t\)

\(\vec{a} =\) deterministic vector of dimension \(p\)

\(\mathbf{b} =\) deterministic matrix of dimension \(p\times p\)

\(\vec{\epsilon}(t) =\) random vector of dimension \(p\), the innovation
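A minimal sketch of simulating a VAR(1) by direct iteration (the offset, coefficient matrix, innovation distribution, and length are all illustrative assumptions; the matrix's eigenvalues have modulus \(< 1\)):

set.seed(467)
p <- 2; n <- 500
a <- c(1, -1)
b <- matrix(c(0.5, 0.1, -0.2, 0.3), nrow = p)   # eigenvalues 0.4 +/- 0.1i
X <- matrix(0, nrow = p, ncol = n + 1)          # X[, t + 1] holds X(t)
for (t in 1:n) {
  X[, t + 1] <- a + b %*% X[, t] + rnorm(p)     # vector AR(1) update
}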

What about multiple variables?

Zero out the offset \(\vec{a}\) for now

\[ \vec{X}(t) = \mathbf{b} \vec{X}(t-1) + \vec{\epsilon}(t) \]

What are the deterministic dynamics?

Linear dynamical systems in multiple dimensions

\[ \vec{x}(t) = \mathbf{b}\vec{x}(t-1) \]


Eigenvalues determine the dynamics of a linear system

The easy case: all eigenvalues \(\lambda_1, \ldots, \lambda_p\) are real and distinct, with eigenvectors \(\vec{v}_1, \ldots, \vec{v}_p\)

Write \(\vec{x}(0) = \sum_{j=1}^{p}{c_j \vec{v}_j}\); then

\[ \vec{x}(t) = \mathbf{b}^t \vec{x}(0) = \sum_{j=1}^{p}{c_j \lambda_j^t \vec{v}_j} \]

so each component along an eigenvector decays or grows like \(\lambda_j^t\), just as in one dimension
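A small R check of this eigenbasis expansion, using an assumed symmetric matrix (symmetry guarantees real eigenvalues):

b <- matrix(c(0.9, 0.2, 0.2, 0.4), nrow = 2)   # symmetric, real eigenvalues
eig <- eigen(b)
x0 <- c(1, 1)
coords <- solve(eig$vectors, x0)    # c_j: x0 in the eigenbasis
t.max <- 10
bt <- diag(2)
for (i in 1:t.max) bt <- bt %*% b   # compute b^t by iteration
cbind(iterated = bt %*% x0,
      eigenbasis = eig$vectors %*% (coords * eig$values^t.max))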

Rotation with complex eigenvalues

b <- matrix(c(0.99, -0.01, 0.01, 0.99), nrow = 2)  # define the update matrix
b
##       [,1] [,2]
## [1,]  0.99 0.01
## [2,] -0.01 0.99
eigen(b)
## eigen() decomposition
## $values
## [1] 0.99+0.01i 0.99-0.01i
## 
## $vectors
##                      [,1]                 [,2]
## [1,] 0.0000000-0.7071068i 0.0000000+0.7071068i
## [2,] 0.7071068+0.0000000i 0.7071068+0.0000000i
Mod(eigen(b)$values)
## [1] 0.9900505 0.9900505

Rotation with complex eigenvalues

(x <- matrix(c(1, 2), nrow = 2))
##      [,1]
## [1,]    1
## [2,]    2
b %*% x
##      [,1]
## [1,] 1.01
## [2,] 1.97
b %*% b %*% x
##        [,1]
## [1,] 1.0196
## [2,] 1.9402

Rotation with complex eigenvalues

Reset \(\mathbf{b}^{\prime} = \mathbf{b}/|\lambda_1|\) so now both eigenvalues have modulus 1

What’s going on here?

With both eigenvalues rescaled to modulus 1, \(\mathbf{b}^{\prime}\) is a pure rotation: the iterates neither grow nor shrink, but circle the origin forever
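Checking in R, continuing with the \(\mathbf{b}\) defined above (the number of iterations is arbitrary):

b.prime <- b / Mod(eigen(b)$values[1])   # rescale so both |eigenvalues| = 1
Mod(eigen(b.prime)$values)               # both equal to 1
x <- c(1, 2)
norms <- numeric(100)
for (t in 1:100) { x <- b.prime %*% x; norms[t] <- sqrt(sum(x^2)) }
range(norms)   # constant norm: pure rotation, no growth or decay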

Morals on linear, deterministic dynamical systems

Decompose the state along the eigenvectors of \(\mathbf{b}\): components with \(|\lambda_j| < 1\) decay exponentially, components with \(|\lambda_j| > 1\) grow exponentially, and complex eigenvalues produce rotation

In the long run, behavior is dominated by the eigenvalues of largest modulus

Adding on noise

The innovations constantly perturb the state away from the deterministic trajectory; when every eigenvalue of \(\mathbf{b}\) has modulus \(< 1\), the perturbations are damped and the process settles into stationary fluctuations around the fixed point

Summarizing

Linear autoregressions, scalar or vector, are linear deterministic dynamics plus additive noise; the eigenvalues of the coefficient matrix control decay, growth, and oscillation, and keeping all eigenvalues inside the unit circle gives stationary behavior

Backup: VAR(1) vs. 2nd-order dynamics

Remember that sine and cosine waves obey \(\frac{d^2 x}{dt^2}(t) = - \omega^2 x(t)\)

Say \(x_1(t)\) is a sine wave (take \(\omega = 1\) to simplify), and define \(x_2(t) = dx_1/dt\)

\[ \left[ \begin{array}{c} \frac{dx_1}{dt} \\ \frac{dx_2}{dt} \end{array}\right] = \left[ \begin{array}{c} x_2 \\ -x_1 \end{array}\right] = \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right] \left[\begin{array}{c} x_1 \\ x_2 \end{array}\right] \]

Backup: VAR(1) vs. 2nd-order dynamics

Fix a small time increment \(h\)

\[ \left[ \begin{array}{c} \frac{x_1(t) - x_1(t-h)}{h} \\ \frac{x_2(t)-x_2(t-h)}{h} \end{array}\right] = \left[ \begin{array}{c} x_2(t-h) \\ -x_1(t-h) \end{array}\right] \]

Backup: VAR(1) vs. 2nd-order dynamics

\[ \left[ \begin{array}{c} x_1(t) - x_1(t-h) \\ x_2(t)-x_2(t-h) \end{array}\right] = \left[ \begin{array}{c} hx_2(t-h) \\ -hx_1(t-h) \end{array}\right] \]

Backup: VAR(1) vs. 2nd-order dynamics

\[ \left[ \begin{array}{c} x_1(t) \\ x_2(t) \end{array}\right] = \left[ \begin{array}{c} x_1(t-h) + hx_2(t-h) \\ x_2(t-h) -hx_1(t-h) \end{array}\right] \]

Backup: VAR(1) vs. 2nd-order dynamics

\[ \left[ \begin{array}{c} x_1(t) \\ x_2(t) \end{array}\right] = \left[ \begin{array}{cc} 1 & h \\ -h & 1 \end{array}\right] \left[\begin{array}{c} x_1(t-h) \\ x_2(t-h) \end{array}\right] \]

Backup: VAR(1) vs. 2nd-order dynamics

So a two-dimensional VAR(1) can capture second-order, oscillatory dynamics; the matrix \(\left[\begin{array}{cc} 1 & h \\ -h & 1\end{array}\right]\) has eigenvalues \(1 \pm ih\), of modulus \(\sqrt{1+h^2} > 1\), so the discretized oscillator rotates while slowly spiraling outward
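Iterating this map in R confirms the slow outward spiral (the step size and the number of iterations are illustrative):

h <- 0.01
b <- matrix(c(1, -h, h, 1), nrow = 2)   # the discretized-oscillator matrix
Mod(eigen(b)$values)                    # both sqrt(1 + h^2), slightly > 1
x <- c(1, 0)
for (t in 1:1000) x <- b %*% x
sqrt(sum(x^2))   # radius has grown from 1 to about (1 + h^2)^500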

Backup: the intercept and re-centering

\[\begin{eqnarray} x(t) & = & a+bx(t-1) ~ \text{(assumption)}\\ x(t) - c & = & b(x(t-1) - c) ~\text{(asserted, to find the right} ~ c)\\ -c & = & -a -bc ~\text{(subtract 1st line from 2nd)}\\ (b-1)c & = & -a ~ \text{(re-arrange)} \end{eqnarray}\]

so \(c = a/(1-b)\), provided \(b \neq 1\)

The same trick works in the vector case:

\[\begin{eqnarray} \vec{x}(t) & = & \vec{a} + \mathbf{b}\vec{x}(t-1)\\ \vec{x}(t) - \vec{c} & = & \mathbf{b}(\vec{x}(t-1)-\vec{c})\\ (\mathbf{I} - \mathbf{b})\vec{c} & = & \vec{a} \end{eqnarray}\]

so \(\vec{c} = (\mathbf{I} - \mathbf{b})^{-1}\vec{a}\), provided \(\mathbf{I} - \mathbf{b}\) is invertible
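A quick numerical check in R that \(\vec{c} = (\mathbf{I} - \mathbf{b})^{-1}\vec{a}\) is the fixed point of the deterministic map (the particular \(\vec{a}\) and \(\mathbf{b}\) are illustrative assumptions):

a <- c(1, -1)
b <- matrix(c(0.5, 0.1, -0.2, 0.3), nrow = 2)
cc <- solve(diag(2) - b, a)              # solve (I - b) c = a
all.equal(as.vector(a + b %*% cc), cc)   # TRUE: c maps to itself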