36-467/667
8 October 2020 (Lecture 12)
\[ \newcommand{\Expect}[1]{\mathbb{E}\left[ #1 \right]} \newcommand{\Var}[1]{\mathrm{Var}\left[ #1 \right]} \newcommand{\Cov}[1]{\mathrm{Cov}\left[ #1 \right]} \newcommand{\Neighbors}{N} \]
So a nearest-neighbors SAR corresponds to a CAR with next-nearest-neighbors, etc.
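To make the nearest/next-nearest-neighbor point concrete, here is a small numerical sketch (in Python with NumPy, assuming the usual Gaussian forms: a SAR \(\mathbf{X} = \mathbf{\beta}\mathbf{X} + \mathbf{\epsilon}\) with IID noise has precision matrix proportional to \((\mathbf{I}-\mathbf{\beta})^T(\mathbf{I}-\mathbf{\beta})\), while a CAR with coefficient matrix \(\mathbf{b}\) has precision proportional to \(\mathbf{I}-\mathbf{b}\)). A nearest-neighbor \(\mathbf{\beta}\) on a chain then yields a precision matrix of bandwidth 2, i.e., next-nearest-neighbor CAR coefficients:

```python
import numpy as np

n = 6
beta = np.zeros((n, n))
for i in range(n - 1):              # nearest-neighbor SAR weights on a chain
    beta[i, i + 1] = beta[i + 1, i] = 0.3

# SAR precision matrix (up to the noise variance): (I - beta)^T (I - beta)
Q = (np.eye(n) - beta).T @ (np.eye(n) - beta)

# Q is banded with bandwidth 2: sites two steps apart interact, so the
# matching CAR has next-nearest-neighbor coefficients, while lag-3 entries vanish.
print(abs(Q[0, 2]) > 1e-12)   # True: next-nearest neighbors are coupled
print(abs(Q[0, 3]) > 1e-12)   # False: no coupling beyond lag 2
```

The chain graph and the value 0.3 are arbitrary illustrations; any sparse \(\mathbf{\beta}\) shows the same effect, with the precision's sparsity pattern given by paths of length two in the SAR neighborhood graph.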
Q: Can you always find a SAR coefficient matrix to match a given CAR?
Q: When will a CAR with coefficient matrix \(\mathbf{b}\) have the same distribution as a SAR with coefficient matrix \(\mathbf{b}\)?
A: Use the last equation above but require \(\mathbf{\beta} = \mathbf{b}\): \[\begin{eqnarray} \mathbf{b} & = & 2 \mathbf{b} + \mathbf{b}^2\\ \mathbf{0} & = & \mathbf{b} + \mathbf{b}^2\\ \mathbf{0} & = & \mathbf{b}(\mathbf{I} + \mathbf{b}) \end{eqnarray}\] so \(\mathbf{b} = \mathbf{0}\) (the trivial case, with all coordinates independent) or \(\mathbf{b} = - \mathbf{I}\)
(after Guttorp (1995, 206–7))
(after Guttorp (1995, 219–21, Exercise 4.1))
(after Guttorp (1995, 7, Proposition 1.1))
Let \(\mathbf{X}\) be an \(n\)-dimensional random vector, and assume \(p(\mathbf{x}) > 0\) for all \(\mathbf{x}\). Then \[ \frac{p(\mathbf{x})}{p(\mathbf{y})} = \prod_{i=1}^{n}{\frac{p(x_i|x_{1:i-1}, y_{i+1:n})}{p(y_i|x_{1:i-1}, y_{i+1:n})}} \] Fixing any reference point \(\mathbf{y}\), the right-hand side gives \(p(\mathbf{x})\) up to the constant factor \(p(\mathbf{y})\); since \(\sum_{\mathbf{x}}{p(\mathbf{x})}=1\), that constant is determined as well, so the conditional distributions uniquely fix the whole distribution
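Since the identity is purely algebraic, it is easy to check numerically. A minimal sketch in Python/NumPy, using an arbitrary strictly positive joint distribution on \(\{0,1\}^3\) (the distribution, the points \(x\) and \(y\), and the helper `conditional` are all illustrative choices, not from the source):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 3
p = rng.random((2,) * n)
p /= p.sum()                  # strictly positive joint distribution on {0,1}^3

def conditional(i, head, tail):
    """Distribution of coordinate i given earlier coords = head, later = tail."""
    sl = p[tuple(head) + (slice(None),) + tuple(tail)]
    return sl / sl.sum()

x = (0, 1, 0)
y = (1, 0, 1)

# Right-hand side: product over i of p(x_i | x_{1:i-1}, y_{i+1:n})
#                               over p(y_i | x_{1:i-1}, y_{i+1:n})
rhs = 1.0
for i in range(n):
    c = conditional(i, x[:i], y[i + 1:])
    rhs *= c[x[i]] / c[y[i]]

print(np.isclose(rhs, p[x] / p[y]))   # True: matches p(x)/p(y)
```

Note that the normalizing constants in `conditional` cancel between numerator and denominator of each ratio, which is why only the full conditionals, not the joint, are needed to evaluate the right-hand side.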
Again, after Guttorp (1995, 7)
\[\begin{eqnarray} p(\mathbf{x}) & = & p(x_n|x_{1:n-1}) p(x_{1:n-1})\\ & = & p(x_n|x_{1:n-1})p(x_{1:n-1})\frac{p(y_n|x_{1:n-1})}{p(y_n|x_{1:n-1})}\\ & = & \frac{p(x_n|x_{1:n-1})}{p(y_n|x_{1:n-1})}p(x_{1:n-1}, y_n)\\ & = & \frac{p(x_n|x_{1:n-1})}{p(y_n|x_{1:n-1})} p(x_{n-1}| x_{1:n-2}, y_n) p(x_{1:n-2}, y_n)\\ & = & \frac{p(x_n|x_{1:n-1})}{p(y_n|x_{1:n-1})} \frac{p(x_{n-1}| x_{1:n-2}, y_n)}{p(y_{n-1}| x_{1:n-2}, y_n)}p(x_{1:n-2}, y_{n-1:n})\\ & \vdots &\\ & = & \frac{p(x_n|x_{1:n-1})}{p(y_n|x_{1:n-1})} \frac{p(x_{n-1}| x_{1:n-2}, y_n)}{p(y_{n-1}| x_{1:n-2}, y_n)} \ldots \frac{p(x_1|y_{2:n})}{p(y_1|y_{2:n})} p(\mathbf{y}) \end{eqnarray}\] Dividing both sides by \(p(\mathbf{y})\) gives the proposition.

Geman, Stuart, and Donald Geman. 1984. “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images.” IEEE Transactions on Pattern Analysis and Machine Intelligence 6:721–41. https://doi.org/10.1109/TPAMI.1984.4767596.
Griffeath, David. 1976. “Introduction to Markov Random Fields.” In Denumerable Markov Chains, edited by John G. Kemeny, J. Laurie Snell, and Anthony W. Knapp, 2nd ed., 425–57. Berlin: Springer-Verlag.
Guttorp, Peter. 1995. Stochastic Modeling of Scientific Data. London: Chapman & Hall.
Mandelbrot, Benoit. 1962. “The Role of Sufficiency and of Estimation in Thermodynamics.” Annals of Mathematical Statistics 33:1021–38. http://projecteuclid.org/euclid.aoms/1177704470.
Metropolis, Nicholas, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller, and Edward Teller. 1953. “Equation of State Calculations by Fast Computing Machines.” Journal of Chemical Physics 21:1087–92. https://doi.org/10.1063/1.1699114.
Reichenbach, Hans. 1956. The Direction of Time. Berkeley: University of California Press.