Stochastic Processes (Advanced Probability II), 36-754
Spring 2006
See here for the current version of this
course
MWF 10:30--11:20, in 232Q Baker Hall
Stochastic processes are collections of interdependent random variables.
This course is an advanced treatment of such random functions, with twin
emphases on extending the limit theorems of probability from independent to
dependent variables, and on generalizing dynamical systems from deterministic
to random time evolution. Familiarity with measure-theoretic probability (at
the level of 36-752) is essential, but the emphasis will be on developing a
sound yet intuitive understanding of the material, rather than on analytic
rigor.
The first part of the course will cover some foundational topics which
belong in the toolkit of all mathematical scientists working with random
processes: random functions; stationary processes; Markov processes; the Wiener
process and the elements of stochastic calculus. These will be followed by a
selection of more advanced and/or abstract topics which, while valuable, tend
to be neglected in the graduate curriculum: ergodic theory, which extends the
classical limit laws to dependent variables; the closely-related theory of
Markov operators, including the stochastic behavior of deterministic dynamical
systems (i.e., "chaos"); information theory, as it connects to statistical
inference and to limiting distributions; large deviations theory, which gives
rates of convergence in the limit laws; spatial and spatio-temporal processes;
and, time permitting, predictive, Markovian representations of non-Markovian
processes.
Prerequisites: As mentioned, measure-theoretic probability, at the
level of 36-752, is essential. I will also presume some familiarity with basic
stochastic processes, at the level of 36-703 ("Intermediate Probability"),
though I will not assume those memories are very fresh.
Audience: The primary audience for the course consists of students of
statistics, and mathematicians interested in stochastic processes. I hope,
however, that it will be useful to any mathematical scientist who uses
probabilistic models; in particular we will cover material which should help
physicists interested in statistical mechanics or nonlinear dynamics, computer
scientists interested in machine learning or information theory, engineers
interested in communication, control, or signal processing, economists
interested in evolutionary game theory or finance, population biologists, etc.
Grading: One to three problems will be assigned weekly. These will
be either proofs or simulation exercises. Students whose performance on
homework is not adequate will have the opportunity to take an oral final exam
in its place.
Syllabus
Available in HTML and PDF; includes information on textbooks and
a detailed outline.
Readings
- For Wednesday, 22 February
- David Pollard, "Asymptotics via Empirical Processes", Statistical
Science 4 (1989): 341--354
[PDF, 2.2 MB]
- Note that Pollard's 1984 book, to which this paper makes some references,
is available
online.
- For Wednesday, 12 April (optional)
- E. B. Dynkin, "Sufficient Statistics and Extreme Points", The Annals
of Probability 6 (1978): 705--730
[PDF, 2 MB]
- Massimiliano Badino, "An Application of Information Theory to the Problem
of the Scientific Experiment", Synthese 140
(2004): 355--389
[phil-sci/1830]
Homework Assignments
Due in my mailbox at 5pm on the date stated, unless otherwise noted.
- Exercise 1.1 and Exercise 3.1 from the notes. Exercise 3.2 is not
required. (27 January) Solutions
- Exercises 5.3, 6.1 and 6.2. (6 February) Solutions
- Exercises 10.1 and 10.2. (20 February) Solutions
- Exercises 16.1, 16.2 and 16.4 (13 March)
Lecture Notes in PDF
- All notes to date
- Table of contents, giving a running listing of
definitions, theorems, etc.
- Lecture 1 (16 January). Definition of
stochastic processes, examples, random functions.
- Lecture 2 (18 January).
Finite-dimensional distributions (FDDs) of a process, consistency of a family
of FDDs, theorems of Daniell and Kolmogorov on extending consistent families to
processes.
- Lecture 3 (20 January). Probability kernels and regular conditional probabilities, extending
finite-dimensional distributions defined recursively through kernels to processes (the Ionescu Tulcea theorem).
- Lecture 4 (23 January). One-parameter
processes of various sorts; shift-operator representations of one-parameter
processes.
- Lecture 5 (25 January). Three
kinds of stationarity, the relationship between strong stationarity and
measure-preserving transformations (especially shifts).
- Lecture 6 (27 January). Reminders about
filtrations and optional times, definitions of various sorts of waiting times,
and Kac's Recurrence Theorem.
- Lecture 7 (30 January). Kinds of
continuity, versions of stochastic processes, difficulties of continuity,
the notion of a separable random function.
- Lecture 8 (1 February). Existence
of separable modifications of stochastic processes, conditions for the
existence of measurable, cadlag and continuous modifications.
- Lecture 9 (3 February). Markov
processes and their transition-probability semi-groups.
- Lecture 10 (6 February). Markov
processes as transformations of IID noise; Markov processes as operators
on function spaces.
- Lecture 11 (8 February). Examples
of Markov processes (Wiener process and the logistic map). (A simulation
sketch of the logistic map appears after this list.)
- Lecture 12 (13 February). Generators
of homogeneous Markov processes, analogy with exponential functions.
- Lecture 13 (15 February). The
strong Markov property and the martingale problem.
- Lecture 14 (17, 20 February). Feller
processes, and an example of a Markov process which is not strongly Markovian.
- Lecture 15 (24 February, 1 March). Convergence in
distribution of cadlag processes, convergence of Feller processes,
approximation of differential equations by Markov processes.
- Lecture 16 (3 March). Convergence
of random walks, functional central limit theorem. (A simulation sketch of
this convergence appears after this list.)
- Lecture 17 (6 March). Diffusions,
Wiener measure, non-differentiability of almost all continuous curves.
- Lecture 18 (8 March). Stochastic
integrals: heuristic approach via Euler's method, rigorous approach.
- Lecture 19 (20, 21, 22 and 24
March). Examples of stochastic integrals. Ito's formula for change of
variables. Stochastic differential equations, existence and uniqueness of
solutions. Physical Brownian motion: the Langevin equation, Ornstein-Uhlenbeck
processes. (An Euler-Maruyama sketch for the Langevin equation appears after
this list.)
- Lecture 20 (27 March). More on SDEs:
diffusions, forward (Fokker-Planck) and backward equations. White noise.
- Lecture 21 (29, 31 March). Spectral
analysis; how the white noise lost its color. Mean-square ergodicity.
- Lecture 22 (3 April). Small-noise limits
for SDEs: convergence in probability to ODEs, and our first large-deviations
calculations.
- Lecture 23 (5 April). Introduction to
ergodic properties and invariance.
- Lecture 24 (7 April). The almost-sure
(Birkhoff) ergodic theorem.
- Lecture 25 (10 April). Metric
transitivity. Examples of ergodic processes. Preliminaries on ergodic
decompositions.
- Lecture 26 (12 April). Ergodic
decompositions. Ergodic components as minimal sufficient statistics.
- Lecture 27 (14 April). Mixing.
Weak convergence of distribution and decay of correlations. Central
limit theorem for strongly mixing sequences.
- Lecture 28 (17 April). Introduction to
information theory. Relations between Shannon entropy, relative
entropy/Kullback-Leibler divergence, expected likelihood and Fisher
information.
- Lecture 29 (24 April). Entropy rate.
The asymptotic equipartition property, a.k.a. the Shannon-McMillan-Breiman
theorem, a.k.a. the entropy ergodic theorem. Asymptotic likelihoods.
- Lecture 30 (26 April). General theory of
large deviations. Large deviations principles and rate functions; Varadhan's
Lemma. Breeding LDPs: contraction principle, "exponential tilting", Bryc's
Theorem, projective limits.
- Lecture 31 (28 April). IID large
deviations: cumulant generating functions, the Legendre transform, the return
of relative entropy. Cramér's theorem on large deviations of empirical
means. Sanov's theorem on large deviations of empirical measures.
Process-level large deviations.
- Lecture 32 (1 May). Large deviations for
Markov sequences through exponential-family densities.
- Lecture 33 (2 May). Large deviations in hypothesis testing and parameter
estimation.
- Lecture 34 (3 May). Large deviations for
weakly-dependent sequences (the Gärtner-Ellis theorem).
- Lecture 35 (5 May). Large deviations of
stochastic differential equations in the small-noise limit (Freidlin-Wentzell
theory).
- References. The bibliography.
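Simulation Sketches
The short Python sketches below are illustrations added for concreteness; they
are not part of the lecture notes, and every parameter value and random seed in
them is an arbitrary choice. First, the logistic map of Lecture 11: a
deterministic map applied to a random initial condition is a perfectly good
Markov process, and (floating-point round-off aside) its long-run time averages
reflect its invariant distribution.

    # The logistic map x_{t+1} = 4*x_t*(1 - x_t), started from a random
    # initial condition, is a (deterministic) Markov process. Its invariant
    # distribution is Beta(1/2, 1/2), with density 1/(pi*sqrt(x*(1-x))).
    import numpy as np

    rng = np.random.default_rng(1)  # arbitrary seed

    def logistic_orbit(n_steps, rng):
        """Iterate the logistic map from a Uniform(0,1) initial condition."""
        x = np.empty(n_steps)
        x[0] = rng.uniform()
        for t in range(n_steps - 1):
            x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
        return x

    # Time averages along the orbit should match expectations under the
    # invariant Beta(1/2, 1/2) distribution, whose mean is 1/2.
    orbit = logistic_orbit(100000, rng)
    print(round(orbit.mean(), 3))  # should be near 0.5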
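Next, the functional central limit theorem of Lecture 16: a simple random
walk, rescaled in time by n and in space by sqrt(n), looks more and more like
a standard Wiener process as n grows.

    # Rescaled simple random walks, illustrating the functional CLT: with
    # IID steps of mean 0 and variance 1, the path t |-> S_{[nt]}/sqrt(n)
    # converges in distribution to a standard Wiener process.
    import numpy as np

    rng = np.random.default_rng(36754)  # arbitrary seed

    def rescaled_walk(n, rng):
        """Sample the path t |-> S_{[nt]}/sqrt(n) at t = 0, 1/n, ..., 1."""
        steps = rng.choice([-1.0, 1.0], size=n)  # +/-1 coin-flip steps
        return np.concatenate(([0.0], np.cumsum(steps))) / np.sqrt(n)

    # The finite-dimensional distributions approach those of the Wiener
    # process; e.g., the endpoint at t = 1 should be approximately N(0, 1).
    for n in (10, 100, 10000):
        endpoints = np.array([rescaled_walk(n, rng)[-1] for _ in range(2000)])
        print(n, round(endpoints.mean(), 3), round(endpoints.var(), 3))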
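Finally, the Langevin equation of Lecture 19, dX = -gamma X dt + sigma dW,
approximated by the Euler scheme of Lecture 18 (in this setting usually called
Euler-Maruyama); its solution is the Ornstein-Uhlenbeck process.

    # Euler-Maruyama for the Langevin equation dX = -gamma*X dt + sigma*dW,
    # whose solution is the Ornstein-Uhlenbeck process. Each Wiener
    # increment over a time step dt is Gaussian with mean 0 and variance dt.
    import numpy as np

    rng = np.random.default_rng(0)  # arbitrary seed

    def euler_maruyama(x0, gamma, sigma, dt, n_steps, rng):
        """Approximate an OU path with the Euler (Euler-Maruyama) scheme."""
        x = np.empty(n_steps + 1)
        x[0] = x0
        for i in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment ~ N(0, dt)
            x[i + 1] = x[i] - gamma * x[i] * dt + sigma * dW
        return x

    # The OU process has stationary variance sigma^2/(2*gamma); with the
    # (assumed) values below that is 0.5, which the long-run sample variance
    # of the simulated path should roughly reproduce.
    path = euler_maruyama(x0=2.0, gamma=1.0, sigma=1.0, dt=1e-3,
                          n_steps=200000, rng=rng)
    print(round(path[100000:].var(), 3))  # should be near 0.5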
RSS Feed
If you want an RSS feed for the notes, you should find
one here.