Probability in high dimensions. Math 709, Fall 2012
Office: 4844 East Hall
E-mail: romanv "at" umich "dot" edu
Class meets: T, Th 11:40 am - 1:00 pm in 3088 East Hall.
Office hours: T, Th 2:00 - 3:30 pm in 4844 East Hall.
Prerequisites. Familiarity with basic probability theory (as covered e.g. in Math/Stats 525) and basic functional analysis (an elementary knowledge of Hilbert and normed spaces and linear operators).
This is a course on modern problems and methods in high-dimensional probability theory and its applications.
The course will consist of three connected parts:
1) concentration of measure,
2) non-asymptotic analysis of random matrices, and
3) stochastic processes and geometric probability in high dimensions.
Emphasis will be placed on applications across statistics (covariance estimation), signal processing (compressed sensing),
information theory (dimension reduction), and computer science (low-rank approximations of matrices and graph sparsification).
1. The first part of this course will explore the concentration of measure phenomenon. We will start with the classical deviation inequalities for sums of independent random variables and proceed to concentration of Lipschitz functions on the sphere, the hypercube, and some discrete spaces. This is the basis for the development of dimension reduction methods (starting from the Johnson-Lindenstrauss Lemma).
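As a concrete reference point (a standard statement, not a quote from the lecture notes), the most classical of these deviation inequalities is Hoeffding's: for independent bounded random variables, the sum concentrates at a Gaussian rate:

```latex
% Hoeffding's inequality: X_1, ..., X_N independent with X_i in [a_i, b_i]
\[
\mathbb{P}\left( \Big| \sum_{i=1}^N (X_i - \mathbb{E} X_i) \Big| \ge t \right)
\;\le\; 2 \exp\left( - \frac{2 t^2}{\sum_{i=1}^N (b_i - a_i)^2} \right),
\qquad t \ge 0.
\]
```

Concentration of Lipschitz functions on the sphere extends this phenomenon from linear functions of independent variables to arbitrary Lipschitz functions.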
2. The second part will develop a non-asymptotic analysis of random matrices. This will start with a modern treatment of sums of independent random matrices. We will proceed with the spectral analysis of random matrices with independent rows and columns,
focusing on their extreme singular values. We will study analytic methods like Stieltjes transform, geometric methods including covering arguments, and probabilistic tools involving Gaussian processes and decoupling. These methods will be used for the development
of compressed sensing, the modern paradigm in signal processing that allows one to recover structured signals from few measurements. We will also discuss applications to random sampling from matrices (i.e., estimating a matrix from a sample of its entries) and to matrix and graph sparsification.
3. The last part of the course will develop probabilistic methods for geometric problems in high dimension; this will cover some basics of geometric functional analysis. The focus will be on the geometry of high dimensional convex sets, either random or deterministic, and their random sections and projections. We will discuss John's decompositions, Kolmogorov and Gelfand widths, volume ratio and Kashin's splittings, Dvoretzky-Milman theorems on Euclidean sections, and metric entropy. Along the way, we will develop more methods for stochastic processes (Sudakov and Dudley inequalities). These developments provide intuition and techniques for dimension reduction and compressed sensing.
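For orientation, the two entropy bounds mentioned here can be stated as follows (standard forms): for a Gaussian process $(X_t)_{t \in T}$ with canonical metric $d(s,t) = (\mathbb{E}|X_s - X_t|^2)^{1/2}$, Sudakov's and Dudley's inequalities bound the expected supremum from below and above via covering numbers:

```latex
% Sudakov's minoration and Dudley's entropy integral
\[
c \, \sup_{\varepsilon > 0} \, \varepsilon \sqrt{\log \mathcal{N}(T, d, \varepsilon)}
\;\le\; \mathbb{E} \sup_{t \in T} X_t
\;\le\; C \int_0^\infty \sqrt{\log \mathcal{N}(T, d, \varepsilon)} \, d\varepsilon,
\]
```

where $\mathcal{N}(T, d, \varepsilon)$ denotes the covering number of $T$ at scale $\varepsilon$ and $c, C > 0$ are absolute constants.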
Textbook. There is no required textbook. The following references cover some of the material, and they are available online:
- R. Vershynin, Introduction to the non-asymptotic analysis of random matrices. In: Compressed Sensing: Theory and Applications, eds. Yonina Eldar and Gitta Kutyniok. Cambridge University Press, to appear.
- R. Vershynin, A short course on the non-asymptotic analysis of random matrices.
- T. Tao, Topics in random matrix theory.
- D. Chafai, O. Guedon, G. Lecue, A. Pajor, Interactions between compressed sensing, random matrices and high dimensional geometry, preprint.
- R. Vershynin, Lectures in geometric functional analysis.
Grading. The base grade will be B if you attend a majority of the classes.
It will be adjusted upwards according to the quality and quantity of your homework.
There will be no exams.
Homework will be assigned from the lecture notes (see below). It is divided into three sets,
which are due as follows:
- Homework Set 1: due in class on Thursday, October 18
- Homework Set 2: due November 20, in class (last class before Thanksgiving)
- Homework Set 3: due by December 17, 9:00 am. Put it in my department mailbox, or in the bin attached to my door (3064 EH)
- Lecture 1, 09/04.
Homework on p. 2 (rate of convergence in CLT).
- Lecture 2, 09/06.
Homework on p. 6 (prove general Hoeffding's inequality),
p. 8 (necessity of centering for the sub-gaussian growth of MGF),
p. 9 (optional: equivalence of the MGF property to other sub-gaussian properties).
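A quick numerical sanity check of Hoeffding's inequality (a sketch in Python, not part of the lecture notes): for a sum of N independent Rademacher signs, the one-sided tail bound P(sum >= t) <= exp(-t^2 / 2N) should dominate the empirical tail.

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 100, 200_000

# sums of N independent Rademacher (+/-1) signs, sampled via the binomial law
sums = 2 * rng.binomial(N, 0.5, size=trials) - N

for t in [10, 20, 30]:
    empirical = (sums >= t).mean()
    bound = np.exp(-t**2 / (2 * N))   # one-sided Hoeffding tail bound
    assert empirical <= bound
    print(f"t={t}: empirical tail {empirical:.4f} <= bound {bound:.4f}")
```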
- Lecture 3, 09/11.
Rotation invariance of sub-gaussian distributions.
Hoeffding's and Khinchine's inequalities for them.
Homework on p.14 (optional: centering of sub-gaussian random variables).
- Lecture 4, 09/13.
Sub-exponential random variables. Bernstein's inequality for them.
Homework on p.15 (equivalence of sub-exponential properties),
p. 16 (the square of sub-gaussian is sub-exponential).
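For reference, Bernstein's inequality in the sub-exponential setting (standard form): if $X_1, \dots, X_N$ are independent mean-zero sub-exponential random variables with $K = \max_i \|X_i\|_{\psi_1}$, then for an absolute constant $c > 0$,

```latex
\[
\mathbb{P}\left( \Big| \sum_{i=1}^N X_i \Big| \ge t \right)
\;\le\; 2 \exp\left( - c \min\left( \frac{t^2}{K^2 N}, \; \frac{t}{K} \right) \right),
\qquad t \ge 0.
\]
```

The mixed Gaussian/exponential tail reflects the two regimes: small deviations behave as for sub-gaussians, large deviations are governed by the heaviest single summand.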
- Lecture 5, 09/18.
Heavy-tailed random variables. Bounding their sums via order statistics.
Concentration on the sphere: equatorial bands and spherical caps.
Homework on p. 22 (prove the theorem for heavy-tailed random variables; optional: remove the logarithmic factor from it);
p. 23 (estimate the measure of the equatorial band for all epsilon; it's OK if the constants 2 change to other absolute constants).
- Lecture 6, 09/20.
Isoperimetric inequalities. Concentration (geometric and functional) on the sphere and in Gauss space.
Homework on p. 26 (norms as Lipschitz functions), p.29 (prove Gaussian concentration following the argument for the sphere).
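The Gaussian concentration inequality referenced here, in its functional form (standard statement): if $g \sim N(0, I_n)$ and $f : \mathbb{R}^n \to \mathbb{R}$ is Lipschitz, then

```latex
\[
\mathbb{P}\left( |f(g) - \mathbb{E} f(g)| \ge t \right)
\;\le\; 2 \exp\left( - \frac{t^2}{2 \|f\|_{\mathrm{Lip}}^2} \right),
\qquad t \ge 0.
\]
```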
- Lecture 7, 09/25.
Some equivalent forms of concentration. Concentration for norms of random vectors, and for their projections.
Homework on p. 31 (replacement of mean by moments in concentration inequalities),
p.32 (concentration of norms of sub-gaussian vectors).
- Lecture 8, 09/27.
Remarks on Johnson-Lindenstrauss Lemma. Survey of concentration on metric measure spaces.
Homework on p. 36 (Johnson-Lindenstrauss Lemma with Gaussian and/or sub-gaussian random matrices).
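An illustration of the Johnson-Lindenstrauss Lemma with a Gaussian random matrix (a sketch; the dimensions are chosen for the demo, not taken from the notes): projecting 20 points from R^1000 into R^300 preserves all pairwise distances up to a small multiplicative error.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, num_pts = 1000, 300, 20

X = rng.standard_normal((num_pts, n))          # points in R^n
A = rng.standard_normal((k, n)) / np.sqrt(k)   # Gaussian JL map; E|Ax|^2 = |x|^2

# ratio of projected to original distance for every pair of points
ratios = []
for i in range(num_pts):
    for j in range(i + 1, num_pts):
        d_orig = np.linalg.norm(X[i] - X[j])
        d_proj = np.linalg.norm(A @ (X[i] - X[j]))
        ratios.append(d_proj / d_orig)

# with k = 300, all pairwise distances should be preserved within ~20%
distortion = max(abs(r - 1) for r in ratios)
assert distortion < 0.2
print(f"worst-case relative distortion: {distortion:.3f}")
```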
- Lecture 9, 10/02.
High-dimensional distributions. Isotropic and sub-gaussian random vectors.
Homework on p.41 (equivalent definition of isotropy),
p. 42 (existence of exponentially many pairwise almost orthogonal vectors),
p. 43 (importance of controlling all marginals in the definition of sub-gaussian random vectors).
- Lecture 10, 10/04.
Uniform distribution on a convex body - subgaussian directions, concentration of volume, central limit theorem (a survey).
Matrix Bernstein inequality.
Homework on p.50 (justify factoring out a norm from the trace),
p.51 (expectation of the exponential function of a random matrix).
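The matrix Bernstein inequality in its common form (standard statement): for independent mean-zero $n \times n$ symmetric random matrices $X_i$ with $\|X_i\| \le K$ almost surely and $\sigma^2 = \big\| \sum_i \mathbb{E} X_i^2 \big\|$,

```latex
\[
\mathbb{P}\left( \Big\| \sum_{i=1}^N X_i \Big\| \ge t \right)
\;\le\; 2n \exp\left( - \frac{t^2 / 2}{\sigma^2 + K t / 3} \right),
\qquad t \ge 0.
\]
```

The dimensional factor $2n$ is the price of controlling all eigenvalues simultaneously.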
- Lecture 11, 10/09.
Application of matrix Bernstein inequality for covariance estimation.
Homework on p.55 (variance of a sum of independent random vectors),
p.56 (covariance estimation in Frobenius norm).
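A numerical check of the covariance-estimation rate (a sketch under Gaussian assumptions; the parameters are illustrative): for N samples of an isotropic distribution in R^d, the sample covariance matrix should approximate the identity in operator norm with error on the order of sqrt(d/N).

```python
import numpy as np

rng = np.random.default_rng(2)
d, N = 20, 20_000

# N samples of an isotropic Gaussian vector in R^d (true covariance = I_d)
X = rng.standard_normal((N, d))
Sigma_hat = X.T @ X / N   # sample covariance

# operator-norm error should be on the order of sqrt(d/N) ~ 0.03 here
err = np.linalg.norm(Sigma_hat - np.eye(d), ord=2)
assert err < 0.15
print(f"operator-norm error: {err:.4f}, sqrt(d/N) = {np.sqrt(d / N):.4f}")
```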
- Lecture 12, 10/11.
Structured covariance estimation (low rank, sparse covariance, sparse concentration).
Homework on p.60 (covariance estimation for non-isotropic distributions).
- Lecture 13, 10/18.
Non-commutative Khinchine inequality. Wigner's semicircular law (begin).
Homework on p.67 (optimality of log factor in NCKI for the operator norm).
- Lecture 14, 10/23.
Stieltjes transform. Proof of Wigner semicircular law (somewhat non-rigorous).
Homework on p. 73 (the expected value of a quadratic form).
- Lecture 15, 10/25.
Marchenko-Pastur and circular laws (without proof). Extreme singular values.
Gordon's bound for Gaussian matrices. Proof based on stochastic processes, Slepian's inequality.
Homework on p.77 (extreme singular values as best factors in a two-sided inequality),
p.80 (distance between rank-one matrices).
- Lecture 16, 10/30.
Concentration of singular values of Gaussian matrices. Bai-Yin law (without proof). Near isometries.
Homework on p.83 (necessity of the fourth moment in Bai-Yin law).
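The Gordon/Bai-Yin predictions are easy to check numerically (a sketch; the sizes are illustrative): for an N x n Gaussian matrix, the extreme singular values land near sqrt(N) + sqrt(n) and sqrt(N) - sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(3)
N, n = 4000, 100   # tall N x n matrix with i.i.d. standard normal entries

A = rng.standard_normal((N, n))
s = np.linalg.svd(A, compute_uv=False)   # singular values, descending

s_max_pred = np.sqrt(N) + np.sqrt(n)     # ~ 73.2
s_min_pred = np.sqrt(N) - np.sqrt(n)     # ~ 53.2
assert abs(s[0] - s_max_pred) < 3.0
assert abs(s[-1] - s_min_pred) < 3.0
print(f"s_max = {s[0]:.2f} (predicted {s_max_pred:.2f}), "
      f"s_min = {s[-1]:.2f} (predicted {s_min_pred:.2f})")
```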
- Lecture 17, 11/01.
Nets. Extreme singular values of random matrices with independent sub-gaussian rows.
Homework on p.87 (the covering numbers of the sphere are exponential in n-1; monotonicity property of covering numbers).
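For reference, the volumetric bound behind this homework (standard): for $0 < \varepsilon \le 1$,

```latex
\[
\mathcal{N}(S^{n-1}, \varepsilon) \;\le\; \left( 1 + \frac{2}{\varepsilon} \right)^n,
\]
```

obtained by comparing the total volume of disjoint $\varepsilon/2$-balls centered at a maximal $\varepsilon$-separated subset with the volume of the ball that contains them all; a matching lower bound exponential in $n-1$ follows from a similar packing argument on the sphere.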
- Lecture 18, 11/06.
Decoupling. Extreme singular values of random matrices with independent sub-gaussian columns.
- Lecture 19, 11/08.
Spectral sparsification (after Batson, Spielman, Srivastava).
- Lecture 20, 11/13.
Sparsification of graphs. Expanders. Homework on p. 104 (random graphs are expanders).
- Lecture 21, 11/15.
Dvoretzky-Milman's theorem. Geometric consequences (sections and projections of convex sets).
Homework on p. 107 (extension from an epsilon-net to the sphere).
- Lecture 22, 11/20.
Examples for Dvoretzky-Milman's theorem. Diameters of random projections of a set.
Homework on p. 112 (expectation of the sup-norm of a Gaussian random vector),
p. 113 (Dvoretzky dimension of ell-p), p. 116 (random compressions of matrices).
This is the last homework to be included in Set 3.
- Lecture 23, 11/27.
Sudakov's inequality. Sections of small codimension (a.k.a. Low M* estimate).
- Lecture 24, 11/29.
Sections of L1 ball (Kashin, Garnaev-Gluskin's theorems).
Compressed sensing. Reconstruction of signals via sections of small codimension (Low M*).
- Lecture 25, 12/04.
Sparse signal sets. Sparse recovery. Recovery by projecting onto the signal set.
- Lecture 26, 12/06.
Compressed sensing via projecting onto the signal set.
Restricted isometry property. Exact sparse recovery.
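As an illustrative companion to exact sparse recovery (a sketch: orthogonal matching pursuit is used here as a simple stand-in and is not necessarily the recovery method of the lecture notes): with m on the order of s log n Gaussian measurements, an s-sparse signal is typically recovered exactly from noiseless data.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, s = 200, 60, 5   # ambient dimension, measurements, sparsity

# s-sparse signal with entries bounded away from zero (illustrative choice)
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.uniform(1.0, 2.0, size=s) * rng.choice([-1.0, 1.0], size=s)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x                                      # m noiseless measurements

# orthogonal matching pursuit: greedily pick the column most correlated
# with the residual, then refit by least squares on the chosen support
S, r = [], y.copy()
for _ in range(s):
    S.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    r = y - A[:, S] @ coef

x_hat = np.zeros(n)
x_hat[S] = coef
assert set(S) == set(support.tolist())
assert np.linalg.norm(x_hat - x) < 1e-6
print("recovered support:", sorted(S))
```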
- Lecture 27, 12/11.
Random matrices satisfying the restricted isometry property. Final remarks.