Probability in high dimensions. Math 709, Fall 2012

Instructor: Roman Vershynin
Office: 4844 East Hall
E-mail: romanv "at" umich "dot" edu

Class meets: T, Th 11:40 am - 1:00 pm in 3088 East Hall.

Office hours: T, Th 2:00 - 3:30 pm in 4844 East Hall.

Prerequisites: Familiarity with basic probability theory (as covered e.g. in Math/Stats 525) and basic functional analysis (an elementary knowledge of Hilbert and normed spaces and linear operators).

Description

This is a course on modern problems and methods in high-dimensional probability theory and its applications. The course will consist of three connected parts: 1) concentration of measure, 2) non-asymptotic analysis of random matrices, and 3) stochastic processes and geometric probability in high dimensions. Emphasis will be given to applications across statistics (covariance estimation), signal processing (compressed sensing), information theory (dimension reduction), and computer science (low-rank approximations of matrices and graph sparsification).

1. The first part of this course will explore the concentration of measure phenomenon. We will start with the classical deviation inequalities for sums of independent random variables and proceed to concentration of Lipschitz functions on the sphere, hypercube, and some discrete spaces. This is the basis for the development of dimension reduction methods (starting with the Johnson-Lindenstrauss lemma).
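The Johnson-Lindenstrauss lemma mentioned above says that a random linear map into a space of moderate dimension nearly preserves all pairwise distances among a finite set of points. A minimal numerical sketch (not part of the course materials; the point set, dimensions, and Gaussian projection are illustrative choices, assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 30, 5000, 500  # n points in R^d, projected down to R^k

X = rng.normal(size=(n, d))               # an arbitrary point set (rows)
P = rng.normal(size=(d, k)) / np.sqrt(k)  # random Gaussian projection, scaled
Y = X @ P                                 # projected points in R^k

def pairwise_dists(A):
    # all Euclidean distances between rows of A
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

D0 = pairwise_dists(X)
D1 = pairwise_dists(Y)
i, j = np.triu_indices(n, 1)
ratios = D1[i, j] / D0[i, j]  # distortion of each pairwise distance

# for this k, the distortions concentrate near 1 (roughly within ~15%)
print(ratios.min(), ratios.max())
```

The concentration inequalities developed in this part of the course quantify exactly how large k must be (of order log n, up to the distortion tolerance) for such ratios to stay close to 1.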

2. The second part will develop a non-asymptotic analysis of random matrices. This will start with a modern treatment of sums of independent random matrices. We will proceed with the spectral analysis of random matrices with independent rows and columns, focusing on their extreme singular values. We will study analytic methods like Stieltjes transform, geometric methods including covering arguments, and probabilistic tools involving Gaussian processes and decoupling. These methods will be used for the development of compressed sensing, the modern paradigm in signal processing that allows one to recover structured signals from few measurements. We also discuss applications for random sampling from matrices (i.e. estimating a matrix from a sample of its entries) and for matrix and graph sparsification.
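A basic non-asymptotic fact of the kind studied in this part: for an N x n Gaussian matrix with N much larger than n, the extreme singular values concentrate near sqrt(N) +/- sqrt(n). A minimal numerical sketch (illustrative, not course material; dimensions and seed are arbitrary choices, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 2000, 200  # a tall random matrix: N >> n

A = rng.normal(size=(N, n))               # i.i.d. standard Gaussian entries
s = np.linalg.svd(A, compute_uv=False)    # singular values, largest first

# non-asymptotic bounds: s_max ~ sqrt(N) + sqrt(n), s_min ~ sqrt(N) - sqrt(n)
print(s.max(), np.sqrt(N) + np.sqrt(n))
print(s.min(), np.sqrt(N) - np.sqrt(n))
```

Bounds of this form, valid for fixed N and n rather than only in the limit, are what make such matrices usable as measurement ensembles in compressed sensing.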

3. The last part of the course will develop probabilistic methods for geometric problems in high dimension; this will cover some basics of geometric functional analysis. The focus will be on the geometry of high dimensional convex sets, either random or deterministic, and their random sections and projections. We will discuss John's decompositions, Kolmogorov and Gelfand widths, volume ratio and Kashin's splittings, Dvoretzky-Milman theorems on Euclidean sections, and metric entropy. Along the way, we will develop more methods for stochastic processes (Sudakov and Dudley inequalities). These developments provide intuition and techniques for dimension reduction and compressed sensing.

Textbook. There is no required textbook. The following references cover some of the material, and they are available online:

  1. R. Vershynin, Introduction to the non-asymptotic analysis of random matrices. In: Compressed Sensing: Theory and Applications, eds. Yonina Eldar and Gitta Kutyniok. Cambridge University Press, to appear.
  2. R. Vershynin, A short course on the non-asymptotic analysis of random matrices.
  3. T. Tao, Topics in random matrix theory.
  4. D. Chafaï, O. Guédon, G. Lecué, A. Pajor, Interactions between compressed sensing, random matrices and high dimensional geometry, preprint.
  5. R. Vershynin, Lectures in geometric functional analysis.

Grading: The base grade will be B if you attend a majority of the classes. It will be adjusted upwards according to the quality and quantity of your homework. There will be no exams.

Homework: Homework will be assigned from the lecture notes (see below). It will be divided into three sets with separate due dates.

Lecture Notes

Course webpage: http://www-personal.umich.edu/~romanv/teaching/2012-13/709/709.html