**Course:** MAT-280-2, CRN 50931

**Instructor:** Roman Vershynin

**Meeting times:** TR 3:10-4:25

Much of random matrix theory revolves around the limiting properties of the spectrum of a random N x N matrix A as the dimension N increases to infinity. A remarkable example of this approach is Wigner's semicircle law, which computes how many eigenvalues of A fall in a given interval as N -> infinity.
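As an illustration (not part of the course notes), the semicircle law is easy to check numerically: for a symmetric matrix with i.i.d. mean-zero, unit-variance entries, the eigenvalues of A / sqrt(N) approximately follow the density (1/(2*pi)) * sqrt(4 - x^2) on [-2, 2] once N is large. The matrix size and test interval below are arbitrary choices for the demo.

```python
import numpy as np

# Sketch: empirical check of Wigner's semicircle law.
rng = np.random.default_rng(0)
N = 1000
G = rng.standard_normal((N, N))
A = (G + G.T) / np.sqrt(2)              # symmetric, off-diagonal entries ~ N(0, 1)
eigs = np.linalg.eigvalsh(A / np.sqrt(N))

# Fraction of eigenvalues in [-1, 1] versus the semicircle prediction.
empirical = np.mean((eigs >= -1) & (eigs <= 1))
xs = np.linspace(-1, 1, 10001)
predicted = np.trapz(np.sqrt(4 - xs**2) / (2 * np.pi), xs)
print(empirical, predicted)             # both close to 0.61
```

For N = 1000 the empirical fraction already matches the limiting prediction to about two decimal places, which is exactly the kind of finite-N behavior the course aims to quantify.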

However, many applications require understanding what happens for a fixed N rather than in the limit. For instance, in numerical analysis, one quantizes (rounds off) real numbers when storing them in a computer. Quantization is usually modeled as a slight random perturbation. The stability of a system of linear equations Ax = b under quantization depends on the condition number of the random matrix A, the ratio of the largest to the smallest singular value of A. Thus one needs to understand the spectrum of random matrices in finite dimensions N (not only in the limit). Such non-asymptotic random matrix theory will be the content of this course.
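A minimal sketch of this stability question (an illustration, not the course's method; the matrix size and perturbation scale are arbitrary): the condition number kappa(A) = s_max(A) / s_min(A) bounds how a small error db in the right-hand side propagates to the solution of Ax = b via the classical inequality ||dx|| / ||x|| <= kappa(A) * ||db|| / ||b||.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
A = rng.standard_normal((N, N))         # random Gaussian matrix

s = np.linalg.svd(A, compute_uv=False)  # singular values, in decreasing order
kappa = s[0] / s[-1]                    # condition number s_max / s_min

# Perturb b slightly (a stand-in for quantization error) and compare solutions.
x = rng.standard_normal(N)
b = A @ x
db = 1e-8 * rng.standard_normal(N)
x_pert = np.linalg.solve(A, b + db)

rel_err = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
bound = kappa * np.linalg.norm(db) / np.linalg.norm(b)
print(rel_err <= bound)                 # the classical bound holds
```

Bounding s_min(A) away from zero for a random A, with high probability and for fixed N, is precisely the invertibility problem treated in the later lectures.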

The course will emphasize "soft" non-asymptotic techniques, which may be useful for other problems, rather than "hard" results. These techniques include concentration inequalities, martingale inequalities, and various methods of asymptotic convex geometry.

**Attention: this new tutorial partially supersedes the old lecture notes below; see the publications page. However, geometric topics (Lectures 12-14) are not covered in the new tutorial. They will be included in the Lecture notes in geometric functional analysis, which are being written.**

These notes were taken by students, and they have not been edited.

Lecture 1. Background, Techniques, Methods.

Lecture 2. Concentration of Measure.

Lecture 3. Concentration of Measure (cont'd).

Lecture 4. Dimension Reduction.

Lecture 5. Subgaussian Random Variables.

Lecture 6. Norm of a Random Matrix.

Lecture 7. Largest and Smallest Singular Values of Random Rectangular Matrices.

Lecture 8. Dudley's Integral Inequality.

Lecture 9. Applications of Dudley's Inequality -- Sharper Bounds for Random Matrices.

Lecture 10. Slepian's Inequality -- Sharpness of Bounds for Gaussian Matrices.

Lecture 11. Gordon's Inequality.

Lecture 12. Sudakov's Minoration.

Lecture 13. Sections of Convex Sets via Entropy and Volume.

Lecture 14. Sections of Convex Sets via Entropy and Volume (cont'd).

Lecture 15. Invertibility of Square Gaussian Matrices, Sparse Vectors.

Lecture 16. Invertibility of Gaussian Matrices and Compressible/Incompressible Vectors.

Lecture 17. Invertibility of Subgaussian Matrices -- Small Ball Probability via the Central Limit Theorem.

Lecture 18. Strong Invertibility of Subgaussian Matrices and Small Ball Probability via Arithmetic Progressions.

Lecture 19. Small Ball Probability via Sum-Sets.

Lecture 20. The Recurrence Set (Ergodic Approach).