Roman Vershynin | Research interests


My main area of interest is high dimensional probability. I study random geometric structures that have diverse connections to other areas of mathematics and data science, including random matrix theory, geometric functional analysis, convex and discrete geometry, geometric combinatorics, high dimensional statistics, information theory, learning theory, signal processing, theoretical computer science and numerical analysis. My theoretical interests in high dimensional probability are centered around non-asymptotic random matrix theory, concentration inequalities, and geometric functional analysis. My application-inspired interests are centered on big data. I am working on developing mathematical foundations of compressed sensing, high dimensional estimation, and analysis of complex networks.

Here is a sample of non-technical presentations reflecting some of my recent research interests: a presentation on random matrices and a slightly more technical presentation on dimension reduction.

Here is a glimpse into a couple of these areas.

Geometric functional analysis

Geometric functional analysis strives to understand and use high dimensional structures in mathematics. High dimensions often have a strong regularizing effect and help us see the overall picture. While this may sound counterintuitive, it closely parallels the methodology of probability theory. Taking more independent observations, we increase the dimension of the (product) probability space. As the dimension grows to infinity, classical limit theorems such as the central limit theorem begin to manifest themselves. Thus the picture often becomes simpler in high dimensions than in low ones.
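
To make the analogy concrete, recall the classical central limit theorem for a sum of n independent, identically distributed observations with mean μ and finite variance σ²:

\[
\frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}} \;\xrightarrow{d}\; N(0,1) \qquad \text{as } n \to \infty.
\]

The universal Gaussian limit emerges precisely because the dimension n of the product space grows.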

Techniques of geometric functional analysis are useful for exploring, building, and using various high-dimensional structures such as Banach spaces, convex sets, matrices, signals, and other massive data sets.

Random matrix theory

At the heart of random matrix theory lies the realization that the spectrum of a random matrix H tends to stabilize as the dimensions of H grow to infinity. This phenomenon is captured by the limit laws of random matrix theory, in particular by Wigner's semicircle law, Girko's circular law, and the Marchenko–Pastur law. One can think of these laws as relatives of the central limit theorem, although the way the random entries of H determine its spectrum is more complicated than the sums of entries studied in the central limit theorem.
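
For example, Wigner's semicircle law says, roughly, that if H is an n × n symmetric random matrix whose entries on and above the diagonal are independent with mean zero and unit variance, then the empirical distribution of the eigenvalues of H/√n converges, as n grows to infinity, to the semicircle density

\[
\rho(x) = \frac{1}{2\pi}\sqrt{4 - x^2}, \qquad x \in [-2, 2].
\]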

The limit laws offer us a clear global and asymptotic picture of the spectrum of H. In the last few years, considerable progress has been made on the more difficult local and non-asymptotic regimes. In the non-asymptotic regime, the dimensions of H are fixed rather than growing to infinity. In the local regime, one zooms in on a small part of the spectrum of H, ideally until one sees individual eigenvalues. As an important example, suppose one zooms in on zero. The location of the eigenvalue nearest zero determines the invertibility properties of H, i.e., the probability that a random matrix H is non-singular and the typical value of the spectral norm of the inverse of H. The invertibility properties in turn determine whether the matrix H is well conditioned, which is a matter of importance in numerical analysis.
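
Quantitatively, these invertibility properties are controlled by the extreme singular values of H: writing s_max(H) and s_min(H) for the largest and smallest singular values, one has

\[
\|H^{-1}\| = \frac{1}{s_{\min}(H)}, \qquad \kappa(H) = \frac{s_{\max}(H)}{s_{\min}(H)},
\]

so zooming in on the spectrum near zero amounts to bounding s_min(H) away from zero.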

Examples of recent developments include proofs that a random matrix H with independent entries (whether symmetric or not) is singular with exponentially small probability, that the condition number of H is linear in the dimension, and that the eigenstructure of H is fully delocalized and unstructured: the eigenvectors are spread out and their coefficients are highly incommensurate.
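
Schematically, and under suitable assumptions on the entries (say, independent and subgaussian), these results take the following form, though the precise statements and hypotheses vary across the literature:

\[
\mathbb{P}\{H \text{ is singular}\} \le e^{-cn}, \qquad \kappa(H) = O(n), \qquad \|v\|_\infty \lesssim \sqrt{\frac{\log n}{n}}\,\|v\|_2,
\]

where the second and third statements hold with high probability, v ranges over the eigenvectors of H, and c > 0 and the implicit constants are absolute.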