I have moved! Visit my current page here.
Hello there!
I am currently a fifth year doctoral student in mathematics at the University of Michigan. My advisers are
Roman Vershynin and
Anna Gilbert. I am on the job market this year. You may find my CV
here (last updated 12/12/17).
Research
I am broadly interested in the mathematics of data. I like to think about problems in high-dimensional probability, compressed sensing, mathematical optimization (especially in stochastic or non-convex settings), theoretical machine learning, signal processing, and randomized algorithms. Here are some specific projects that I am working on:
- Randomized Kaczmarz method for phase retrieval (with Roman Vershynin): Given Gaussian measurements, we recently proved that an easy adaptation of the randomized Kaczmarz method gives linear convergence w.h.p. when initialized in a basin of attraction of the solution. The proof relies on the fact that Gaussian measurements are delocalized in space, so generalizing this to other subgaussian measurement sets (such as Bernoulli measurements) is not so straightforward. Another interesting phenomenon is that the method seems to converge from arbitrary initializations. We are currently working on making this rigorous.
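To give a flavor of the method, here is a minimal NumPy sketch of a Kaczmarz-style iteration for phase retrieval, following the idea above: at each step a random measurement is chosen and the iterate is projected onto the hyperplane consistent with that measurement, using the current sign as a phase guess. All dimensions and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 50, 400                       # signal dimension, number of measurements
a = rng.standard_normal((m, n))      # Gaussian measurement vectors (rows)
x_true = rng.standard_normal(n)
b = np.abs(a @ x_true)               # phaseless (magnitude-only) measurements

# Initialize inside a small neighborhood of the solution (the "basin").
x = x_true + 0.1 * rng.standard_normal(n)

for _ in range(20 * m):
    i = rng.integers(m)              # pick a measurement uniformly at random
    ai = a[i]
    inner = ai @ x
    # Kaczmarz projection, treating sign(<a_i, x>) as the guessed phase.
    x = x + ((np.sign(inner) * b[i] - inner) / (ai @ ai)) * ai

# Phase retrieval recovers x only up to global sign.
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
```

With a good initialization, the sign guesses are correct for most measurements and the iteration behaves like ordinary randomized Kaczmarz on a linear system, which is where the linear convergence comes from.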
- Ensemble K-Subspaces (with John Lipor, David Hong, Dejiao Zhang and Laura Balzano): In a recent paper, Lipor, Hong, Zhang and Balzano introduced the Ensemble K-Subspaces algorithm. This algorithm works by forming a similarity matrix using the results of many independent runs of the base KSS algorithm. It seems to achieve state-of-the-art performance on many benchmark data sets. We are currently working on a tighter theoretical analysis of the algorithm.
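The ensemble idea above can be sketched in a few lines of NumPy: run the base K-Subspaces algorithm many times from random initializations, and record how often each pair of points is assigned to the same subspace. This is only an illustrative toy implementation, not the authors' code; the function names and parameters are my own.

```python
import numpy as np

rng = np.random.default_rng(1)

def kss(X, K, d, n_iter=10):
    """One run of K-Subspaces: alternate between fitting each cluster's
    d-dimensional subspace by PCA and reassigning points to the subspace
    that captures the most of their energy."""
    n_pts = X.shape[1]
    labels = rng.integers(K, size=n_pts)        # random initialization
    for _ in range(n_iter):
        bases = []
        for k in range(K):
            pts = X[:, labels == k]
            if pts.shape[1] < d:                # degenerate cluster: re-seed
                pts = X[:, rng.choice(n_pts, d, replace=False)]
            U, _, _ = np.linalg.svd(pts, full_matrices=False)
            bases.append(U[:, :d])
        energy = np.stack([np.linalg.norm(U.T @ X, axis=0) for U in bases])
        labels = energy.argmax(axis=0)
    return labels

def ekss_affinity(X, K, d, n_runs=30):
    """Average co-clustering indicator over many independent KSS runs."""
    n_pts = X.shape[1]
    A = np.zeros((n_pts, n_pts))
    for _ in range(n_runs):
        labels = kss(X, K, d)
        A += (labels[:, None] == labels[None, :])
    return A / n_runs

# Toy data: points drawn from two 2-dimensional subspaces of R^10.
D, d, K, per = 10, 2, 2, 40
U1 = np.linalg.qr(rng.standard_normal((D, d)))[0]
U2 = np.linalg.qr(rng.standard_normal((D, d)))[0]
X = np.hstack([U1 @ rng.standard_normal((d, per)),
               U2 @ rng.standard_normal((d, per))])
A = ekss_affinity(X, K, d)
```

The resulting affinity matrix can then be fed to any standard clustering routine (e.g. spectral clustering); averaging over runs smooths out the sensitivity of a single KSS run to its initialization.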
- Free Component Analysis (with Hao Wu and Raj Rao Nadakuditi): FCA is an adaptation of ICA that demixes a single observed instance of n linear combinations of n independent random matrices. It makes use of the additivity property of free cumulants. In recent work, Wu and Nadakuditi showed how this works in the limit where N, the dimension of the random matrices, tends to infinity. We are currently working on an analysis in the case of finite N.
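The additivity property underlying FCA can be seen numerically even at moderate N: for independent (asymptotically free) Wigner matrices, the free cumulants of a sum are the sums of the free cumulants. The snippet below checks this for the second free cumulant, which for centered elements equals the normalized second moment; it is a toy illustration of the principle, not an implementation of FCA itself.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000

def wigner(N):
    """A GOE-type Wigner matrix, normalized so its spectrum fills [-2, 2]."""
    G = rng.standard_normal((N, N)) / np.sqrt(N)
    return (G + G.T) / np.sqrt(2)

def m2(M):
    """Normalized second moment tr(M^2) / N."""
    return np.trace(M @ M) / N

A, B = wigner(N), wigner(N)

# Additivity of the second free cumulant: m2(A + B) ≈ m2(A) + m2(B),
# since the cross term tr(AB)/N vanishes as N grows for independent A, B.
lhs = m2(A + B)
rhs = m2(A) + m2(B)
```

Higher free cumulants are additive as well, and it is this additivity that lets FCA separate the mixed matrices, in the same way that classical cumulant additivity drives ICA.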
Publications
- Yan Shuo Tan, Sparse Phase Retrieval via Sparse PCA Despite Model Misspecification: A Simplified and Extended Analysis. (2017). Preprint.
- Yan Shuo Tan, Roman Vershynin, Phase Retrieval via Randomized Kaczmarz: Theoretical Guarantees. Information and Inference, in revision (positive reviews) (2017). Preprint.
- Yan Shuo Tan, Roman Vershynin, Polynomial Time and Sample Complexity for Non-Gaussian Component Analysis: Spectral Methods. In revision (2017). Preprint.
- Yan Shuo Tan, Energy Optimization for Distributions on the Sphere and Improvement to the Welch Bounds. Electronic Communications in Probability 22 (2017). Published paper.
Teaching
At the University of Michigan, I have taught Math 110 (Accelerated Precalculus, F2015), Math 115 (Calculus I, F2013, F2014, W2016), Math 116 (Calculus II, W2014, W2015), and have been a TA for Math 215 (Multivariable Calculus, S2014). I've also taught a mini-course on information theory to fellow graduate students. I was a TA for Roman Vershynin's
mini-course on probabilistic methods for data science at PCMI 2016. I'm currently the course co-coordinator for
Math 105 (Precalculus).
More information
I did my undergraduate studies at the University of Chicago, graduating in 2013. Prior to that, I served in the military for two years, and worked as a writer/editor for half a year. I am a native of Singapore.
Expository work
Here are some expository papers/notes I wrote over the years:
- On the Diameter of Cayley Graphs of Finite Groups (2011).
- A Skeleton in the Category: The Secret Theory of Covering Spaces (2012).
- Phase Transition in the Ising Model (2017).