
Special Events Seminar

Grad Thesis Defense: New Applications of Random Matrix Theory in Spin Glass and Machine Learning
Wednesday, April 24, 2019
2:00-4:00 PM
3088 East Hall
Recent advances in random matrix theory have proved useful for challenging problems in many disciplines of science and engineering. In the other direction, these applications motivate many new questions in random matrix theory. In this thesis, we present two applications of random matrix theory, one in statistical physics and one in machine learning.

The first part of this thesis concerns the spherical Sherrington-Kirkpatrick (SSK) model in statistical physics. The SSK model is defined by a random probability measure on a high-dimensional sphere. The probability measure involves the temperature and a random Hamiltonian. We consider the simplest non-trivial case, in which the Hamiltonian is a random symmetric quadratic form perturbed by a symmetric polynomial of degree one or two, and study the interaction between the quadratic form and the perturbations. In particular, using the natural connection between random quadratic forms and random matrices, we study the free energies and obtain the limiting laws of their fluctuations as the dimension becomes large.
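For orientation, the standard setup of the SSK model in common notation is sketched below; the exact perturbation terms studied in the thesis may differ from this generic form. The partition function and free energy are

\[
Z_N = \int_{S_{N-1}} e^{\beta H_N(\sigma)}\, d\omega_N(\sigma), \qquad F_N = \frac{1}{N}\log Z_N,
\]

where $S_{N-1} = \{\sigma \in \mathbb{R}^N : \|\sigma\|^2 = N\}$, $\omega_N$ is the uniform measure on that sphere, $\beta$ is the inverse temperature, and the Hamiltonian is a random quadratic form plus a lower-degree perturbation,

\[
H_N(\sigma) = \frac{1}{2}\,\sigma^{\mathsf T} M \sigma + (\text{perturbation of degree one or two}),
\]

with $M$ a random symmetric matrix (for instance, from the Gaussian Orthogonal Ensemble). The fluctuations of $F_N$ as $N \to \infty$ are then governed by the eigenvalue statistics of $M$, which is where random matrix theory enters.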

The second part is devoted to a machine learning application of random matrix theory. We develop free component analysis (FCA) for unmixing matrix-valued signals from their linear mixtures with little prior knowledge. The matrix signals are modeled as samples of random matrices, which are in turn regarded as non-commutative random variables. The counterpart of scalar probability for non-commutative random variables is free probability. Our principle of separation is to maximize free independence between the unmixed signals. This is achieved in a manner analogous to the independent component analysis (ICA) based method for unmixing independent random variables from their additive mixtures. We describe the theory and the various algorithms, and compare FCA to ICA. We show that FCA performs comparably to, and often better than, ICA in every application, such as image and speech unmixing, in which ICA is known to succeed.

Speaker(s): Hao Wu (UM)
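To make the ICA analogy concrete, here is a minimal sketch of the classical scalar setting that FCA generalizes: two independent non-Gaussian sources are additively mixed, and a FastICA-style fixed-point iteration (tanh nonlinearity with symmetric decorrelation) recovers them up to permutation, sign, and scale. The toy signals and mixing matrix are illustrative choices, not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two independent non-Gaussian sources (hypothetical toy signals).
t = np.linspace(0, 8, n)
s1 = np.sign(np.sin(3 * t))   # square wave
s2 = rng.laplace(size=n)      # heavy-tailed noise
S = np.vstack([s1, s2])

# Additive mixture with an (unknown to the algorithm) mixing matrix A.
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iteration with g = tanh, followed by
# symmetric orthogonalization W <- (W W^T)^{-1/2} W via SVD.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Xw)
    Gp = 1 - G ** 2
    W_new = G @ Xw.T / n - np.diag(Gp.mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)
    W = u @ vt

# Estimated sources, recovered up to permutation/sign/scale.
Y = W @ Xw
```

FCA replaces the scalar independence objective here with maximal free independence between matrix-valued components, but the overall pipeline (whitening, then optimizing an independence contrast over rotations) is analogous.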
Building: East Hall
Event Type: Workshop / Seminar
Tags: Mathematics
Source: Happening @ Michigan from Department of Mathematics, Special Events - Department of Mathematics