AIM seminar: Accelerating Convergence of Stochastic Gradient MCMC: algorithm and theory

Qi Feng, University of Michigan
Friday, March 10, 2023
3:00-4:00 PM
1084 East Hall
Stochastic Gradient Langevin Dynamics (SGLD) offers advantages in multi-modal sampling and non-convex optimization, with broad applications in machine learning, e.g., uncertainty quantification for AI safety problems and the training of neural networks. The core issue in this field is the acceleration of the SGLD algorithm and the convergence rate of the continuous-time (mean-field) Langevin diffusion process to its invariant distribution. In this talk, I will present the general idea of entropy dissipation and show convergence rate analysis for linear and nonlinear Fokker-Planck equations. I will also show some applications using various Langevin dynamics-based algorithms.
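For context, a minimal sketch of a single SGLD update in Python (not taken from the talk; the step size eps and the stochastic gradient estimator grad_U_hat are illustrative assumptions): the parameter takes half a gradient step on the potential U and receives Gaussian noise of variance eps.

import numpy as np

rng = np.random.default_rng(0)

def sgld_step(theta, grad_U_hat, eps):
    # One SGLD update: theta <- theta - (eps/2) * grad U_hat(theta) + N(0, eps I)
    noise = rng.normal(scale=np.sqrt(eps), size=theta.shape)
    return theta - 0.5 * eps * grad_U_hat(theta) + noise

# Illustration: sampling a 1-D standard Gaussian, where U(theta) = theta^2 / 2 and grad U = theta.
theta = np.array([5.0])
for _ in range(10_000):
    theta = sgld_step(theta, lambda t: t, eps=1e-2)

Iterating this update produces (approximate) samples from the target distribution proportional to exp(-U); the convergence rate of this process to its invariant distribution is the question addressed in the talk.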
Building: East Hall
Event Type: Workshop / Seminar
Tags: Mathematics
Source: Happening @ Michigan from Department of Mathematics, Applied Interdisciplinary Mathematics (AIM) Seminar - Department of Mathematics