# Colloquium: Gaussian kernelized graph Laplacian: Bi-stochastic normalization and eigen-convergence

Xiuyuan Cheng (Duke)

Abstract: Eigen-data of graph Laplacian matrices are widely used in data analysis and machine learning, for example in dimension reduction by spectral embedding. Many graph Laplacian methods start by building a kernelized affinity matrix from high-dimensional data points, which may lie on some unknown low-dimensional manifold embedded in the ambient space. When clean manifold data are corrupted by high-dimensional noise, that noise can degrade the performance of graph Laplacian methods. In this talk, we first introduce the use of bi-stochastic normalization to improve the robustness of the graph Laplacian to high-dimensional outlier noise, possibly heteroskedastic, with a proven convergence guarantee under the manifold data setting. Next, for the important question of eigen-convergence (namely the convergence of eigenvalues and eigenvectors to the spectra of the Laplace-Beltrami operator), we show that choosing a smooth kernel function leads to improved theoretical convergence rates compared to prior results. The proof proceeds by analyzing the convergence of the Dirichlet form and constructing candidate approximate eigenfunctions via convolution with the manifold heat kernel. When the data density is non-uniform on the manifold, we prove the same rates for the density-corrected graph Laplacian. The theory is supported by numerical results. Joint work with Boris Landa and Nan Wu.
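As a rough illustration of the pipeline the abstract describes, the sketch below builds a Gaussian kernel affinity matrix and applies symmetric Sinkhorn scaling to make it bi-stochastic. This is a generic textbook construction, not the speaker's specific method; the function names, the damped iteration, and the bandwidth choice are illustrative assumptions.

```python
import numpy as np

def gaussian_affinity(X, eps):
    """Gaussian kernel affinity W_ij = exp(-||x_i - x_j||^2 / eps)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / eps)

def sinkhorn_bistochastic(W, n_iter=1000, tol=1e-12):
    """Symmetric Sinkhorn scaling: find positive d so that
    P = diag(d) @ W @ diag(d) has unit row and column sums.
    Uses a damped fixed-point update d <- sqrt(d / (W d)) for stability."""
    d = np.ones(W.shape[0])
    for _ in range(n_iter):
        d_new = np.sqrt(d / (W @ d))
        if np.max(np.abs(d_new - d)) < tol:
            d = d_new
            break
        d = d_new
    return d[:, None] * W * d[None, :]

# Example: a bi-stochastic affinity from noisy point-cloud data,
# from which a graph Laplacian L = I - P could then be formed.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))       # hypothetical data
W = gaussian_affinity(X, eps=1.0)      # eps is an illustrative bandwidth
P = sinkhorn_bistochastic(W)           # rows and columns now sum to 1
```

The resulting `P` is symmetric with row sums equal to one, so `L = I - P` is a normalized graph Laplacian; the talk concerns how such normalization improves robustness to high-dimensional, possibly heteroskedastic noise.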

Talk will be in-person and on Zoom: https://umich.zoom.us/j/98734707290

Building: East Hall

Event Type: Lecture / Discussion

Tags: Mathematics

Source: Happening @ Michigan from Colloquium Series - Department of Mathematics, Department of Mathematics, MCAIM Colloquium - Department of Mathematics