
Applied Interdisciplinary Mathematics (AIM) Seminar

A Geometric Analysis of Neural Collapse with Normalized Features
Friday, September 23, 2022
3:00-4:00 PM
1084 East Hall
When training overparameterized deep networks for classification, the learned features have been widely observed to exhibit a "neural collapse" phenomenon: the output features of the penultimate layer converge, within each class, to their class mean, and the class means form a certain tight frame structure that is also aligned with the last layer's classifier. As feature normalization in the last layer has become common practice in modern representation learning, this work theoretically justifies the neural collapse phenomenon for normalized features. Based on an unconstrained feature model, we simplify the empirical loss function of a multi-class classification task and, by constraining all features and classifiers to the sphere, obtain a nonconvex optimization problem over a Riemannian manifold. In this setting, we analyze the nonconvex landscape of the Riemannian optimization problem over the product of spheres and show that it is benign: the only global minimizers are the neural collapse solutions, while all other critical points are strict saddles with negative curvature.

Speaker(s): Peng Wang (University of Michigan)
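The tight frame structure mentioned in the abstract is typically a simplex equiangular tight frame (ETF): K unit-norm class means whose pairwise inner products all equal -1/(K-1). As a minimal numerical sketch (not part of the talk, and with the class count K = 4 chosen purely for illustration), one can construct such a frame in NumPy and verify its Gram matrix:

```python
import numpy as np

K = 4  # number of classes (illustrative choice)

# Simplex ETF: rows of M are K unit vectors in R^K whose
# pairwise cosine similarities all equal -1/(K-1).
M = np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

G = M @ M.T  # Gram matrix of the class means

# Diagonal entries are 1 (unit norm); off-diagonal entries are -1/(K-1).
print(np.allclose(np.diag(G), 1.0))                      # True
print(np.allclose(G[~np.eye(K, dtype=bool)], -1 / (K - 1)))  # True
```

Under neural collapse with normalized features, the within-class features concentrate at these means, so the feature geometry at convergence is fully described by a Gram matrix of this form.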
Building: East Hall
Event Type: Workshop / Seminar
Tags: Mathematics
Source: Happening @ Michigan, Applied Interdisciplinary Mathematics (AIM) Seminar, Department of Mathematics