
Statistics Department Seminar Series: Jun Zhang, Professor, Departments of Psychology and Mathematics, University of Michigan

"Information Geometry and Maximal Entropy Inference"
Friday, November 2, 2018
11:30 AM-1:00 PM
411 West Hall
Information Geometry (IG) is the differential geometric study of the manifold of probability models, where each probability distribution is a point on the manifold. Instead of a metric for measuring distances on such manifolds, one often uses “divergence functions,” which measure the proximity of two points without imposing symmetry or the triangle inequality; examples include the Kullback-Leibler divergence, Bregman divergences, and f-divergences. Divergence functions are tied to the generalized entropy (for instance, Tsallis entropy, Rényi entropy, phi-entropy) and cross-entropy functions widely used in machine learning and the information sciences. After a brief introduction to IG, I illustrate the geometry of maximum entropy inference and the exponential family. I then introduce a general form of entropy/cross-entropy/divergence function, and show how the geometry of the underlying probability manifold (the deformed exponential family) reveals an “escort statistics” that remains hidden in the standard exponential family.
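As a concrete illustration of two notions the abstract mentions, the following minimal Python sketch (not from the talk; the distributions p and q below are arbitrary examples) computes the Kullback-Leibler divergence in both directions, showing its asymmetry, and forms the escort distribution p(x)^q / Σ p(x)^q that appears in Tsallis-style escort statistics.

    import numpy as np

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q) for discrete distributions."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.sum(p * np.log(p / q))

    def escort(p, q_param):
        """Escort distribution p^q / sum(p^q) from Tsallis statistics."""
        p = np.asarray(p, dtype=float)
        return p**q_param / np.sum(p**q_param)

    # Two example distributions on a 3-point sample space.
    p = np.array([0.7, 0.2, 0.1])
    q = np.array([0.4, 0.4, 0.2])

    # A divergence is not a distance: D(p||q) != D(q||p) in general,
    # and the triangle inequality need not hold.
    print(kl_divergence(p, q))  # ~0.184
    print(kl_divergence(q, p))  # ~0.192

    # The escort of p with deformation parameter q = 2 reweights
    # the probabilities toward higher-probability outcomes.
    print(escort(p, 2.0))  # [0.907, 0.074, 0.019] approximately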
Building: West Hall
Event Type: Workshop / Seminar
Tags: seminar
Source: Happening @ Michigan from Department of Statistics Seminar Series, Department of Statistics