Integrable Systems and Random Matrix Theory Seminar

Computing the Variance of the Index, Cumulants of Mutual Information Entropy and Allied Statistics in Random Matrix Theory
Monday, November 26, 2018
4:00-5:00 PM
1866 East Hall
Two fundamental examples, amongst numerous others, of statistics arising in random matrix theory are the variance of the index and the low-order cumulants of the mutual information entropy. The variance of the index concerns the fluctuations in the number of eigenvalues of a GUE matrix that exceed a given threshold, say those that are positive.
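For a feel for this statistic, here is a minimal Monte Carlo sketch (not from the talk): it samples GUE matrices under one common normalization, counts the positive eigenvalues, and estimates the variance of that count. The matrix size, sample count and normalization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gue(n, rng):
    """Sample an n x n GUE matrix (one common normalization):
    symmetrize a matrix of i.i.d. standard complex Gaussian entries."""
    m = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
    return (m + m.conj().T) / np.sqrt(2)

def index_statistic(h, threshold=0.0):
    """Number of eigenvalues exceeding the threshold (here: positive)."""
    return int(np.sum(np.linalg.eigvalsh(h) > threshold))

n, trials = 50, 2000
counts = np.array([index_statistic(sample_gue(n, rng)) for _ in range(trials)])
print("mean index:", counts.mean())             # close to n/2 by symmetry
print("variance of the index:", counts.var())   # grows only slowly with n
```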

No exact evaluation of this was known until the 2012 work of the author and Forrester. Entropy and mutual information are among the most important quantities in classical and quantum information theory and are fundamental to the design and analysis of communications, signal processing and quantum systems. Similarly, evaluations of the low-order moments of the fluctuations in the mutual information were not previously known. However, both of these statistics can be computed explicitly in terms of hypergeometric functions, for two main reasons: first, their generating functions are tau-functions of a particular Painlevé type, and second, a "zeroth order" tau-function is a trivial function. The derivation of these results will be explained.
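As a rough numerical illustration of the mutual information fluctuations (again not taken from the talk), the sketch below assumes an i.i.d. complex Gaussian (Rayleigh MIMO) channel and estimates the low-order cumulants of log det(I + snr·HH†) by Monte Carlo; the antenna counts and SNR are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(n_r, n_t, snr, rng):
    # Channel with i.i.d. standard complex Gaussian entries (assumed model).
    h = (rng.normal(size=(n_r, n_t)) + 1j * rng.normal(size=(n_r, n_t))) / np.sqrt(2)
    # Mutual information of the Gaussian channel: log det(I + snr * H H^dagger).
    _, logdet = np.linalg.slogdet(np.eye(n_r) + snr * h @ h.conj().T)
    return logdet

samples = np.array([mutual_information(8, 8, 1.0, rng) for _ in range(5000)])
centered = samples - samples.mean()
print("mean mutual information:", samples.mean())
print("second cumulant (variance):", samples.var())
print("third cumulant:", np.mean(centered ** 3))  # equals the third central moment
```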
Speaker(s): Nick Witte (Massey University, New Zealand)
Building: East Hall
Event Type: Workshop / Seminar
Tags: Mathematics
Source: Happening @ Michigan from Department of Mathematics, Integrable Systems and Random Matrix Theory Seminar