
PHYSICS GRADUATE STUDENT SYMPOSIUM
Information Divergence Estimation in Signal Processing and Machine Learning

Wednesday, July 1, 2015
12:00 P.M.
340 West Hall

Information divergence is a measure of the difference between probability distributions and plays an important role in machine learning, signal processing, statistics, and information theory. Common information measures such as entropy and mutual information arise as special cases of divergence. This talk presents several applications of divergence functional estimation, primarily in the context of sunspot image classification: dimensionality reduction, extending existing machine learning tasks to use probability distributions as features, and estimating the optimal probability of error (the Bayes error) of a classification problem. We then present a simple, computationally tractable non-parametric estimator for a wide variety of divergence functionals that achieves parametric convergence rates under certain smoothness conditions, and we demonstrate it by estimating bounds on the Bayes error for a classical machine learning dataset.
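As a concrete illustration of non-parametric divergence estimation (a generic sketch, not the specific ensemble estimator presented in the talk), the code below implements a classical k-nearest-neighbor plug-in estimator of the Kullback-Leibler divergence in the style of Wang, Kulkarni, and Verdu; the function name, the choice of k, and the Gaussian test case are illustrative assumptions.

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_divergence(x, y, k=5):
        """k-NN plug-in estimate of the KL divergence D(P || Q).

        A generic non-parametric sketch; not the estimator from the talk.
        x: (n, d) array of samples from P; y: (m, d) array of samples from Q.
        """
        x, y = np.atleast_2d(x), np.atleast_2d(y)
        n, d = x.shape
        m = y.shape[0]
        # Distance from each x_i to its k-th neighbor within x,
        # querying k+1 neighbors so the point itself is excluded.
        rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # Distance from each x_i to its k-th neighbor within y.
        nu = cKDTree(y).query(x, k=k)[0]
        if nu.ndim > 1:
            nu = nu[:, -1]
        # (nu / rho)^d estimates the density ratio p(x_i) / q(x_i),
        # up to the sample-size correction log(m / (n - 1)).
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

    # Example: two unit-variance Gaussians with means 0 and 1,
    # for which the true KL divergence is 0.5.
    rng = np.random.default_rng(0)
    p_samples = rng.normal(0.0, 1.0, size=(2000, 1))
    q_samples = rng.normal(1.0, 1.0, size=(2000, 1))
    print(knn_kl_divergence(p_samples, q_samples))

For moderate sample sizes the printed estimate should land near the true value of 0.5; such divergence estimates can then be plugged into standard inequalities to bound the Bayes error of the associated classification problem.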

Lunch will be served in the Don Meyer Common Room at 11:50 A.M.

Speaker:
Kevin Moon