The University of Arizona

Variational Principles in Information Theory, Uncertainty Quantification, and Machine Learning

Special Colloquium

Location: MATH 401
Presenter: Jeremiah Birrell, University of Massachusetts, Amherst


Variational representations of information-theoretic quantities (e.g., relative entropy and Rényi divergences) provide a powerful toolset for addressing many problems in uncertainty quantification, statistics, and machine learning.
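A standard example of such a representation (included here for illustration; it is not quoted from the announcement) is the Donsker–Varadhan variational formula for the relative entropy:

```latex
R(P \| Q) = \sup_{g} \left\{ E_P[g] - \log E_Q\!\left[ e^{g} \right] \right\},
```

where the supremum runs over bounded measurable functions g. Restricting g to a parametric family (for instance, a neural network) turns the right-hand side into a trainable objective, which is one mechanism behind divergence-based training of generative models.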

I will begin by discussing an information-theoretic approach to uncertainty quantification for probabilistic systems: A probabilistic model of a "real" system has two levels of uncertainty,

1) intrinsic uncertainty, due to the probabilistic nature of the model,

2) model-form uncertainty, due to imperfect knowledge of the system's properties/dynamics or stemming from approximation procedures (e.g., Markovian approximation or dimensional reduction).

Model-form uncertainty leads to uncertainty in computed quantities (e.g., expected values), and quantifying this uncertainty is an important step in making robust predictions. I will discuss recent progress in the development of information-theoretic variational principles that bound quantities of interest over infinite-dimensional model neighborhoods. Different classes of quantities of interest require different approaches; I will focus on results for rare events and for stochastic processes in the long-time regime. These results will be illustrated by applications to diffusion processes, option pricing, and large-deviations rate functions. I will also discuss the connections between these ideas and other important problems in statistics and machine learning, such as variational inference and the training of generative adversarial networks.
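As a toy illustration of the kind of bound described above (a hypothetical sketch, not code from the talk), one classical goal-oriented bound states that over the KL-neighborhood {P : R(P‖Q) ≤ η} of a baseline model Q, the worst-case bias of a quantity of interest f satisfies sup_P E_P[f] − E_Q[f] ≤ inf_{c>0} (1/c)(log E_Q[e^{c(f−E_Q[f])}] + η). The right-hand side involves only the baseline model, so it can be estimated from samples of Q:

```python
import numpy as np

rng = np.random.default_rng(0)

# Baseline model Q: standard normal. Quantity of interest: f(x) = x.
# (Both choices are illustrative assumptions, not from the talk.)
f_samples = rng.standard_normal(200_000)

eta = 0.5  # KL "radius" of the model neighborhood: R(P || Q) <= eta

def uq_bound(f_samples, eta, cs):
    """Goal-oriented UQ bound on the worst-case bias E_P[f] - E_Q[f]:
    inf over c > 0 of (1/c) * (log E_Q[exp(c * (f - E_Q[f]))] + eta),
    with the expectations replaced by Monte Carlo averages over Q."""
    fc = f_samples - f_samples.mean()  # center f under the baseline
    vals = [(np.log(np.mean(np.exp(c * fc))) + eta) / c for c in cs]
    return min(vals)

cs = np.linspace(0.1, 2.0, 100)  # grid over which to minimize in c
bound = uq_bound(f_samples, eta, cs)
# For Gaussian Q and linear f the infimum is sqrt(2 * eta) exactly,
# so the estimate should be close to 1.0 for eta = 0.5.
print(bound)
```

For this Gaussian baseline the cumulant generating function is c²/2, so the bound reduces to inf_c (c/2 + η/c) = √(2η), which the Monte Carlo estimate recovers; in general the same sampling recipe applies whenever f can be simulated under the baseline model.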

(Refreshments will be served in the Math Commons Room at 3:00 PM)