6.291
Statistical Inference, Statistical Mechanics and the Relationship to Information Theory
Texts | Problem Sets | Handouts
CLASS WILL BE HELD IN 4-153
Tuesday & Thursday, 2:30 - 4:00
SCHEDULE FOR REMAINDER OF Fall 2004 (see below)
Recent work on Statistical Inference, including inference on graphs, the Minimum Description Length principle, and coding and decoding, has revealed striking connections to Statistical Mechanics and Information Theory. These lectures aim to give a systematic introduction to these developments.
Topics include: Relative Entropy, Entropy, and some basic theorems of Large Deviations; the Variational Description of Gibbs Measures; the Gibbs Variational Principle and the Shannon-McMillan-Breiman Theorem; Bayesian Inference viewed as minimization of Free Energy; Information Flow and Entropy Production in the Kalman-Bucy Filter and its Nonlinear Generalizations; Large Deviations and Shannon’s Noisy Channel Coding Theorem; the Minimum Description Length Principle for Inference; and Real-time Information Theory and its possible role in Control, Networks and Biology.
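To illustrate the first topic above, here is a minimal sketch (not course material) of computing relative entropy, the quantity D(p || q) that links statistical inference to large deviations; the function name and the convention of returning infinity when q lacks support are choices made for this example.

```python
import math

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) in nats.

    p, q: sequences of probabilities over the same finite alphabet.
    Returns math.inf if p puts mass where q does not.
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0/q) = 0 by convention
        if qi == 0.0:
            return math.inf  # absolute continuity fails
        d += pi * math.log(pi / qi)
    return d

# D(p || p) = 0, and D is asymmetric in general.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(relative_entropy(p, p))  # 0.0
print(relative_entropy(p, q))
```

By Sanov's theorem, D(p || q) governs the exponential rate at which an i.i.d. sample from q produces an empirical distribution near p, which is the sense in which relative entropy appears in the basic theorems of large deviations.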
Remainder of Fall 2004: classes to be held in ROOM 5-134.