Large deviations, entropy and statistical physics

2009/2010, sem. 2

Prof. Boris Tsirelson (School of Mathematical Sciences).
Prerequisites: any course on probability theory.

You can start this course with no knowledge of physics, and finish it with a clear idea of thermal equilibrium, temperature, mechanical and thermal energy, entropy, heat engines, thermodynamic cycles, etc.

These notions of statistical physics and thermodynamics are much closer to mathematics than they may seem. They emerge naturally when we mix two ingredients, one mathematical and one physical.

The vital idea of entropy creates a lot of confusion. Putting aside economic, psychological, social and ecological approaches to entropy, we concentrate on informational ("mathematical") entropy and thermodynamical ("physical") entropy. The relation between the two is a hotly debated topic in the context of "thermodynamics of computation" and "Maxwell's demon". Such relations will be examineded in our course.
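As a first taste of the informational side, here is a minimal sketch (not part of the course materials; all function names are our own) showing how Shannon entropy and relative entropy are computed, and how relative entropy appears as the large-deviation rate for coin flips: the probability that n fair flips show at least a fraction a of heads decays like exp(-n I(a)), where I(a) is the relative entropy of (a, 1-a) with respect to (1/2, 1/2).

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i)."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def rate(a):
    """Cramer rate function for a fair coin: I(a) = D((a, 1-a) || (1/2, 1/2))."""
    return relative_entropy((a, 1 - a), (0.5, 0.5))

def tail_prob(n, a):
    """Exact probability that n fair flips give at least a fraction a of heads."""
    k0 = math.ceil(n * a)
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2 ** n

# The empirical decay rate -log(P)/n approaches the rate function I(a) as n grows.
for n in (50, 200, 800):
    empirical_rate = -math.log(tail_prob(n, 0.75)) / n
    print(n, round(empirical_rate, 4), round(rate(0.75), 4))
```

Running it shows the empirical rate -log(P)/n creeping down toward I(0.75) ≈ 0.1308 as n grows; the gap shrinks like (log n)/n, which is exactly the "subexponential correction" that large deviation theory sweeps aside.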

Do not worry: our examples will be much simpler than this engine.
However, our general theorems will also hold for this case.

(This fascinating illustration is taken from Wikipedia.)

Lecture notes

  1. Introduction.
  2. Large deviations, Gibbs measures.
  3. Entropy.
  4. Temperature.
  5. Paradoxes.
  6. Dissipation.