## TAU:0365-4736 | Large deviations, entropy and statistical physics | 2009/2010, sem. 2

- Lecturer: Prof. Boris Tsirelson (School of Mathematical Sciences).
- Prerequisites: any course on probability theory.

You can start this course with no knowledge of physics, and finish it with a clear idea of thermal equilibrium, temperature, mechanical and thermal energy, entropy, heat engine, thermodynamic cycle etc.

These notions of statistical physics and thermodynamics are much closer to mathematics than they may seem. They emerge naturally when we mix two ingredients, one mathematical and one physical:

- the concentration-of-measure phenomenon in the context of the large deviations theory,
- the basic idea of energy.
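The first ingredient can be seen numerically. As a minimal sketch (not part of the course materials; the names `rate_function`, `a`, `trials` are illustrative), the sample mean of fair coin flips concentrates near 1/2, and the probability of a large deviation decays exponentially with rate given by Cramér's theorem, where the rate function is a relative entropy:

```python
import math
import random

def rate_function(a, p=0.5):
    """Cramer rate function for Bernoulli(p): the relative entropy D(a || p)."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

random.seed(0)
n = 200            # flips per experiment
trials = 20_000    # number of experiments
a = 0.6            # threshold above the mean 0.5

# Estimate P(S_n / n >= a) by simulation.
hits = sum(
    1 for _ in range(trials)
    if sum(random.random() < 0.5 for _ in range(n)) >= a * n
)
empirical = hits / trials

# Large deviations: P(S_n / n >= a) = exp(-n * I(a) + o(n)),
# so the match is on the exponential scale only (the prefactor is ~ 1/sqrt(n)).
predicted = math.exp(-n * rate_function(a))
print(f"empirical  P  = {empirical:.2e}")
print(f"exp(-n I(a))  = {predicted:.2e}")
```

Note that the agreement is in the exponent, not in the constant: this is exactly the "concentration on the exponential scale" that the course's general theorems make precise.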

The vital idea of entropy creates a lot of confusion. Putting aside economic, psychological, social and ecological approaches to entropy, we concentrate on the informational ("mathematical") and thermodynamic ("physical") entropy. The relation between these two is a topic hotly debated in the context of the "thermodynamics of computation" and "Maxwell's demon". Such relations will be examined in our course.
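One well-known bridge between the two entropies is Landauer's principle: erasing one bit of information dissipates at least k_B T ln 2 of heat. A minimal sketch (illustrative only; not taken from the course materials):

```python
import math

def shannon_entropy(probs):
    """Informational entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair bit carries H = 1 bit of informational entropy.
H = shannon_entropy([0.5, 0.5])

# Landauer's principle: erasing it costs at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
landauer = k_B * T * math.log(2) * H
print(f"H = {H} bit; minimal erasure cost at {T} K is about {landauer:.2e} J")
```

The tiny factor k_B is precisely why the informational and thermodynamic entropies look so different in everyday units, even though they are the same quantity up to this constant.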

Do not worry, our examples will be much simpler than this engine.

However, our general theorems will hold also for this case.

(This fascinating illustration is taken from Wikipedia.)

- Introduction.
- Large deviations, Gibbs measures.
- Entropy.
    - You may also compare it with the opinion of F. Lambert (a chemist).

- Temperature.
- Paradoxes.
- Dissipation.