Entropy, the Second Law and Information Theory
Prof. Arieh Ben Naim (ariehbennaim@gmail.com)
Department of Physical Chemistry, The Hebrew University of Jerusalem, Jerusalem, Israel.
We start with a clear distinction between Shannon's Measure of Information (SMI) and the thermodynamic entropy. The SMI is defined for any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next, we show that the SMI provides a solid, quantitative basis for the interpretation of the thermodynamic entropy. For an ideal gas, the entropy measures the average uncertainty in the locations and momenta of the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles.
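In standard notation (a sketch for the reader, not quoted from the lecture itself), the SMI of a discrete probability distribution p = (p_1, ..., p_n) is

    H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i .

Applying this measure to the equilibrium distribution of the locations and momenta of an ideal gas of N simple particles, and adding the two corrections mentioned above, yields the Sackur-Tetrode expression for the entropy,

    S = N k_B \left[ \ln\!\left( \frac{V}{N}\left(\frac{4\pi m E}{3 N h^2}\right)^{3/2} \right) + \frac{5}{2} \right],

where V is the volume, E the total energy, m the particle mass, h Planck's constant and k_B Boltzmann's constant.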
The derivation of the thermodynamic entropy from the SMI can be used as a new definition of entropy, one that is superior to both the Clausius and the Boltzmann definitions. The SMI-based definition can also be extended to any system of interacting particles at equilibrium.
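For reference, the two classical definitions mentioned above are, in their textbook forms, the Clausius definition

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}

and the Boltzmann definition

    S = k_B \ln W,

where \delta Q_{\mathrm{rev}} is the heat exchanged in a reversible process at temperature T and W is the number of accessible microstates.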