## Many Faces of Entropy or Bayesian Statistical Mechanics (Notes)

http://arxiv.org/ftp/arxiv/papers/1007/1007.1773.pdf

1. A physical system can be described by many thermodynamic systems, each of which has its own entropy value.
2. The problem with theoretical attempts to explore a physical system's 'true' entropy is that increased complexity creates interplay between energy and information via the difference between micro- and macro-states (i.e. the level of coarse-graining).
3. Is there an ultimate true entropy value for a physical system and could it actually be explored experimentally rather than theoretically?
4. Because of the subjective nature of entropy (e.g. W in Boltzmann's S = k log W is 'postulated' to be the number of micro-states within a given macro-state), Jaynes suggested a Bayesian statistical method, resulting in his MAXENT principle.
5. More recently, people have looked at expressing W through experimentally measurable quantities like temperature and heat capacity.
6. Combining 4 & 5 potentially allows the statistical mechanics approach to be explored experimentally. In fact this was suggested 80-90 years ago in forgotten work by people like G. A. Linhart.
7. Specifically, Linhart derived a formula for heat capacity (away from any phase transition) as a function of temperature which matched his experimental values.
8. Linhart managed to 'express the probabilities of macroscopic states of the matter via heat capacity' (it is now logically proven that, starting from Boltzmann statistics, heat capacity can be viewed as the sum of 'jumps' from one energy level to another that result in a particular macro-state). NB: by changing the environment, e.g. by heating, it is possible to change BOTH the population of the energy levels and the energy differences among them.
9. The Linhart view considers entropy as the number of occupied energy levels in a substance, in the sense of a "storage system for energy". "Change in entropy shows WHAT is changed in the physical-chemical system (the over-all 'structure' of energy levels and their occupation), whereas heat capacity is HOW = the sum of all the possible 'jumps' among the energy levels, which are necessary to achieve any particular macro-state."
10. (i) BOTH entropy and heat capacity tend to zero as absolute temperature goes to zero (an extension of the 3rd Law). (ii) Entropy and heat capacity diverge from each other as T approaches infinity (since heat capacity remains finite while entropy does not).
11. The relationship between temperature and heat capacity (Equation 6).
12. Statistical interpretation of 11: the ratio T/Tref depends on the "odds in favor of the reachability of the intrinsically most probable macro-state, with the Tref being just the temperature value where chances for a system to reach its intrinsically most probable macro-state are 50:50".
13. 11 allows continuous degrees of equilibrium (rather than a binary in/out of equilibrium), which allows bridging Boltzmann's and Gibbs' views on equilibrium statistical mechanics without ergodic theory.
14. We might refer to the Tref value as a kind of "critical temperature" for a zeroth-order phase transition.
15. Equation 8 expresses K log k (where k is the reciprocal of Tref, and K is the constant of proportionality between the first derivative of heat capacity with respect to entropy and the probability that the actual macro-state has not been achieved) from Equation 6 as a polynomial with up to four solutions for Tref. "It is tempting to connect the latter with the characteristic temperatures of the heat capacity contributions from the four fundamental atomistic degrees of freedom of the matter, namely: translational, rotational, vibrational and electronic ones."
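Points 8-10 can be made concrete with a toy model that is not from the paper: a two-level Boltzmann system (a hypothetical gap `delta`, with k_B = 1). Entropy tracks how the levels are populated, while heat capacity tracks the 'jumps' between them; both vanish as T → 0. (Note that for a *bounded* spectrum like this one, entropy saturates at ln 2 instead of diverging as in 10(ii); the divergence needs an unbounded ladder of levels.)

```python
import math

def two_level_stats(T, delta=1.0):
    """Boltzmann statistics for a two-level system with energy gap `delta` (k_B = 1).

    Returns (entropy, heat_capacity). Entropy reflects how the two levels
    are populated; heat capacity counts 'jumps' between them (the Schottky form).
    """
    x = delta / T
    p_upper = math.exp(-x) / (1.0 + math.exp(-x))   # population of the upper level
    mean_E = delta * p_upper                        # average energy
    Z = 1.0 + math.exp(-x)                          # partition function
    S = math.log(Z) + mean_E / T                    # S = ln Z + <E>/T
    C = x**2 * math.exp(x) / (math.exp(x) + 1.0)**2 # C = d<E>/dT
    return S, C

for T in (0.01, 0.5, 1.0, 10.0, 1000.0):
    S, C = two_level_stats(T)
    print(f"T={T:8.2f}  S={S:.4f}  C={C:.4f}")
```

Running this shows both S and C going to zero at low temperature (point 10(i)), the heat capacity peaking where level-jumps are most active, and then C falling back toward zero while S levels off at ln 2 at high temperature.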

I'm particularly interested in point 2:

"The problem with theoretical attempts to explore a physical system's 'true' entropy is that increased complexity creates interplay between energy and information via the difference between micro- and macro-states (i.e. the level of coarse-graining)."

A thought experiment to describe this might be a book such as a religious text, which has different interpretations for different people. The 'true' entropy would be the sum of these interpretations, but that would require pre-knowledge of the interpreters, so it is very difficult to measure. For a naive reader or a child, compared to a cleric, the difference between these interpretations is easy to see as a matter of the level of coarse graining. In fact, the very ambiguity of religious texts may be what makes them memetically successful: they appeal to many people by virtue of being so ambiguous that they appear to contain information at any level of coarse graining.

The level of coarse graining perceived between any two systems depends on the past history of information exchange or learning, and may mean that the entropy of a system is always observer-dependent and relative. In the same way, a system containing information about quantum mechanics will appear to a child as pure noise unless their previous interactions with the world have allowed them to achieve the level of fine graining required to understand at least how to count.
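The observer-dependence of entropy via coarse graining can be illustrated numerically (this is my sketch, not anything from the paper): take one fixed dataset and measure its empirical Shannon entropy at several resolutions. A coarser observer, who lumps the data into fewer macro-cells, assigns the same system a lower entropy.

```python
import math
import random
from collections import Counter

def shannon_entropy(symbols):
    """Empirical Shannon entropy (in bits) of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def coarse_grain(values, bins):
    """Map each value in [0, 1) to one of `bins` equal-width macro-cells."""
    return [min(int(v * bins), bins - 1) for v in values]

# One fixed 'physical' dataset, viewed at different levels of coarse graining.
random.seed(0)
data = [random.random() for _ in range(10_000)]

for bins in (2, 4, 16, 256):
    H = shannon_entropy(coarse_grain(data, bins))
    print(f"{bins:4d} macro-cells -> entropy ≈ {H:.3f} bits (max {math.log2(bins):.0f})")
```

The underlying data never changes; only the observer's partition does, yet the measured entropy climbs from about 1 bit toward 8 bits as the graining gets finer, which is the sense in which the entropy value belongs to the description rather than to the system alone.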