Entropy measures

Posted by | August 16, 2006 | science

Thermodynamic entropy vs information entropy – Advanced Physics Forums

“We can choose to look at thermodynamic entropy in two different ways. One approach would say that a high entropy state is information poor because there is so much disorder, and the disorder is essentially random. The other approach would say that a high entropy state is information rich because to truly describe the exact state of randomness in all its gory detail would require lots of information.”

This captures the common confusion over the apparent difference in 'sign' between Shannon entropy and Boltzmann entropy.
At bottom it is a confusion between a state that carries meaning for a particular observer, and the notion of absolute meaning, where bits of information are stored in the smallest possible moving (hence thermodynamic) particles. What if the latter case were subjective too?
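To make the two readings in the quote concrete, here is a small sketch of my own (not from the forum thread): Shannon entropy H = -Σ p·log₂(p) is maximal for a uniform ("maximally disordered") distribution, which is exactly the state that takes the most bits to pin down precisely – information-poor on one reading, information-rich on the other.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximal disorder,
# and also the most bits (2.0) needed to specify one outcome.
uniform = [0.25, 0.25, 0.25, 0.25]

# A sharply peaked distribution: highly ordered, few bits needed.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(peaked))   # ~0.24 bits
```

The same quantity is "lots of disorder" or "lots of information to describe", depending on whether you ask about the distribution or about an exact outcome drawn from it.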

There is possibly no such thing as absolute entropy (or energy or information, for that matter), merely the capacity to interact with a decoder or remote system. In entropy terms this can be illustrated by comparing microstates to macrostates (think numbered balls vs a bunch of indistinguishable marbles).
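The numbered-balls-vs-marbles picture can be sketched with coin flips (my own toy example, assuming ten distinguishable coins): the macrostate is just the head count, the microstates are the individual sequences, and the Boltzmann-style entropy S = ln W (in units of k_B) depends on which description the observer uses.

```python
from math import comb, log

N = 10  # ten distinguishable coins (the "numbered balls" view)

# Macrostate: the head count k alone (the "similar marbles" view).
# Microstates compatible with it: W(k) = C(N, k).
multiplicity = {k: comb(N, k) for k in range(N + 1)}

# Boltzmann-style entropy in units of k_B: S = ln W.
entropy = {k: log(W) for k, W in multiplicity.items()}

# The mixed macrostate k = 5 has the most microstates, hence the
# highest entropy; all-heads (k = 10) has one microstate, so S = 0.
print(multiplicity[5])  # 252
print(entropy[10])      # 0.0
```

The entropy here is a property of the coarse-grained description, not of any single sequence of coins, which is one way to cash out the claim that entropy is relative to a decoder.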

(Sorry for the slightly random rant – I’m using this blog for public notes about entropy, in case anyone else is interested in this stuff)