Entropy first linked to information by Szilard, 1929.
Shannon gave the mathematical formulation of information entropy in 1948.
Followed by Brillouin.
1960s, algorithmic entropy: Kolmogorov, Chaitin and Solomonoff.
Landauer - erasing recorded information increases entropy by at least the amount of information destroyed.
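A standard quantitative form of Landauer's bound (my restatement, not from the source): erasing one bit must raise the entropy of the environment by at least k_B ln 2, equivalently dissipating heat of at least k_B T ln 2.

```latex
% Landauer's bound: minimum thermodynamic cost of erasing one bit.
\Delta S_{\mathrm{env}} \;\ge\; k_B \ln 2
\quad\Longrightarrow\quad
Q_{\mathrm{dissipated}} \;\ge\; k_B T \ln 2
```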
Bennett - a machine can reverse the entropy flow provided there is blank memory available in which to record information.
Zurek: desirability of Algorithmic Information Content (AIC) as the true measure of entropy.
Instead of entropy = ignorance:
Before message transfer, entropy = current ignorance plus the AIC of the message.
After receipt, ignorance is reduced while the information stored is increased. [The entropy of the universe (minus the recording system) is reduced by at most the AIC of the message; in other words, Maxwell's Demon has absorbed an amount of information equal to the message's AIC.]
When erasure takes place, the information stored is decreased, but the ignorance of the entire system is increased by at least as much.
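Zurek's ledger amounts to: physical entropy = statistical ignorance + AIC of the record held. True AIC is uncomputable, but compressed length gives a rough upper bound. The sketch below (an illustration of the idea, not Zurek's construction; the messages and parameters are arbitrary) uses zlib to show that a regular message carries far less algorithmic information than a random one of the same length:

```python
# Compressed length as a crude, computable proxy for Algorithmic
# Information Content (true AIC is uncomputable). A highly regular
# message compresses to almost nothing; a random message barely
# compresses, so its AIC is close to its raw length.
import random
import zlib

random.seed(0)
regular = b"01" * 5000                                      # ordered message
noisy = bytes(random.getrandbits(8) for _ in range(10000))  # random message

for name, msg in (("regular", regular), ("random", noisy)):
    k = len(zlib.compress(msg, 9))
    print(f"{name:8s} raw = {len(msg)} bytes, compressed = {k} bytes")
```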
The relative nature of entropy is traditionally described by reference to macrostates, i.e. things which have meaning relative to an observer (as opposed to the microstates, or the size of the phase space).
The entropy measured depends on the level of detail at which the information is being described, or the ‘coarse graining’ (Gell-Mann). In the limit, the entropy of a perfectly described system does not change over time. If it is described in terms of a macrostate (e.g. water and oil diffusing) or a few variables (oil vs water vs watery oil), then these tend to disperse over time - this is the true significance of the 2nd law.
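A toy illustration of the coarse-graining point (a sketch under assumed parameters, not from the source): track diffusing particles exactly and nothing is lost, but describe them only by which coarse cell they occupy and the Shannon entropy of that description rises toward its maximum, as the 2nd law leads us to expect.

```python
# Coarse-grained entropy of diffusing particles.
# Assumptions: 1-D random walkers on sites 0..L-1, all starting at the
# centre; 'coarse graining' = grouping sites into equal-width cells and
# measuring the Shannon entropy of the cell-occupancy distribution.
import math
import random
from collections import Counter

L = 100      # fine-grained sites (microstate resolution)
CELLS = 10   # coarse-grained cells (macrostate resolution)
N = 5000     # number of particles
random.seed(0)

positions = [L // 2] * N  # perfectly known initial condition

def coarse_entropy(positions):
    """Shannon entropy (bits) of the coarse-grained occupancy."""
    counts = Counter(p * CELLS // L for p in positions)
    return -sum((c / N) * math.log2(c / N) for c in counts.values())

for t in range(0, 2001, 500):
    print(f"t={t:5d}  coarse-grained entropy = {coarse_entropy(positions):.3f} bits")
    for _ in range(500):  # advance every walker 500 steps, reflecting walls
        positions = [min(L - 1, max(0, p + random.choice((-1, 1)))) for p in positions]
```

Maximum possible entropy here is log2(10), about 3.32 bits; the run starts at 0 and climbs toward that value.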