Semantic problems concerning information
Entropy is not a natural concept for people; negentropy is easier to grasp instinctively.
There is a difference between potential information storage capacity and the information actually stored.
e.g. in thermodynamic terms, a maximally disordered system can store more information than a perfectly ordered one, which, having only a single possible microstate, contains log₂ 1 = 0 bits.
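A minimal sketch of the capacity side, assuming Boltzmann-style state counting (the system sizes below are invented purely for illustration): potential storage is log₂ of the number of distinguishable microstates, so a perfectly ordered system with one microstate stores nothing.

    import math

    def storage_capacity_bits(microstates: int) -> float:
        """Potential information storage: log2 of the number of distinguishable states."""
        return math.log2(microstates)

    # Hypothetical systems, chosen only to illustrate the contrast:
    print(storage_capacity_bits(1))      # perfectly ordered, one microstate: 0.0 bits
    print(storage_capacity_bits(2**20))  # maximally disordered toy system: 20.0 bits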
The problem of referring to 'disorder' when looking at macroscopic entropy in gravitational systems, where matter clumped together, rather than randomly scattered, represents high entropy. How entropy looks to us depends on the forces of interaction: attractive and long-range for gravity, short-range and repulsive for thermodynamic, microscopic systems.
The link between noise in information theory and temperature in thermodynamics. Units of thermodynamic entropy are joules per kelvin (J/K), i.e. how much something has to stand out from the background noise within the channel. How this relates to efficiency is uncertain: does the background really affect the number of bits required to encode each bit in a message?
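One way to make that question concrete, assuming Shannon's noisy-channel picture rather than the thermodynamic one (the formula is the standard Shannon-Hartley capacity; the bandwidth and power figures below are invented): capacity falls as background noise rises, so a noisier channel does demand more raw channel symbols per reliably delivered message bit.

    import math

    def channel_capacity(bandwidth_hz: float, signal_w: float, noise_w: float) -> float:
        """Shannon-Hartley: maximum reliable bit rate of a noisy analogue channel."""
        return bandwidth_hz * math.log2(1 + signal_w / noise_w)

    # Fixed 1 kHz channel and 1 W signal; only the background noise rises.
    for noise_w in (0.01, 0.1, 1.0, 10.0):
        print(f"noise={noise_w:>5} W -> capacity={channel_capacity(1000, 1.0, noise_w):8.1f} bit/s")

On this reading the answer is yes: the louder the background, the more redundancy each message bit needs to stand out from it.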
Perhaps we should talk about bit pairs instead of bits, since bits are merely a measure of relative difference. A binary 1 exists in a phase space of size two, i.e. 1 or 0. Since a 1 implies the existence of a possible 0, zero is its only possible bit-pair relationship.
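A sketch of that point using Shannon's standard binary entropy function (nothing here is specific to this note): a symbol that can only ever be 1 carries no information, and the full bit is realised only when the 0 of the pair is genuinely possible.

    import math

    def binary_entropy(p_one: float) -> float:
        """Shannon entropy in bits of a two-outcome source with P(1) = p_one."""
        if p_one in (0.0, 1.0):
            return 0.0  # no possible alternative outcome -> no information
        return -p_one * math.log2(p_one) - (1 - p_one) * math.log2(1 - p_one)

    print(binary_entropy(1.0))  # 0.0 bits: a 1 with no possible 0 says nothing
    print(binary_entropy(0.5))  # 1.0 bit: the 1/0 pair at its most informative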