The Entropy Sign
“random is a matter of perspective in this case. It means that the perfect transmission code looks random to an outsider”.
“If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction, since entropy of a system increases as the system becomes more disordered. So information corresponds to disorder according to this confusion.”
“If you always take information to be a decrease in uncertainty at the receiver, you will get straightened out”.
e.g. R = H_before − H_after, where R is the information transferred.
For example, in a four-character alphabet {c, g, a, t}, when a single character is transmitted over a noiseless channel:
uncertainty before is 2 bits (log₂ 4), after is 0 bits, so R = 2 bits.
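The arithmetic above can be checked with a short sketch (the helper name is hypothetical; the formula is the standard Shannon entropy H = −Σ p log₂ p):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Before transmission: any of {c, g, a, t} is equally likely.
h_before = shannon_entropy([0.25] * 4)   # 2.0 bits

# After a noiseless transmission the receiver knows the character.
h_after = shannon_entropy([1.0])         # 0.0 bits

# Information transferred: R = H_before - H_after.
r = h_before - h_after
print(h_before, h_after, r)  # 2.0 0.0 2.0
```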
Basically, information is the amount of uncertainty eliminated. In the Shannon entropy scenario, information is something that has been transferred, rather like energy vs potential energy. A large random string can carry a lot of information: it has high ‘potential entropy’, but without transfer and elimination of uncertainty it carries no information from the receiver’s frame of reference. Suppose that only a certain portion of the bits a transmitter sends can be interpreted by the receiver; the remainder will be perceived as noise.
The odd thing about entropy is not so much the apparent (but false) paradox of information and randomness as the units of entropy. Energy per temperature merely gives the signal above the noise, whereas a more interesting possible unit would be one that included the variance in energy per bit (expand with Dyson sphere entropy measure example).
For example, what if all information exchange (and the accompanying system-wide net entropy increase) resulted in a lower energy per bit, e.g. lower-frequency photons?
NB Shannon entropy is unitless (bits are a pure count).
In thermodynamics, entropy is measured in J/K, but since temperature is itself an energy measure (via Boltzmann’s constant), both numerator and denominator are energies, so thermodynamic entropy can also be regarded as effectively dimensionless.
In other words, in thermodynamics you could say entropy is a relative (lack of) energy.
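The link between the two unit systems can be made concrete: one bit of Shannon entropy corresponds to k_B ln 2 joules per kelvin of thermodynamic entropy (the conversion underlying Landauer’s bound). A minimal sketch (the function name is hypothetical):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def bits_to_joules_per_kelvin(n_bits):
    """Thermodynamic-entropy equivalent of n_bits of Shannon entropy:
    S = n * k_B * ln(2)."""
    return n_bits * K_B * math.log(2)

# The 2-bit DNA-letter example expressed in thermodynamic units:
print(bits_to_joules_per_kelvin(2))  # ~1.91e-23 J/K
```

The tiny number makes the point: a bit is a perfectly good entropy unit on its own, and the J/K comes in only because temperature hides an energy scale.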