The 2nd Law is often wrongly stated:

“second law of thermodynamics, which says that entropy always increases or stays the same, but never decreases.”
http://www.physorg.com/news170586562.html

Entropy does sometimes decrease (local fluctuations happen), but statistically it increases over time, i.e. the entropy of an isolated system tends to increase or remain the same.

S = -k \sum_i p_i \log p_i
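A minimal numerical sketch of the formula (plain Python; the example distribution and the use of SI units are just illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs, k=K_B):
    """S = -k * sum_i p_i ln p_i, skipping zero-probability terms."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Example: four microstates, one more likely than the others
p = [0.4, 0.2, 0.2, 0.2]
print(gibbs_entropy(p))                      # thermodynamic units, J/K
print(gibbs_entropy(p, k=1) / math.log(2))   # same entropy expressed in bits
```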

Thermodynamic vs information entropy:
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

“the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.”
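A quick check of the ‘yes/no questions’ reading (my own toy numbers, not from the quoted source): a macrostate compatible with W equally likely microstates has S = k ln W, so S/(k ln 2) = log2 W binary questions.

```python
import math

W = 1024                            # microstates compatible with the macrostate (illustrative)
S_over_k = math.log(W)              # Gibbs entropy in units of k
questions = S_over_k / math.log(2)  # = log2(W) = 10 yes/no questions to pin down the microstate
print(questions)
```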

Maxwell’s demon can be restated to show the link between information and thermodynamic entropy.

Landauer’s idea: “It is only logically irreversible operations — for example, the erasing of a bit to a known state, or the merging of two computation paths — which must be accompanied by a corresponding entropy increase.”

“Applied to the Maxwell’s demon/Szilard engine scenario, this suggests that it might be possible to “read” the state of the particle into a computing apparatus with no entropy cost; but only if the apparatus has already been SET into a known state, rather than being in a thermalized state of uncertainty. To SET (or RESET) the apparatus into this state will cost all the entropy that can be saved by knowing the state of Szilard’s particle.”

Note that erasure here means erasure to a ‘known state’ – this removes some of the confusion about it in many discussions.
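A back-of-the-envelope sketch of the Landauer bound (my numbers, room temperature assumed): erasing one bit to a known state costs at least kT ln 2 of dissipated heat, i.e. k ln 2 of entropy – exactly what knowing the state of Szilard’s particle can save.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K (assumption)

landauer_limit = K_B * T * math.log(2)   # minimum heat per erased bit, joules
entropy_per_bit = K_B * math.log(2)      # entropy increase per erased bit, J/K
print(landauer_limit)    # ~2.9e-21 J
print(entropy_per_bit)   # ~9.6e-24 J/K
```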

Quantum entropy:

“It is well known that a Shannon based definition of information entropy leads in the classical case to the Boltzmann entropy. It is tempting to regard the Von Neumann entropy as the corresponding quantum mechanical definition. But the latter is problematic from quantum information point of view. Consequently Stotland, Pomeransky, Bachmat and Cohen have introduced a new definition of entropy that reflects the inherent uncertainty of quantum mechanical states.”
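For contrast, the standard von Neumann definition referred to above is S(ρ) = -Tr(ρ ln ρ); a small numpy sketch (the density matrices are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # drop numerical zeros
    return float(-np.sum(eigvals * np.log(eigvals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])     # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```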

Units:
“The presence of Boltzmann’s constant k in the thermodynamic definitions is a historical accident, reflecting the conventional units of temperature. It is there to make sure that the statistical definition of thermodynamic entropy matches the classical entropy of Clausius, thermodynamically conjugate to temperature.”

“Most physicists do not recognize temperature, Θ, as a fundamental dimension of physical quantity since it essentially expresses the energy per particle per degree of freedom, which can be expressed in terms of energy (or mass, length, and time).”

“One of entropy’s puzzling aspects is its dimensions of energy/temperature…Entropy can be defined to be dimensionless when temperature T is defined as an energy (dubbed tempergy). Boltzmann’s constant k is then unnecessary.”
http://adsabs.harvard.edu/abs/1999AmJPh..67.1114L
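A small illustration of that point (my own numbers): express temperature as an energy (‘tempergy’ kT) and entropy becomes a pure number, with k reduced to a unit-conversion factor.

```python
K_B = 1.380649e-23   # J/K, only needed because T is conventionally quoted in kelvin
T_kelvin = 300.0

tempergy = K_B * T_kelvin                 # temperature expressed as an energy, ~4.1e-21 J
S_conventional = 2.5e-22                  # some entropy in J/K (illustrative value)
S_dimensionless = S_conventional / K_B    # the same entropy as a pure number
print(tempergy, S_dimensionless)
```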

Entropy may be considered to be a measure of relative energy between two systems, or a system and its environment.

Semantic problems with terms like disorder and chaos (which evoke the opposite of equilibrium) mean that dispersal is now the favored term for describing entropy: energy dispersal allows work to be done, entropy increases (2nd Law), but total energy remains the same (1st Law).
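A standard worked example of dispersal (my illustration): in the free expansion of an ideal gas into a vacuum the internal energy is unchanged (1st Law) while the entropy rises by nR ln(V2/V1) (2nd Law).

```python
import math

R = 8.314           # gas constant, J/(mol K)
n = 1.0             # moles of ideal gas (illustrative)
V1, V2 = 1.0, 2.0   # the volume doubles into a vacuum

delta_U = 0.0                         # no work done, no heat exchanged: energy unchanged
delta_S = n * R * math.log(V2 / V1)   # entropy increase from the energy/matter dispersing
print(delta_U, delta_S)               # 0.0, ~5.76 J/K
```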

“In a study titled “Natural selection for least action” published in the Proceedings of The Royal Society A., Ville Kaila and Arto Annila of the University of Helsinki describe how the second law of thermodynamics can be written as an equation of motion to describe evolution, showing how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. In this view, evolution explores possible paths to level differences in energy densities and so increase entropy most rapidly. Thus, an organism serves as an energy transfer mechanism, and beneficial mutations allow successive organisms to transfer more energy within their environment”.

Lisa Zyga (2008-08-11). “Evolution as Described by the Second Law of Thermodynamics”. Physorg.com. http://www.physorg.com/news137679868.html.

“When written as a differential equation of motion, the second law can describe evolution as an energy transfer process: natural selection tends to favor the random mutations that lead to faster entropy increases in an ecosystem. When written in integral form, the second law describes the principle of least action: motion, in general, takes the path of least energy… Then, the scientists showed how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. As the scientists explain, nature explores many possible paths to level differences in energy densities, with one kind of energy transfer mechanism being different species within the larger system of the Earth…By randomly mutating individuals of a species, various paths are explored in the quest of increasing entropy most rapidly. “

Natural selection allows organisms to evolve toward the highest rate of energy dispersal. Kaila and Annila use the principle of least action to show a connection between increasing entropy and decreasing free energy.

“The idea of using the second law of thermodynamics to describe evolution is not new. As far back as 1899, physicist Ludwig Boltzmann, a great admirer of Darwin, was contemplating the connection. Also, Alfred J. Lotka, in his main work published in 1925, expressed full confidence that biotic systems follow the same universal imperative.”

From the comments:

“life transfers energy much faster than the simple heat conduction/convection that would otherwise be the dominant mode. Every time the wind carries a seed (a condensed packet of energy), or a cheetah sprints across the desert, energy is moving faster than it could by heat dissipation alone.”

The second law of thermodynamics states that the energy of a system tends to even itself out with its surroundings (“a system’s entropy always increases”).

Hypothesis:
If this is applied to information, then not all information can flow between two systems. This means that the entropy measure is relative, i.e. it will vary depending on which system is measuring it. It also takes account of the difference between meaning and information: not all information has meaning to a particular system, and if information has no meaning then no ‘work’ (in the information sense) can be done.
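One way to make this concrete – purely my sketch, using mutual information as a stand-in for ‘information that has meaning to’ the receiving system:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits from a list of (x, y) samples: the part of X's
    information that is actually 'visible' to Y (and vice versa)."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy example: system Y only registers whether X is even, so only part of
# X's information is 'meaningful' to Y; the rest can do no informational work.
samples = [(x, x % 2) for x in range(8)] * 100
print(mutual_information(samples))   # 1.0 bit, out of the 3 bits carried by X
```

On this reading, only the shared (mutual) part of the information can do informational ‘work’ for the receiving system; the rest is simply invisible to it.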

Gell-Mann
Quark and Jaguar
Kolmogorov complexity (AIC): the length of the shortest program that will create a given string (based on an idealized computer with infinite storage capacity).
Subjectivity in AIC: 1. the level of coarse graining when describing a system as a string. 2. the hardware and software of the computer. For very long strings, 1 and 2 become less relevant.
AIC is uncomputable (because the number of theorems is infinite – Chaitin); you can only attach an upper bound.
Most strings are random, and AIC is highest for random strings, though not all random strings have the same AIC.
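AIC itself is uncomputable, but a general-purpose compressor gives a crude upper bound; a sketch (zlib standing in for the ‘idealized computer’, which is exactly the hardware/software subjectivity noted above):

```python
import os
import zlib

def aic_upper_bound(s: bytes) -> int:
    """Crude upper bound on algorithmic information content: compressed length in bytes."""
    return len(zlib.compress(s, level=9))

random_string = os.urandom(10_000)        # incompressible: bound stays near 10,000
regular_string = b"1" * 10_000            # highly regular: bound collapses
print(aic_upper_bound(random_string))     # ~10,000+ bytes
print(aic_upper_bound(regular_string))    # a few dozen bytes
```

Both extremes also illustrate the next point: the random string has no regularities to describe and the all-1s string has only trivial ones, so both have near-zero effective complexity.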

Effective complexity is then related to the description of the regularities of a system by a complex adaptive system that is observing it.
Effective Complexity: “although the algorithmic information content (AIC) for a random bit string is maximal for its length, its effective complexity is zero…at the other end of the scale of AIC, when it is near zero (e.g. a string of 1s), the effective complexity is zero…For high effective complexity AIC must be neither too low or high.”
“Any definition of complexity is necessarily context-dependent”.
The effective complexity depends on the complexity of the observer (or on the past history of message exchange: the more messages exchanged, the more information can be gleaned from the same string). To an observer aware of the whole universe, nothing would count as a random string (there would be no such thing).

“Entropy…depends on the coarse graining – the level of detail at which the system is being described. The entropy of a system described in perfect detail would not increase, it would remain constant.”
In other words, a macrostate is something that is comprehensible to an observer (and has low entropy). Unless the observer is omniscient there will be fewer macrostates than microstates, so any change in a macrostate will more likely result in a higher-entropy microstate.
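A sketch of the coarse-graining dependence (numpy; the ‘system’ is just samples from a toy distribution, my own construction): the same data gives different entropies at different bin widths, growing as the graining gets finer.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=100_000)   # stand-in for microstate data

def binned_entropy(x, bins):
    """Shannon entropy (bits) of the histogram at a given coarse graining."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

for bins in (4, 16, 64, 256):
    print(bins, binned_entropy(samples, bins))   # entropy grows as the graining gets finer
```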

Frontiers of the 2nd Law:
http://mitworld.mit.edu/video/529

1. Adrian Bejan http://constructal.org/

Entropy and natural selection:

Fisher:
The probability of a mutation increasing the fitness of an organism decreases proportionately with the magnitude of the mutation.
Larger populations carry more variation, so they have a larger chance of survival.
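A Monte Carlo sketch of Fisher’s first point, via his geometric model (my implementation and parameter choices): a mutation is a random step of magnitude r in an n-dimensional trait space, and it is beneficial if it lands closer to the optimum – the chance of that falls as r grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def p_beneficial(r, n_traits=20, dist_to_optimum=1.0, trials=20_000):
    """Fraction of random mutations of magnitude r that move the phenotype
    closer to the optimum (Fisher's geometric model, Monte Carlo)."""
    start = np.zeros(n_traits)
    start[0] = dist_to_optimum                  # current phenotype, optimum at the origin
    steps = rng.normal(size=(trials, n_traits))
    steps *= r / np.linalg.norm(steps, axis=1, keepdims=True)   # fix mutation size to r
    new_dist = np.linalg.norm(start + steps, axis=1)
    return float((new_dist < dist_to_optimum).mean())

for r in (0.01, 0.1, 0.5, 1.0, 2.0):
    print(r, p_beneficial(r))   # probability of improvement shrinks as mutation size grows
```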

“In 1922, Alfred Lotka proposed that natural selection might be understood as a physical principle which could be described in terms of the use of energy by a system, a concept that was later developed by Howard Odum as the maximum power principle whereby evolutionary systems with selective advantage maximize the rate of useful energy transformation. Such concepts are sometimes relevant in the study of applied thermodynamics.”

http://en.wikipedia.org/wiki/Maximum_power_principle :

“Neither the first or second law of thermodynamics include a measure of the rate at which energy transformations or processes occur. The concept of maximum power incorporates time into measures of energy transformations. It provides information about the rate at which one kind of energy is transformed into another as well as the efficiency of that transformation.”

“The concept of maximum power can be defined as the maximum rate of useful energy transformation.”
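The biology aside, the rate/efficiency trade-off has a standard electrical analogue (my illustration, not Odum’s formulation): a source with internal resistance delivers maximum power to a matched load, at which point efficiency is only 50%.

```python
# Source with EMF V and internal resistance r driving a load R.
V, r = 10.0, 1.0   # volts, ohms (illustrative values)

def power_and_efficiency(R):
    i = V / (r + R)            # current through the circuit
    p_load = i**2 * R          # rate of useful energy transformation in the load
    efficiency = R / (r + R)   # fraction of total power delivered to the load
    return p_load, efficiency

for R in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(R, power_and_efficiency(R))   # power peaks at R = r, where efficiency = 0.5
```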

“…it seems to this author appropriate to unite the biological and physical traditions by giving the Darwinian principle of natural selection the citation as the fourth law of thermodynamics, since it is the controlling principle in rate of heat generation and efficiency settings in irreversible biological processes.” Odum