The theory (I’ll use the older term, ‘law’) of evolution by natural selection differs from other scientific laws in that it describes a mechanism for the emergence of order in living things without a designer: not a particular rule, but a mechanism that self-selects rules. If the idea of evolution by natural selection is extended rigorously (not merely by analogy) to all interacting systems, not just living ones, then we would have a law that explains how other laws emerge: a law of laws.
The law of evolution by natural selection is such a fundamental idea that it would be surprising and unusual if it applied only to living things. Perhaps it could even be generalized to explain the emergence of all things, including the physical laws that make them. Given that human beings are part of the universe we attempt to understand, the idea that there might be a law that itself governs the laws of the world we live in, and which resides within the universe itself rather than needing an external force such as a designer, would be the only possible scientific way of describing the world metaphysically. In other words, we can’t describe the box we are in any more than we can imagine what is beyond the universe, unless the design of the box is within it.
There are two things required for evolution by natural selection, amongst living things: imperfect inheritance and a constrained environment.
Both the organism and the environment are systems; one happens to be nested within the other, and the organism is by definition an open system. This nesting can be recursive, so the environment can also be an open system (e.g. mitochondria within the environment of a cell, within the environment of a brain, within the environment of a human, etc.).
Inheritance implies reproduction of an organism from one generation to the next. In terms of information flow this is a high-level effect. What if there were a general law that applied to any information flow? I.e. laws would emerge via information flow with imperfect fidelity (noise) and imperfect storage (an open system).
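As a toy illustration of this framing, here is a minimal sketch (all names and parameters are my own invention, not from the text): binary ‘genomes’ are copied with imperfect fidelity, and a fixed target string plays the role of the constrained environment.

```python
import random

def mutate(genome, rate, rng):
    """Imperfect inheritance: copy a genome, flipping each bit with probability `rate`."""
    return [b ^ 1 if rng.random() < rate else b for b in genome]

def fitness(genome, target):
    """The constrained environment: count positions that match a fixed target."""
    return sum(g == t for g, t in zip(genome, target))

def evolve(target, pop_size=50, rate=0.02, generations=200, seed=0):
    """Selection plus noisy copying is enough for heritable order to accumulate."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in target] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, target), reverse=True)
        survivors = pop[: pop_size // 2]  # the environment culls the worse half
        pop = [mutate(rng.choice(survivors), rate, rng) for _ in range(pop_size)]
    return max(fitness(g, target) for g in pop)

print(evolve([1, 0] * 16))  # best fitness climbs well above the random baseline of ~16/32
```

Nothing here is specific to biology: the genome is just stored information, and mutation is just a noisy copy channel.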
Now we have a model consisting of interconnected systems with variable isolation and noise.
It is possible to imagine that the variables for noise and isolation are related. For example, consider a particular open system as having an imperfect membrane rather than a direct input channel. This membrane could have its permeability adjusted so that it allowed information to flow in and out, but altered the information in unpredictable ways as it passed through. Normally we would consider the mechanism within the membrane to be performing a different, predictable manipulation of the input; however, the model can be further simplified so that there is only one type of component, ‘information processing tubes’, wired up to create an information processing entity.
To summarize: randomly created, interconnected systems, separated by membranes with variable parameters for the quantity and quality of information flow between them, will eventually lead to persistent structures (heredity) that obey evolution by natural selection.
These variably permeable ‘membrane’ systems can be shown to be the result of any non-deterministic cellular automaton operating under a set of rules that allows it to represent a Universal Turing Machine (the non-determinism allows for cycling through all possible rules).
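A hedged sketch of what a non-deterministic cellular automaton could look like, assuming an elementary CA (Rule 110, which is known to be Turing-universal) with each output bit flipped with small probability as the noise source; the structure and parameters are mine, chosen only for illustration:

```python
import random

def step(cells, rule, flip_prob, rng):
    """One update of an elementary CA on a ring; each output bit is
    flipped with probability `flip_prob` (the non-determinism / noise)."""
    n = len(cells)
    out = []
    for i in range(n):
        # 3-cell neighborhood encoded as a number 0..7, looked up in the rule table
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        bit = (rule >> neighborhood) & 1
        if rng.random() < flip_prob:
            bit ^= 1  # noisy 'membrane': the output is perturbed unpredictably
        out.append(bit)
    return out

rng = random.Random(1)
cells = [0] * 64
cells[32] = 1  # a single seed cell
for _ in range(100):
    cells = step(cells, rule=110, flip_prob=0.01, rng=rng)
print(sum(cells))
```

Without the noise term this is a deterministic rule; with it, the same wiring explores states no fixed rule would reach, which is the intuition the text appeals to.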
Any information flow between two systems where there is noise will produce random effects, making it non-deterministic. Since it can be shown that there is noise between all sub-systems that interact within the same enclosing system (e.g. the universe), the necessary uncertainty and probabilistic nature of information exchange at the quantum level means that these ‘variably permeable’ membranes are a property of all information flow.
Evolution by natural selection is a high level manifestation of the mechanism which governs all information flow and all interaction in the universe. This mechanism is a property of mathematics, rather than an empirically observed phenomenon.
Candidate laws: Where information can be exchanged, it will tend to be exchanged. (Note, this cannot be phrased as ‘will be exchanged’, since if the universe is non-deterministic, then one message from the range of candidates will be exchanged at a time.)
Time is neither continuous, nor absolute, it is a measure of the rate of change of state between systems. Without message exchanges between systems there is no flow of time.
Information is relative: A bit can comprise multiple bits when viewed by another system.
A bit is a measure of the relative similarity between two systems.
This way of looking at things removes the paradox where a system of maximal disorder is said to contain the most information.
For example, a universe whose state is represented by 1111….1 is very highly ordered but contains no information, whereas a universe with maximal entropy, perhaps represented by a single prime, 101110….1, could not be determined to be prime, and therefore no accurate information about it could be extracted.
At the universal level, a state of maximal disorder could not be ‘understood’ by any other subsystem (since there would be only one system, containing all the bits). In most cases, messages with very high information content are indistinguishable from noise (music, taste, etc.).
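The relativity point can be made concrete with the standard Shannon measure, which only counts symbol frequencies. A crude sketch (the helper name is mine) showing that an all-1s string scores zero, while the equally well-ordered string 1010…10 scores a full bit per symbol; frequency-based entropy alone cannot see structure or decodability, which is why a relative notion is needed:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Empirical Shannon entropy of a string, in bits per symbol,
    computed purely from symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return sum(c / n * math.log2(n / c) for c in counts.values())

print(shannon_entropy("1" * 64))   # 0.0 — perfectly ordered, zero bits per symbol
print(shannon_entropy("10" * 32))  # 1.0 — also perfectly ordered, yet maximal by frequency count
```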
To cover:
Modulated flow and learning: the tessellated sun sphere and the night/day rotation of the earth to create syncing.
Modulation around deep sea vents, based upon the hot/cold boundary and bumping in and out of it.
Aha: we can see between one inertial frame of reference and another through effects such as clock ticking, contraction of length, etc., but if this happens at the fundamental level of information exchange then things literally pop from one system to the other. (I’m not sure how to explain what I mean here.)
I.e. information is of three sorts:
Potential information: information that could potentially be exchanged from system c in future if system a interacts with system b in such a way that it would be visible to system a. This possibly implies that system b can interact with systems c and a, but a and c have no potential interaction to start.
Noise: noise represents the aggregate sum of all information that can be exchanged between a and b (where b is the entire background, or the environment), provided that the interaction between a and b allows a to decode it (i.e. at each step it flows into the adjacent possible state in such a way that some can be decoded without interaction with a third system).
Information: that which can be decoded from b to a, directly.
A theory of how the sum of all possible interactions can represent noise.
Entropy is measured in bits, or in joules per kelvin (where kelvin represents the total number of unobserved bits).
in other words the presence of other systems necessarily reduces the bit efficiency.
The energy per bit increases.
Since energy is conserved but entropy always increases, the number of bits always increases; therefore the efficiency of information storage decreases over time. The 2nd law is emergent.
Apparent increase in complexity is really driven by necessary decrease in information storage efficiency.
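The conversion between the two units mentioned above is fixed by Boltzmann’s constant: one bit corresponds to k_B ln 2 in J/K, and Landauer’s principle gives the minimum energy cost of erasing a bit at temperature T. A small sketch (function names are mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition since 2019)

def entropy_joules_per_kelvin(bits):
    """Convert an entropy measured in bits to thermodynamic units: S = n * k_B * ln 2."""
    return bits * K_B * math.log(2)

def landauer_limit(temperature_kelvin):
    """Minimum energy (J) dissipated to erase one bit at a given temperature."""
    return K_B * temperature_kelvin * math.log(2)

print(entropy_joules_per_kelvin(1))  # ~9.57e-24 J/K per bit
print(landauer_limit(300))           # ~2.87e-21 J at room temperature
```

This is the standard bridge between the informational and thermodynamic readings of entropy; how it interacts with the ‘relative’ entropy of these notes is the open question raised below.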
Note that this page refers to entropy as being a relative measure. How does one tie this in with the notion of bit efficiency?
Also: how does one form a theory of natural selection in terms of information theory?
NB: I don’t understand Smolin’s argument about fitness and black holes. Perhaps our universe is an early and less fit example.
Can information be destroyed? What are the semantic problems of information/bits/knowledge?
Clearly there is a problem because, from the above, information can be stored in bits but bits do not equal information.
Similarly there is the seeming paradox that maximum disorder equals maximum information.
Perhaps this isn’t a paradox, except to say that a state of maximum entropy is indistinguishable from a state containing no ‘information’, whereas a low entropy state contains at least one bit. (That bit is relative to its surroundings?)
Need to figure out what the relative notions are.
Do we need to measure the entropy of the box, or of the observer?
Is entropy a measure relative to the environment, or relative to another system, or indeed two systems relative to their environment?
There is clearly a maximum bit efficiency, but perhaps this is relative to 2 systems and appears to have a fixed limit for us relative to our ‘universe’ and its laws.
Dispatches from the Culture Wars: Rosenhouse on ID and Information
We saw how this game of the perpetually moving goalposts was played in the recent exchanges between Michael Egnor and practically everyone else. First he demands a specific measurement of the amount of information in the genome (something that can only be done using the Shannon definition of information and deriving a formula from it), but then claims that Shannon measurements are meaningless because they don’t measure the specified nature of the information. Heads I win, tails you lose.
I.e. ID proponents mix meaning with information.
“information” is construed as inversely proportional to probability. High information content is correlated with a low probability of occurrence. In this notion, Meyer argued, any lengthy string of symbols can be viewed as containing a large amount of information, since it is only one of a very large collection of possible strings of symbols.
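The Shannon notion quoted here is just the surprisal I(x) = -log2 p(x): improbable outcomes carry more bits. A one-line sketch, using an arbitrary example (my own) of a 100-symbol string over a 4-letter alphabet with all strings equally likely:

```python
import math

def surprisal_bits(p):
    """Shannon self-information: I(x) = -log2 p(x). Rarer events carry more bits."""
    return -math.log2(p)

# Any specific 100-symbol string over a 4-letter alphabet, all strings equally likely:
p = (1 / 4) ** 100
print(surprisal_bits(p))  # 200.0 bits — large only because it is one string of 4**100
```

This makes Meyer’s point precise, and also its weakness: the measure assigns 200 bits to every such string, meaningful or not.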
Temperature/pressure are a measure of how connected one system is with surrounding system(s).
A closed boundary creates this.
All living systems have closed boundaries to channel energy.
I suspect we need to further qualify energy not just as being free, but as being directed.
Directed means communication from 1 system to another.
Need to create an abstract systems-based model of natural selection and describe evolution in terms of information flow (or possibly, more accurately, syncing).
Information will always transmit itself.
Systems will self configure over time to maximise this transmission, even if the information needed to build the transmitter is large.
The universe tends to eradicate difference, over time.
Entropy is a silly term in this context because it raises all sorts of paradoxes.
What causes the ripple effects of state changes to keep on going and not peter out?
(If the ‘rules’ of the system (ultimately the laws of physics) change, then all possible scenarios will operate; or, if the universe were rather like a universal Turing machine built from a simple CA that is universal, then it will contain all other rules nested.)
How does the above not invalidate the 1st law (conservation of energy)?
Look at Michael Frank’s notes re wave functions (expressed rather like fractals, as complex numbers) as being deterministic; only when they collapse to numbers is there a probability.
Perhaps we need to look at info theory with symbols that are complex numbers in some form?
For transactions to go viral there needs to be growth in the system at each step, i.e. not a zero-sum game: there is profit. For gift giving, both donor and recipient feel good, although there is zero sum in terms of money; clearly measuring the emotional currency gives a better measure. Compare this to the (Niels Bohr Institute) model of the currency of fashion, which is based upon the mindshare an item has and perceived requirement rather than actual requirement, in a supply and demand algorithm.
You cannot measure the speed of a beam of light in any other direction than straight at you or a detector.
Fractional dimensions in string theory:
“There are physically interesting fractals embedded in 2D which are related to conformal field theory, which in turn is relevant for string theory. I once observed that fractal dimensions of several interesting objects in 2D, e.g. percolation cluster (D = 91/48), percolation hull (D = 7/4), self-avoiding walks (D = 4/3) and red links (D = 3/4) fit into the magic formula
D = (100 - n^2)/48, n integer.
This follows immediately from Kac’s formula for c = 0 with half-integer values, using D = 2 - 2h. I later learned that Hubert Saleur had found the same formula a few months before me (it was really quite obvious at the time), but it nevertheless paid for a four-year postdoc.
There is of course a glaring limitation with this formula: it only works for fractals embedded in 2D (like a string theory worldsheet), but the physically most interesting fractals live inside 3D. This observation got me started on higher-dimensionalization of the algebraic structures of string theory.”
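The quoted ‘magic formula’ is easy to sanity-check: each listed dimension should come from an integer n via D = (100 - n^2)/48. A quick sketch (using exact rational arithmetic) recovering n for each case:

```python
import math
from fractions import Fraction

# The four fractal dimensions quoted in the text, as exact rationals.
dims = {
    "percolation cluster": Fraction(91, 48),
    "percolation hull":    Fraction(7, 4),
    "self-avoiding walk":  Fraction(4, 3),
    "red links":           Fraction(3, 4),
}
for name, d in dims.items():
    n2 = 100 - 48 * d                 # from D = (100 - n**2) / 48
    n = math.isqrt(int(n2))
    assert n2.denominator == 1 and n * n == int(n2)  # n must be a whole number
    print(f"{name}: D = {d}, n = {n}")
```

All four dimensions do land on integers (n = 3, 4, 6, 8), consistent with the quote’s claim.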