By Peter Gransee

Friday 11/24/06 PM


I began to have a problem, from an information standpoint, with the idea of "macro" evolution. I figured that if this type of evolution could produce people from goo, then a random number generator would eventually produce the operating code for a sentient machine.


I found, though, that without an appropriate fitness test, the machine would never indicate whether a particular code was the correct one. In fact, the fitness test had to be so complex that it essentially contained the equivalent complexity of the desired final code. As they say, "the solution was contained within the full description of the problem." Btw, this concept also comes up in the field of data compression.
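As a toy sketch of this point (the target string, alphabet, and mutation rate here are my own choices, in the spirit of the well-known "weasel" program): a random-mutation search can only converge because its fitness test already contains the complete answer.

```python
import random
import string

# Toy illustration: the fitness test must contain the full target,
# so "the solution is contained within the description of the problem".
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def fitness(candidate):
    # The only way to score a guess is to compare it to the whole target.
    return sum(a == b for a, b in zip(candidate, TARGET))

random.seed(1)
best = "".join(random.choice(ALPHABET) for _ in TARGET)
while fitness(best) < len(TARGET):
    # Mutate each character with 5% probability; keep any non-worse trial.
    trial = "".join(
        random.choice(ALPHABET) if random.random() < 0.05 else c
        for c in best
    )
    if fitness(trial) >= fitness(best):
        best = trial
print(best)
```

The search "works", but only because `fitness` is handed the entire 28-character answer up front.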


In terms of information theory, I have a simple receiver and a simple transmitter. The transmitter spits out random code and the receiver applies simple rules. But then the only thing the receiver ever recognizes is simple codes. For an example of this in action, find your phone number in the digits of Pi (http://pi.nersc.gov/).
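Here's a rough sketch of why a simple receiver only ever finds simple codes (random digits stand in for Pi here, and the patterns are arbitrary examples of my own):

```python
import random

# Scan a stream of random digits for patterns of increasing length.
# Each extra digit makes a given pattern roughly 10x rarer, so short
# patterns turn up almost immediately while long ones need far more data.
random.seed(0)
digits = "".join(random.choice("0123456789") for _ in range(100_000))

for pattern in ["7", "42", "1776", "8675309"]:
    pos = digits.find(pattern)
    if pos >= 0:
        print(f"{pattern}: found at digit {pos}")
    else:
        print(f"{pattern}: not found in {len(digits)} digits")
```

A k-digit phone number needs on the order of 10^k digits of stream before it is likely to show up at all.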


And trying to express the receiver in another way was simply hiding the nut under a different shell. For example, splitting the reception into steps, with each step being tested against simple rules (survival of the fittest, etc.). But this required a more complex receiver system and a carefully designed method of interdependence for each node. There again, the complexity of the setup was similar to that of the final code.


No matter how I tried to reason through this, I could not see how a simple code, replicated massively, allowed to combine under a simple set of rules, and tested for fitness using a simple test, would produce my complex code. The system would only ever produce, at most, the same amount of complexity put into it.


It was not just that the probability was very small that a simple closed system could produce more complexity than it started with; it was evidently impossible. Up to this point, I had figured the type of evolution that made people from goo was remotely possible but just very unlikely.


Maybe there is a way. Anyone is welcome to try this for themselves. :)


I have found a site that provides a much better description of the problem with information creation:





When people talk about evolution, they may be talking about different things. There is a difference between adaptations constrained within a "kind" and adaptations that cross the "kind" barrier to form new kinds (a kind can contain multiple species). 


We have observed finches adapting to their environment with different beak variations. Some think this is due to new complexity being spontaneously formed in the genome, while others see it as flipping the switch for complexity already stored in the genome.


I have yet to see proof of a genome gaining complexity through natural means. It may be possible; I accept that. I suspect that the complexity would come from the larger system and not directly from nothing. :)


In the popular media, we sometimes see a person labeled as a quack because they disagree with evolution. The problem is that not everyone is using the same meaning for the word "evolution". In the popular sense, it is the process explaining how we all got here: the process by which everything evolved from the primordial soup. But in the scientific sense, it means any adaptation. So if evolution means "any adaptation", then there is proof that can be independently verified by non-biased observers. For example, people adapt in their thinking to new situations, so you could say our thinking evolves. But I have yet to see any scientific proof that everything evolved from the primordial soup, which is a giant leap from adaptation since it requires new complexity. So yes and no can both be reasonable answers, depending on what the questioner is actually asking.


So the popular idea of evolution has a narrow and a broad meaning. One thing that bugs me is when some people use the broad meaning willy-nilly and then, when you call them on it, claim they were using the narrow meaning. Own up to it!


Btw, I distinguish between complexity and information elsewhere on my site.



Obviously, I am not an expert on evolution. I dabble in it mostly from an information perspective. And from that perspective, I have a problem with it.




Pleiotropic scaling of gene effects and the 'cost of complexity' (Nature.com)

Addresses the issue of the declining ratio of adaptation to complexity by noting that adaptation tends to be limited to related characteristics.


However, this favors even more the idea of a coordinated plan of adaptation. I have yet to find evidence that shows how complexity in the Universe arises from nothing.



From a recent post I made on Slashdot about simple systems producing complex systems:


To be really picky for the math lovers, I actually don't think pi or the Mandelbrot plots are as complex (in the true meaning of complex) as some people think. They might appear complex because they unpack into a very large system with a lot of points. But a good test of complexity is to apply data compression. When you compress a string of data to its smallest size, you can more clearly see its true complexity (measured in size). Compressing a set doesn't change its complexity; otherwise the compression would be "lossy". If we train a system to apply various formulas to a very long string of numbers and one of the formulas produces the same result, then the formula is interchangeable with the long string. Of course you need time to parse and enough memory to keep notes, but eventually you find that the true complexity of pi is the smallest script that can produce it.
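Here's a minimal sketch of that compression test using Python's zlib (the two inputs are my own made-up examples): a large but trivially patterned string collapses to almost nothing, while noisy data barely compresses at all.

```python
import random
import zlib

# Two 100,000-byte strings: one generated by a trivial rule, one random.
random.seed(0)
patterned = b"0123456789" * 10_000
noisy = bytes(random.randrange(256) for _ in range(100_000))

for name, data in [("patterned", patterned), ("noisy", noisy)]:
    packed = zlib.compress(data, 9)
    # Lossless compression preserves the content exactly, so the packed
    # size is an upper bound on the data's "true" complexity.
    print(f"{name}: {len(data):,} bytes -> {len(packed):,} bytes")
```

In the same spirit, the apparent complexity of pi's digits collapses to the size of the smallest script that generates them.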


See Wolfram's "crackables".


The alternative is to accept an incongruity: we say that both sides of the equation are equal while also saying that one side is more complex.


You were saying that you found proof that a simple system can produce a complex system. This means less complexity producing more complexity without any donations of complexity from outside systems. Not only does the Mandelbrot set not help you, I don't see any other example that does.


You mentioned a "window" that allows a fraction of the whole to be viewed. I think you were possibly referring to imaginary perspective and not mathematics. In math, you either work with all the data or a fraction of it. If you work with fractions of it, then unless you fudge the numbers, you cannot produce the same exact result as a formula that works with the whole.


Therefore, we reach another incongruity. Either your window is equal in complexity to the system it is viewing, and your claim that a simple system produced a complex system still wants for proof; or the window represents only a fraction of the entire system, and therefore the window is not equal to the whole complex system.


For one system that is not equal to another system to somehow become equal to that system, it must gain the information it is missing, either from the other system or from somewhere else. Either way, the simple system must borrow from other systems and therefore can't be said to solely "produce" the complex system. Once again, we have a shell game or a simple admission of spontaneous creation of complexity.



I understand there are at least two definitions of complexity: the amount of data it takes to represent something in its most compressed form, and the number of computational steps or cycles it takes to produce the full product (or, said another way, to completely uncompress it). Both have hard limits that cannot be exceeded, and things tend to get increasingly difficult as we get closer to those limits.
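A toy illustration of the two notions side by side (the generating expression is my own example): a roughly fifty-character description that unpacks into a million characters, so the compressed size is tiny but expanding it still takes real work.

```python
# (1) Description size: the length of the shortest generator we know of.
# (2) Computational depth: the steps needed to unpack the full product.
program = "''.join(str(i % 10) for i in range(1_000_000))"
output = eval(program)  # expanding the description takes a million steps

print(len(program), "characters of description")
print(len(output), "characters of product")
```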



It bothers me when a great scientist like Steven Pinker says, as he did recently on BookTV, "today, no scientifically literate person can believe that the events narrated in Genesis literally took place". The idea of literacy is that a person has been exposed to and can recall literature. He is saying, in effect, that being exposed to science would automatically sterilize a person of such theistic notions, and he therefore seems to imply that science has provided proof contrary to the Bible. It is understandable that a great scientist would also have some opinions. The problem I have is when these opinions are presented as proven, especially when presented by a person who normally deals with proof. I have a lot of respect for Mr. Pinker and as a result have a hard time dealing with this incongruity in what he says.


The hill climbing algorithm uses a relatively simple set of rules to find a local optimum. However, it will not traverse a valley between peaks. Now, the rules behind "survival of the fittest" may be significantly more complex and able to traverse valleys, but the fact that they seem to rely on mutation suggests otherwise.
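A minimal hill climber on a made-up two-peak landscape shows the valley problem: starting on the wrong side, it tops out on the low peak and never crosses over to the high one.

```python
def f(x):
    # Two peaks: a low one (height 4) near x=2, a high one (height 10) near x=8.
    return max(4 - (x - 2) ** 2, 10 - (x - 8) ** 2)

def hill_climb(x, step=0.1):
    # Move uphill in small steps; stop when neither neighbor improves.
    while True:
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            return x

peak = hill_climb(0.0)
print(round(peak, 1), round(f(peak), 1))  # stuck on the low peak near x=2
```

"Survival of the fittest" plus small mutations behaves much like these small steps: each move must be an immediate improvement, so valleys between designs are hard to cross.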

I see more of a problem with mutation than with the algorithm that optimizes/discards each mutation. If mutations occur primarily in the DNA/RNA/etc., then small changes are more likely to succeed. Large changes are noisier and therefore more likely to fail. Interestingly, many useful adaptations can be accessed with just a small change to the DNA/RNA.

Regardless of how large the change is, there is tremendous pressure for each generational step to provide some sort of benefit (btw, we haven't always been very accurate in determining benefit: see former "vestigial organs" like the appendix). This limits the complexity of new structures. Therefore an evolutionary explanation for any complex structure (yes, the eye) must assume both reasonably sized steps and that each intermediate step is sufficiently beneficial. I realize there are a lot of unknowns in all of that, and it is possible to have different perspectives on what we can see so far.

To me, it appears that organisms have their current expression and many possible adaptations available as the seasons change, normal environmental cycles occur, etc. This means that their toolkit of adaptations must intelligently anticipate all the possible changes that can happen normally in the environment. However, apocalyptic or man-made changes are so far out of the norm that the species often just dies off.

That species die off in connection with large changes in the environment, while we continue to see constant examples of adaptation, suggests that the encoding for each species prefers smaller changes (to put it mildly).

I would go further and suggest that we will find the toolkit of adaptations for each organism is curiously optimized for a normal environmental range. This could imply that the allowed adaptations are intelligently optimized and that gross mutations fail because they are noisy and less intelligent.