When you run a simulation of a world of Creatures where the DNA mutation rate (from one generation to the next) is encoded in the DNA itself, you would typically expect the mutation rate to go down over the generations. This would be a sign that the population has found a reasonable fit with its environment and is better off not changing too much from one generation to the next.
In other words, you would expect any such experiment to lead to either:
But sometimes none of those outcomes occurs!
To try Evo(rand) on a very simple simulation, I used the following configuration (in broad terms):
With a bit of tuning, the expected behaviour was easily reached: a population of surviving creatures moving around.
But to my surprise, I discovered that the mutation rate stayed pretty high. It was DNA-encoded and allowed to vary between 0.5% and 20%, and the average over the population stayed at 15%. Why? The situation got even stranger when I saw that with a mutation rate allowed to vary between 0.5% and 100%, the average stayed at 100%, and the population was larger than with a maximum mutation rate of 20%: the less they specialise, the better they survive...
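To make "DNA-encoded mutation rate" concrete, here is a minimal sketch of the reproduction step under the assumptions described above: the genome is a bitstring, some gene in it decodes to a per-bit flip probability clamped to the allowed range, and each bit of the offspring is flipped independently with that probability. The function names and the toy decoding are my own illustration, not the actual Evo(rand) code.

```python
import random

MIN_RATE, MAX_RATE = 0.005, 0.20   # the 0.5%-20% bounds of this experiment

def reproduce(genome, decode_rate):
    """Copy a parent genome, flipping each bit with the probability that
    the genome itself encodes. decode_rate() stands in for whatever reads
    the mutation-rate gene out of the DNA."""
    rate = min(MAX_RATE, max(MIN_RATE, decode_rate(genome)))
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

# Toy decode: use the fraction of 1s in the first 28 bits as the rate
# (purely illustrative; see the note on the 28-bit gene further down).
toy_decode = lambda g: sum(g[:28]) / 28

parent = [random.randint(0, 1) for _ in range(400)]
child = reproduce(parent, toy_decode)
print(sum(p != c for p, c in zip(parent, child)), "bits flipped")
```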
Graph of the average mutation rate and total number of creatures, depending on the allowed maximum mutation rate:
We see that the peak population is reached when the maximum allowed mutation rate is 100%: the entire population maintains its mutation rate at 100%, which means that the 0s and 1s of its DNA are systematically inverted from one generation to the next.
Why? Because the default behaviour is the best adapted. Indeed, if all the Creatures start to adopt a behaviour like "I go where there are fewer Creatures", they all end up going to the same place at the same time: the least dense Ground all of a sudden becomes the most dense one. In other words, what could be the best survival strategy for a given Creature is actually detrimental to the whole population. The best survival strategy for the population as a whole is to keep moving randomly, which is exactly what the default behaviour does.
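A toy simulation makes the crowding effect easy to see: if every creature applies the rule "move to the cell with the fewest creatures" against the same snapshot of the world, then after one step the whole population has piled into what used to be the emptiest cell. The grid and counts below are illustrative, not the actual world.

```python
import random
from collections import Counter

CELLS = 5
population = [random.randrange(CELLS) for _ in range(1000)]
print("before:", sorted(Counter(population).items()))

# Every creature looks at the same snapshot and picks the least crowded cell.
counts = Counter(population)
least_crowded = min(range(CELLS), key=lambda c: counts[c])
population = [least_crowded for _ in population]

print("after: ", sorted(Counter(population).items()))
# The formerly emptiest cell now holds the entire population.
```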
If the maximum allowed mutation rate is below 100% (for example 50%), from time to time the mutations give a Creature a non-empty Program and a behaviour different from the default one. If that behaviour is "I go where there are fewer Creatures", it is actually a good strategy as long as not all Creatures in the vicinity share it. Such a behaviour survives only a few generations, so only a small percentage of the population can maintain this kind of locally adapted behaviour. This portion of the population has the lowest mutation rate (0.5%).
Graph of the distribution of the mutation rate among the population, between the minimum allowed value (0.5%) and the maximum (50%):
This is why the number of Creatures is only 1250, lower than when the maximum allowed mutation rate is 100%. When the maximum allowed mutation rate is 100%, it is possible to find a DNA sequence where neither that sequence nor its bitwise complement (every bit inverted) leads to any Action, thus maintaining the default behaviour with better reliability: the mutation rate remains at 100% and the population, being better adapted, is larger.
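To spell the argument out: a 100% mutation rate is deterministic, so a lineage simply alternates between a sequence S and its complement ~S forever. If both decode to no Action, the default behaviour is guaranteed in every generation. The check below assumes a hypothetical decode_actions() function standing in for the real DNA-to-Program decoding, plus a toy stand-in just to show the idea.

```python
def complement(genome):
    """With a 100% mutation rate the offspring is the exact bitwise
    complement of the parent, and the grandchild is the parent again."""
    return [bit ^ 1 for bit in genome]

def always_default(genome, decode_actions):
    """True if both the genome and its complement decode to no Action,
    i.e. the lineage can never leave the default behaviour.
    decode_actions() is hypothetical: it stands for whatever maps a DNA
    sequence to a list of Actions in the real simulation."""
    return not decode_actions(genome) and not decode_actions(complement(genome))

# Toy stand-in: pretend a genome triggers an Action only if it contains
# the pattern 1111 (purely illustrative).
toy_decode = lambda g: ["move"] if "1111" in "".join(map(str, g)) else []

s = [0, 1, 0, 0] * 100                # a 400-bit genome
print(always_default(s, toy_decode))  # True: neither S nor ~S contains 1111
```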
However, if the minimum mutation rate is lower, the Creatures can evolve towards a more stable DNA, thus increasing the probability of executing the default behaviour from one generation to the next. With a minimum mutation rate of 0.01%, we get the following distribution:
And the total population obtained is:
The apparent failure of the Creatures to adapt to their environment was due to the minimum mutation rate being too high. With the minimum set to 0.001%, a population size of 2200 is reached, whatever the maximum mutation rate is set to between 10% and 100%. Fitness is then reached in all cases, but it relies purely on the default behaviour.
Conclusion: if the mutation rate remains high, there is something wrong in your configuration. Do not doubt the power of evolution...
On a DNA of 400 bits (the average size in the above experiments), a mutation rate of 0.5% will only invert 2 bits on average. If the best fitness is the default behaviour, why is a mutation rate of 50% better than 0.5%?
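The arithmetic behind the "2 bits" figure, plus a detail it hides: even at 0.5%, the genome is rarely copied completely unchanged. (Numbers computed for a 400-bit genome.)

```python
n = 400
for rate in (0.005, 0.5):
    expected_flips = n * rate              # average number of inverted bits
    p_unchanged = (1 - rate) ** n          # probability that no bit flips at all
    print(f"rate {rate:>5.1%}: ~{expected_flips:.0f} bits flipped per generation, "
          f"P(genome copied exactly) = {p_unchanged:.3g}")
```

At 0.5% the genome is copied with no change only about 13% of the time; at 50% essentially never.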
The mutation rate (if present as a gene in the DNA) is encoded in 28 bits:
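The exact layout of those 28 bits is not reproduced here. Purely as an assumption for illustration, one way a 28-bit gene could be mapped to a rate within the allowed bounds is to read it as an integer and rescale it linearly; the sketch below is mine, not the simulation's actual encoding.

```python
MIN_RATE, MAX_RATE = 0.005, 1.0   # example bounds

def decode_rate_28(bits):
    """Hypothetical mapping of a 28-bit gene to a mutation rate: read the
    bits as an integer in [0, 2**28 - 1] and rescale into [MIN_RATE, MAX_RATE]."""
    assert len(bits) == 28
    value = int("".join(map(str, bits)), 2)
    return MIN_RATE + (value / (2**28 - 1)) * (MAX_RATE - MIN_RATE)

print(decode_rate_28([1] * 28))   # 1.0   (the maximum)
print(decode_rate_28([0] * 28))   # 0.005 (the minimum)
```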
Considering that the best fitness is to perform the default behaviour, would a high mutation rate increase the probability that a given DNA sequence mutates either to a sequence with no associated behaviour, or to one with a non-conclusive behaviour (a condition rule that always fails, thus falling back to the default behaviour)? I guess yes, but the maths are too complex for me to confirm it.
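The question can at least be explored numerically without hard maths: take a "silent" genome (one that yields the default behaviour), mutate it many times at different rates, and count how often the child is still silent. The sketch below only illustrates the method; is_silent() and the toy rule are stand-ins for the real DNA-to-Program decoding, so the numbers it prints say nothing about the actual simulation.

```python
import random

def estimate_p_silent(parent, rate, is_silent, trials=10_000):
    """Monte Carlo estimate of the probability that one generation of
    mutation at `rate` leaves the genome 'silent' (default behaviour).
    is_silent() is hypothetical: it stands for the real decoding."""
    hits = 0
    for _ in range(trials):
        child = [b ^ 1 if random.random() < rate else b for b in parent]
        if is_silent(child):
            hits += 1
    return hits / trials

# Toy stand-in rule: a genome is "silent" unless it contains six 1s in a row.
toy_is_silent = lambda g: "111111" not in "".join(map(str, g))

parent = [0, 1] * 200   # a 400-bit genome, silent under the toy rule
for rate in (0.005, 0.5, 1.0):
    print(rate, estimate_p_silent(parent, rate, toy_is_silent))
```

Under this toy rule the 100% rate is the safest of all, because the child is the exact complement of a silent parent, which echoes the argument made earlier; whether the same holds for the real encoding is exactly the open question.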
Another surprise comes when testing the mutation range 0.001% - 100%. In this case the population reaches 2500, with 40% of the population at a mutation rate of 100% and the rest at 0.001%. Why? Again, some insane maths and stats may be needed to understand this.
And as if that were not enough, if the maximum is set to something between 98.5% and 99.4%, something even more surprising happens: a population that reproduces like crazy at night!
Unfortunately I will not have the time (nor probably the knowledge) to investigate those observations further.