Even though a lot of mathematics is needed to support the idea of Gaussian Adaptation (GA), the idea itself is very simple to understand. In principle it is like trying to inscribe a circle in a triangle, which is a simple operation with a pair of compasses and a ruler.
But imagine that we have to use some algorithm instead. For instance, suppose that we have a large number of circular pasteboard discs, each sized to be perfectly inscribed in a certain triangle, and that we have a starting point #1 somewhere inside the triangle. Now take a disc and place its centre of gravity on starting point #1. Then cut away the parts of the disc falling outside the triangle and determine the centre of gravity of the remaining part of the disc, point #2. In the next cycle of iteration, take a new disc and place its centre of gravity on point #2. If this cycle is repeated many times for centres of gravity #3, #4 and so on, I am convinced that the algorithm will end up with a perfectly inscribed disc. But of course this is not a very efficient method. And it is not a random method like evolution.
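The pasteboard iteration can be sketched numerically. This is a minimal sketch under assumptions of my own: the triangle vertices are arbitrary illustrative values, and the disc is approximated by a dense grid of points whose centre of gravity stands in for that of the cut pasteboard.

```python
import numpy as np

# Illustrative triangle (hypothetical vertices, not from the text).
A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([1.0, 3.0])

def inside(pts):
    """Vectorised point-in-triangle test via edge sign checks."""
    d1 = (B[0]-A[0])*(pts[:, 1]-A[1]) - (B[1]-A[1])*(pts[:, 0]-A[0])
    d2 = (C[0]-B[0])*(pts[:, 1]-B[1]) - (C[1]-B[1])*(pts[:, 0]-B[0])
    d3 = (A[0]-C[0])*(pts[:, 1]-C[1]) - (A[1]-C[1])*(pts[:, 0]-C[0])
    return ((d1 >= 0) & (d2 >= 0) & (d3 >= 0)) | ((d1 <= 0) & (d2 <= 0) & (d3 <= 0))

# Disc radius = incircle radius of the triangle (area / semiperimeter).
a, b, c = np.linalg.norm(B - C), np.linalg.norm(C - A), np.linalg.norm(A - B)
area = abs((B[0]-A[0])*(C[1]-A[1]) - (C[0]-A[0])*(B[1]-A[1])) / 2
r = area / ((a + b + c) / 2)

# A dense, symmetric grid of points standing in for the pasteboard disc;
# sqrt-spaced ring radii give equal-area rings, i.e. uniform areal density.
ang = np.linspace(0, 2*np.pi, 180, endpoint=False)
rad = np.sqrt(np.linspace(1e-9, 1, 100)) * r
disc = np.stack([np.outer(rad, np.cos(ang)).ravel(),
                 np.outer(rad, np.sin(ang)).ravel()], axis=1)

m = np.array([0.5, 0.5])                # starting point #1
for _ in range(200):
    pts = disc + m                      # disc centred on the current point
    m = pts[inside(pts)].mean(axis=0)   # centre of gravity of the remaining part

incentre = (a*A + b*B + c*C) / (a + b + c)
print(m, incentre)                      # m creeps onto the incentre
```

The only fixed point of the update is a position where the disc lies entirely inside the triangle, and for a disc of incircle radius that position is unique: the incentre.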
But the algorithm is very easily transformed into a random one. We need only replace the circular discs by a random number generator producing sample points uniformly distributed over the same discs. Then we place the centre m of the circular distribution on the starting point, generate a sufficiently large number of sample points, calculate the centre of gravity m* of all pass samples (those falling inside the triangle), move m to m*, and repeat the iteration many times. In this case there will be a problem with precision, because the number of samples must be limited. Even though this algorithm is not very efficient, it is more like natural evolution in the sense that it uses random variation and selection in cyclic repetition.
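The randomized version differs only in how the disc is represented. A minimal sketch, again assuming an arbitrary illustrative triangle and a sample size of 2000 per generation (both are my choices, not the text's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative triangle (hypothetical vertices).
A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([1.0, 3.0])

def inside(pts):
    d1 = (B[0]-A[0])*(pts[:, 1]-A[1]) - (B[1]-A[1])*(pts[:, 0]-A[0])
    d2 = (C[0]-B[0])*(pts[:, 1]-B[1]) - (C[1]-B[1])*(pts[:, 0]-B[0])
    d3 = (A[0]-C[0])*(pts[:, 1]-C[1]) - (A[1]-C[1])*(pts[:, 0]-C[0])
    return ((d1 >= 0) & (d2 >= 0) & (d3 >= 0)) | ((d1 <= 0) & (d2 <= 0) & (d3 <= 0))

# Disc radius = incircle radius (area / semiperimeter) of this triangle.
a, b, c = np.linalg.norm(B - C), np.linalg.norm(C - A), np.linalg.norm(A - B)
area = abs((B[0]-A[0])*(C[1]-A[1]) - (C[0]-A[0])*(B[1]-A[1])) / 2
r = area / ((a + b + c) / 2)

m = np.array([0.5, 0.5])               # starting point
for _ in range(300):
    # Uniform random points over the disc centred at m.
    ang = rng.uniform(0, 2*np.pi, 2000)
    rad = r * np.sqrt(rng.uniform(0, 1, 2000))
    pts = m + np.column_stack([rad*np.cos(ang), rad*np.sin(ang)])
    passed = pts[inside(pts)]          # selection: keep the pass samples
    m = passed.mean(axis=0)            # move m to m*

incentre = (a*A + b*B + c*C) / (a + b + c)
print(m, incentre)                     # m hovers near the incentre, with noise
```

Because each m* is only a sample average, m never settles exactly; it fluctuates around the incentre with a spread that shrinks as the number of samples per generation grows, which is the precision problem mentioned above.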
In order to make it more like natural evolution, we may replace the triangle by a function s(x), 0 <= s(x) <= q <= 1, such that s(x) is the probability that x will be selected as a pass sample. So far nothing has been changed, as long as s(x) = 1 for all x inside the triangle and s(x) = 0 outside. We may also define the average probability of finding pass samples (the mean fitness) as
P(m) = integral s(x) N(m – x) dx
where N, thus far, is the uniform distribution over a disc. The same algorithm will end up at the maximum P = 1.
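The mean fitness P(m) can be estimated by Monte Carlo: draw points from N centred at m and average s over them. A sketch under the same illustrative assumptions as before (arbitrary triangle, s = 1 inside and 0 outside, N uniform over a disc of radius 1.05):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative triangle (hypothetical vertices).
A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([1.0, 3.0])

def s(pts):
    """Selection probability: 1 inside the triangle, 0 outside."""
    d1 = (B[0]-A[0])*(pts[:, 1]-A[1]) - (B[1]-A[1])*(pts[:, 0]-A[0])
    d2 = (C[0]-B[0])*(pts[:, 1]-B[1]) - (C[1]-B[1])*(pts[:, 0]-B[0])
    d3 = (A[0]-C[0])*(pts[:, 1]-C[1]) - (A[1]-C[1])*(pts[:, 0]-C[0])
    ok = ((d1 >= 0) & (d2 >= 0) & (d3 >= 0)) | ((d1 <= 0) & (d2 <= 0) & (d3 <= 0))
    return ok.astype(float)

def P(m, r=1.05, n=100_000):
    """Monte Carlo estimate of P(m) = integral s(x) N(m - x) dx,
    with N the uniform distribution over a disc of radius r."""
    ang = rng.uniform(0, 2*np.pi, n)
    rad = r * np.sqrt(rng.uniform(0, 1, n))
    x = m + np.column_stack([rad*np.cos(ang), rad*np.sin(ang)])
    return s(x).mean()

p_in = P(np.array([1.46, 1.05]))   # near the incentre: close to 1
p_out = P(np.array([0.5, 0.5]))    # near a corner: well below 1
print(p_in, p_out)
```

At the incentre the disc lies (almost) entirely inside the triangle, so the estimate is close to the maximum P = 1; nearer a corner a large part of the disc fails selection.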
Now, allow s(x) to be a probability function of arbitrary extension and structure, and replace the uniform distribution by a Gaussian distribution N(m – x) with moment matrix M. Then we have the following theorem:
For any s(x) and for any value of P < q <= 1, there always exists a Gaussian probability density function that is adapted for maximum disorder. The necessary conditions for a local maximum are m = m* and M proportional to M*. The dual problem is also solved: P is maximized while keeping M constant.
Disorder is equivalent to entropy, information entropy and average information. Its exponential is proportional to the volume of the concentration ellipsoid of the Gaussian, which will also be maximized.
To my knowledge the Gaussian distribution is unique in this context. An algorithm using a Gaussian random number generator and pushing m to m* in every generation will converge to a state of equilibrium maximizing the mean fitness. A question is to what extent this is applicable to natural evolution. Necessary conditions seem to be that evolution may in some way produce Gaussian-distributed quantitative characters and that it may push m to m* in every generation.
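Such an algorithm can be sketched as follows. This is my own minimal illustration, not the author's implementation: s(x) is again the indicator of an arbitrary example triangle, m is pushed to m* each generation, and M is set to the pass-sample covariance inflated by a hypothetical factor c > 1, so the concentration ellipsoid keeps growing until selection balances the expansion.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative region of acceptability (hypothetical triangle vertices).
A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([1.0, 3.0])

def s(x):
    """s(x) = 1 inside the triangle, 0 outside."""
    d1 = (B[0]-A[0])*(x[:, 1]-A[1]) - (B[1]-A[1])*(x[:, 0]-A[0])
    d2 = (C[0]-B[0])*(x[:, 1]-B[1]) - (C[1]-B[1])*(x[:, 0]-B[0])
    d3 = (A[0]-C[0])*(x[:, 1]-C[1]) - (A[1]-C[1])*(x[:, 0]-C[0])
    return ((d1 >= 0) & (d2 >= 0) & (d3 >= 0)) | ((d1 <= 0) & (d2 <= 0) & (d3 <= 0))

m = np.array([1.0, 1.0])        # centre of the Gaussian
M = np.eye(2) * 0.01            # moment (covariance) matrix, small to start
c = 1.10                        # hypothetical expansion factor, c > 1

for _ in range(300):
    x = rng.multivariate_normal(m, M, size=500)
    passed = x[s(x)]                          # selected (pass) samples
    if len(passed) < 2:
        continue                              # too few survivors; retry
    m = passed.mean(axis=0)                   # m <- m*
    M = c * np.cov(passed, rowvar=False)      # M <- c * M*

# At equilibrium m sits well inside the region and the volume of the
# concentration ellipse (~ det M) has been pushed up from its tiny start.
print(m, np.linalg.det(M))
```

At equilibrium m = m* holds on average and M is proportional to the pass-sample moment matrix M*, matching the necessary conditions of the theorem: the expansion factor trades acceptance probability against the volume of the ellipsoid.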