08-08-2008, 06:53 AM | #1 (permalink) |
Upright
Location: Sweden, Stockholm
|
Gaussian adaptation as a model of evolution
Even though a lot of mathematics is needed to support the idea of Gaussian Adaptation (GA), the idea itself is very simple to understand. It's in principle like trying to inscribe a circle in a triangle, which is a simple operation with a pair of compasses and a ruler.
But imagine that we have to use some algorithm instead. For instance, suppose that we have a large number of circular pasteboard discs, each sized to be perfectly inscribed in a certain triangle. Suppose also that we have a starting point #1 somewhere inside the triangle. Now, take a disc and place its centre of gravity on starting point #1. Then cut away the parts of the disc falling outside the triangle and determine the centre of gravity of the remaining part of the disc: point #2. In the next cycle of iteration, take a new disc and place its centre of gravity on point #2. If this cycle is repeated many times for centres of gravity #3, #4 and so on, I am convinced that the algorithm will end up with a perfectly inscribed disc.

Of course, this is not a very efficient method, and it is not a random method like evolution. But the algorithm is easily transformed into a random one. We need only replace the circular discs by a random number generator producing points uniformly distributed over such a disc. Then we place the centre m of the circular distribution on the starting point, generate a sufficiently large number of sample points, calculate the centre of gravity m* of all pass samples (those falling inside the triangle), move m to m*, and repeat the iteration many times. In this case there will be a problem with precision, because the number of samples must be limited. Even though this algorithm is not very efficient either, it is more like natural evolution in the sense that it uses random variation and selection in cyclic repetition.

To make it still more like natural evolution, we may replace the triangle by a function s(x), 0 <= s(x) <= q <= 1, such that s(x) is the probability that x will be selected as a pass sample. So far nothing has been changed, as long as s(x) = 1 for all x inside the triangle and s(x) = 0 outside.
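For concreteness, the random version of the disc algorithm can be sketched in a few lines of Python. This is a minimal illustration, not taken from any published GA code: the triangle with vertices (0,0), (4,0) and (0,3) is an assumed example, chosen because its inscribed circle has centre (1, 1) and radius 1, so the disc radius is set equal to that inradius.

```python
import math
import random

random.seed(1)

# Example triangle; with legs 4 and 3 its inscribed circle has
# centre (1, 1) and radius 1 (incentre = (a*A + b*B + c*C)/(a+b+c)).
A, B, C = (0.0, 0.0), (4.0, 0.0), (0.0, 3.0)
R = 1.0  # disc radius equal to the triangle's inradius

def inside_triangle(p, a=A, b=B, c=C):
    """Sign test: p is inside iff it lies on the same side of all three edges."""
    def cross(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (d1 >= 0 and d2 >= 0 and d3 >= 0) or (d1 <= 0 and d2 <= 0 and d3 <= 0)

def uniform_disc(m, r):
    """One point uniformly distributed over a disc of radius r around m."""
    rho = r * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    return (m[0] + rho*math.cos(phi), m[1] + rho*math.sin(phi))

m = (2.5, 0.5)  # starting point #1, anywhere inside the triangle
for _ in range(300):
    passed = [p for p in (uniform_disc(m, R) for _ in range(2000))
              if inside_triangle(p)]
    # move the centre m to the centre of gravity m* of the pass samples
    m = (sum(p[0] for p in passed) / len(passed),
         sum(p[1] for p in passed) / len(passed))

print(m)  # wanders close to the incentre (1, 1)
```

The limited number of samples (2000 per generation here) is exactly the precision problem mentioned above: m never settles exactly on the incentre, it only stays in a small neighbourhood of it.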
We may also define the average probability of finding pass samples (the mean fitness) as

P(m) = integral s(x) N(m - x) dx,

where N, thus far, is the uniform distribution over a disc. The same algorithm will end up with a maximum at P = 1.

Now allow s(x) to be a probability function of arbitrary extension and structure, and replace the uniform distribution by a Gaussian distribution N(m - x) with moment matrix M. Then we have the following theorem: for any s(x) and for any value of P < q <= 1, there always exists a Gaussian probability density function that is adapted for maximum disorder. The necessary conditions for a local maximum are m = m* and M proportional to M*. The dual problem is also solved: P is maximized while keeping M constant.

Disorder is equivalent to entropy, information entropy and average information. Its exponential is proportional to the volume of the concentration ellipsoid of the Gaussian, which will also be maximized. To my knowledge the Gaussian distribution is unique in this context. An algorithm using a Gaussian random number generator and pushing m to m* in every generation will converge to a state of equilibrium maximizing the mean fitness.

A question may be to what extent this is applicable to natural evolution. Necessary conditions seem to be that evolution may in some way produce Gaussian distributed quantitative characters and that it may push m to m* in every generation.

Last edited by gregor; 08-08-2008 at 07:00 AM.. |
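The mean fitness integral P(m) = integral s(x) N(m - x) dx is easy to estimate by Monte Carlo: draw samples from the Gaussian and count the fraction that pass. A minimal Python sketch (an illustration only, assuming the triangle with vertices (0,0), (4,0), (0,3) as the region of acceptability and an isotropic Gaussian with standard deviation 0.5):

```python
import random

random.seed(2)

def s(x):
    """Region of acceptability: s(x) = 1 inside the triangle with
    vertices (0,0), (4,0), (0,3), and s(x) = 0 outside (sharp selection)."""
    def cross(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    A, B, C = (0.0, 0.0), (4.0, 0.0), (0.0, 3.0)
    d1, d2, d3 = cross(A, B, x), cross(B, C, x), cross(C, A, x)
    return 1.0 if (d1 >= 0) == (d2 >= 0) == (d3 >= 0) else 0.0

def mean_fitness(m, sigma, n=50_000):
    """Monte Carlo estimate of P(m) = integral s(x) N(m - x) dx:
    the probability that a sample from the Gaussian is a pass sample."""
    hits = sum(s((random.gauss(m[0], sigma), random.gauss(m[1], sigma)))
               for _ in range(n))
    return hits / n

p_centre = mean_fitness((1.0, 1.0), 0.5)  # m well inside the region
p_corner = mean_fitness((4.0, 0.0), 0.5)  # m on a vertex of the triangle
print(p_centre, p_corner)
```

With these numbers the estimate is much higher when m sits well inside the region than on its boundary, which is the direction the m to m* rule pushes in.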
08-08-2008, 08:11 AM | #2 (permalink) |
Super Moderator
Location: essex ma
|
on what basis do you assume that theories rooted in equilibrium obtain for bio-systems?
if you can't explain that, you can't do the mapping. but the distribution is interesting on its own--thinking about it. so thanks.
__________________
a gramophone its corrugated trumpet silver handle spinning dog. such faithfulness it hear it make you sick. -kamau brathwaite |
08-08-2008, 09:57 AM | #3 (permalink) |
Upright
Location: Sweden, Stockholm
|
-----Added 8/8/2008 at 02:07:29-----
Bergström, R. M. An Entropy Model of the Developing Brain. Developmental Psychobiology, 2(3): 139-152, 1969.
Bergström, M. Hjärnans resurser. Brain Books, ISBN 91-88410-07-2, Jönköping, 1992. (Swedish)
Bergström, M. Neuropedagogik. En skola för hela hjärnan. Wahlström & Widstrand, 1995. (Swedish)
Brooks, D. R. & Wiley, E. O. Evolution as Entropy, Towards a Unified Theory of Biology. The University of Chicago Press, 1986.
Brooks, D. R. Evolution in the Information Age: Rediscovering the Nature of the Organism. Semiosis, Evolution, Energy, Development, 1(1), March 2001.
Cramér, H. Mathematical Methods of Statistics. Princeton University Press, 1961.
Dawkins, R. The Selfish Gene. Oxford University Press, 1976.
Eigen, M. Steps Towards Life. Oxford University Press, 1992.
Gaines, B. R. Knowledge Management in Societies of Intelligent Adaptive Agents. Journal of Intelligent Information Systems, 9: 277-298, 1997.
Goldberg, D. E. Genetic Algorithms in Search, Optimization & Machine Learning. Addison-Wesley, New York, 1989.
Hamilton, W. D. The Evolution of Altruistic Behavior. American Naturalist, 97: 354-356, 1963.
Hartl, D. L. A Primer of Population Genetics. Sinauer, Sunderland, Massachusetts, 1981.
Kandel, E. R., Schwartz, J. H. & Jessel, T. M. Essentials of Neural Science and Behavior. Prentice Hall International, London, 1995.
Kjellström, G. Network Optimization by Random Variation of Component Values. Ericsson Technics, 25(3): 133-151, 1969.
Kjellström, G. Optimization of Electrical Networks with Respect to Tolerance Costs. Ericsson Technics, no. 3: 157-175, 1970.
Kjellström, G. & Taxén, L. Stochastic Optimization in System Design. IEEE Transactions on Circuits and Systems, CAS-28(7), July 1981.
Kjellström, G. On the Efficiency of Gaussian Adaptation. Journal of Optimization Theory and Applications, 71(3), December 1991.
Kjellström, G. & Taxén, L. Gaussian Adaptation, an Evolution-Based Efficient Global Optimizer. In Brezinski, C. & Kulish, U. (eds.), Computational and Applied Mathematics, Elsevier Science Publishers B. V., pp. 267-276, 1992.
Kjellström, G. Evolution as a Statistical Optimization Algorithm. Evolutionary Theory, 11: 105-117, January 1996.
Kjellström, G. The Evolution in the Brain. Applied Mathematics and Computation, 98(2-3): 293-300, February 1999.
Kjellström, G. Evolution in a Nutshell and Some Consequences Concerning Valuations. EVOLVE, ISBN 91-972936-1-X, Stockholm, 2002.
Levine, D. S. Introduction to Neural & Cognitive Modeling. Lawrence Erlbaum Associates, 1991.
MacLean, P. D. A Triune Concept of the Brain and Behavior. University of Toronto Press, Toronto, 1973.
Maynard Smith, J. Group Selection and Kin Selection. Nature, 201: 1145-1147, 1964.
Maynard Smith, J. Evolutionary Genetics. Oxford University Press, 1998.
Mayr, E. What Evolution Is. Basic Books, New York, 2001.
Middleton, D. An Introduction to Statistical Communication Theory. McGraw-Hill, 1960.
Rechenberg, I. Evolutionsstrategie. Frommann-Holzboog, Stuttgart, 1973.
Reif, F. Fundamentals of Statistical and Thermal Physics. McGraw-Hill, 1985.
Ridley, M. Evolution. Blackwell Science, 1996.
Shannon, C. E. A Mathematical Theory of Communication. Bell System Technical Journal, 27: 379-423 (Part I), 1948.
Stehr, G. On the Performance Space Exploration of Analog Integrated Circuits. Dissertation, Technische Universität München, 2005.
Taxén, L. A Framework for the Coordination of Complex Systems' Development. Institute of Technology, Linköping University, 2003.
Zohar, D. The Quantum Self: A Revolutionary View of Human Nature and Consciousness Rooted in the New Physics. Bloomsbury, London, 1990.
Åslund, N. The Fundamental Theorems of Information Theory (Swedish). Nordisk Matematisk Tidskrift, 9, Oslo, 1961.

Last edited by gregor; 08-08-2008 at 10:07 AM.. Reason: Automerged Doublepost |
08-08-2008, 10:28 AM | #4 (permalink) |
Super Moderator
Location: essex ma
|
nice.
this should be hours of fun. thanks very much. i've been doing a lot of research in complex dynamical systems modelling---and ontology of levels, so there's some overlap with this. hard to have a quick opinion, though, so it may take a bit to get back to the thread. but yeah, this is great.
__________________
a gramophone its corrugated trumpet silver handle spinning dog. such faithfulness it hear it make you sick. -kamau brathwaite |
08-15-2008, 12:01 AM | #6 (permalink) |
Upright
Location: Sweden, Stockholm
|
It may be that the last step from the triangular region of acceptability to a probability function s(x) was huge for many readers, who are – perhaps – not even familiar with the Gaussian distribution and its moment matrix, M.
The figure below shows a region of acceptability enclosed by two lines: s(x) = 1 between the lines and s(x) = 0 outside. The red cluster consists of points determined by two independently Gaussian distributed parameters x1 and x2, with variance = 1 in both parameters. After a sufficiently large number of generations with adaptation of M according to the theorem of GA, the increase in average information (entropy, disorder, diversity and so on) may result in the green cluster. Actually, the mean fitness - i.e. the probability of becoming a parent to new individuals in the population - is the same for both the red and the green cluster (about 65%).

The effect of this adaptation is not very salient in a 2-dimensional case, but in a high-dimensional case the efficiency of the search process may be increased by many orders of magnitude. A pop-scientific interpretation of increased average information is that it means more information in the art of survival, something not considered by "Fisher's fundamental theorem of natural selection" (see the special thread). Thus, if the lines are pushed towards each other - for instance because of an arms race between predator and prey - the adapted process is better able to get away from the dangerous point of intersection between the lines, and extinction may hopefully be avoided.

Last edited by gregor; 08-15-2008 at 12:12 AM.. |
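The elongation of the cluster is easy to reproduce numerically. Below is a minimal Python sketch, not the actual GA algorithm from the papers: the region of acceptability is assumed to be the strip |x2| <= 1, m and M are re-estimated from the pass samples each generation, and the constant EXPAND factor is an assumed stand-in for the entropy-increasing step (Kjellström's algorithm controls the expansion so that the mean fitness is held at a set point).

```python
import math
import random

random.seed(3)

HALF = 1.0  # region of acceptability: s(x) = 1 where |x2| <= HALF

def sample(m, M):
    """One draw from a 2-D Gaussian with mean m and moment matrix M,
    via the Cholesky factor of M."""
    a, b, c = M[0][0], M[0][1], M[1][1]
    l11 = math.sqrt(a)
    l21 = b / l11
    l22 = math.sqrt(max(c - l21*l21, 1e-12))
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return (m[0] + l11*z1, m[1] + l21*z1 + l22*z2)

m = (0.0, 0.0)
M = [[1.0, 0.0], [0.0, 1.0]]  # the "red cluster": variance 1 in x1 and x2
EXPAND = 1.2                  # assumed entropy push per generation

for _ in range(60):
    pts = [sample(m, M) for _ in range(2000)]
    passed = [p for p in pts if abs(p[1]) <= HALF]
    n = len(passed)
    # m* and M*: centre of gravity and moment matrix of the pass samples
    mx = sum(p[0] for p in passed) / n
    my = sum(p[1] for p in passed) / n
    sxx = sum((p[0]-mx)**2 for p in passed) / n
    syy = sum((p[1]-my)**2 for p in passed) / n
    sxy = sum((p[0]-mx)*(p[1]-my) for p in passed) / n
    # push m to m* and M towards M*, inflated to increase the disorder
    m = (mx, my)
    M = [[EXPAND*sxx, EXPAND*sxy], [EXPAND*sxy, EXPAND*syy]]

# entropy of the Gaussian; its exponential is proportional to the
# volume of the concentration ellipsoid
entropy = 0.5 * math.log((2*math.pi*math.e)**2 * (M[0][0]*M[1][1] - M[0][1]**2))
print(M[0][0], M[1][1], entropy)
```

The variance along x1 (along the strip) grows by orders of magnitude while the variance across the strip settles at a small equilibrium value: the cluster elongates along the region of acceptability, like the green cluster in the figure.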
Tags |
adaptation, average information, entropy, evolution, gaussian, gaussian adaptation, mean fitness, model |