Tilted Forum Project Discussion Community  

Old 08-08-2008, 06:53 AM   #1 (permalink)
Upright
 
Location: Sweden, Stockholm
Gaussian adaptation as a model of evolution

Even though a lot of mathematics is needed to support the idea of Gaussian adaptation (GA), the idea itself is very simple to understand. In principle it is like trying to inscribe a circle in a triangle, which is a simple operation with a pair of compasses and a ruler.



But imagine that we have to use some algorithm. For instance, suppose that we have a large number of circular pasteboard discs sized to be perfectly inscribed in a certain triangle. Suppose also that we have a starting point #1 somewhere inside the triangle. Now, take a disc and place its centre of gravity on starting point #1. Then, cut away the parts of the disc falling outside the triangle and determine the centre of gravity of the remaining part of the disc, point #2. In the next cycle of iteration, take a new disc and place its centre of gravity on point #2. If this cycle is repeated many times for centres of gravity #3, #4 etcetera, I am convinced that the algorithm will end up with a perfectly inscribed disc. But of course, this is not a very efficient method. And it is not a random method like evolution.
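As a concrete illustration (not from the original post), the cut-and-recentre cycle can be sketched in Python. The triangle with vertices (0, 0), (4, 0), (0, 3) and the disc radius 1 are my illustrative choices; for this triangle the inscribed circle happens to have radius 1 and centre (1, 1), so the iteration should home in on that point. The clipped centre of gravity is approximated on a grid.

```python
import numpy as np

# Illustrative triangle: its inscribed circle has radius 1, centre (1, 1).
TRIANGLE = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])

def inside_triangle(pts, tri=TRIANGLE):
    """Vectorised barycentric test: which points lie inside the triangle?"""
    a, b, c = tri
    v0, v1, v2 = c - a, b - a, pts - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d02, d12 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    u = (d11 * d02 - d01 * d12) / denom
    v = (d00 * d12 - d01 * d02) / denom
    return (u >= 0) & (v >= 0) & (u + v <= 1)

def clipped_centroid(centre, radius, n_grid=120):
    """Centre of gravity of the part of the disc inside the triangle,
    approximated on a regular grid over the disc's bounding box."""
    xs = np.linspace(centre[0] - radius, centre[0] + radius, n_grid)
    ys = np.linspace(centre[1] - radius, centre[1] + radius, n_grid)
    gx, gy = np.meshgrid(xs, ys)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    keep = (np.linalg.norm(pts - centre, axis=1) <= radius) & inside_triangle(pts)
    return pts[keep].mean(axis=0)

m = np.array([0.5, 0.5])      # starting point #1
for _ in range(50):           # cut away, recentre, repeat
    m = clipped_centroid(m, radius=1.0)
# m approaches the centre of the inscribed circle, near (1, 1)
```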

But the algorithm is easily transformed into a random one. We need only replace the circular discs by a random number generator producing points uniformly distributed over a circular disc. Then we place the centre m of the circular distribution on the starting point, generate a sufficiently large number of sample points, calculate the centre of gravity m* of all pass samples (those inside the triangle), move m to m* and repeat the iteration many times. In this case there will be a problem with precision, because the number of samples must be limited. Even though this algorithm is not very efficient, it is more like natural evolution in the sense that it uses random variation and selection in cyclic repetition.
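The randomised version can be sketched the same way (again an illustrative triangle whose inscribed circle has radius 1 and centre (1, 1); the sample and iteration counts are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative triangle: its inscribed circle has radius 1, centre (1, 1).
TRIANGLE = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])

def inside_triangle(pts, tri=TRIANGLE):
    """Vectorised barycentric test for membership in the triangle."""
    a, b, c = tri
    v0, v1, v2 = c - a, b - a, pts - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d02, d12 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    u = (d11 * d02 - d01 * d12) / denom
    v = (d00 * d12 - d01 * d02) / denom
    return (u >= 0) & (v >= 0) & (u + v <= 1)

def uniform_disc(centre, radius, n):
    """n points uniformly distributed over a disc (polar sampling)."""
    r = radius * np.sqrt(rng.random(n))
    th = 2.0 * np.pi * rng.random(n)
    return centre + np.column_stack([r * np.cos(th), r * np.sin(th)])

m = np.array([0.5, 0.5])                      # starting point
for _ in range(200):
    samples = uniform_disc(m, 1.0, 20_000)
    passed = samples[inside_triangle(samples)]   # selection
    m = passed.mean(axis=0)                      # move m to m*, repeat
# m wanders close to the inscribed circle's centre (1, 1), with some
# residual noise because the number of samples is finite
```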

In order to make it more like natural evolution, we may replace the triangle by a function s(x), 0 < s(x) < q <= 1, such that s(x) is the probability that x will be selected as a pass sample. So far nothing has changed, as long as s(x) = 1 for all x inside the triangle and s(x) = 0 outside. We may also define the average probability of finding pass samples (the mean fitness) as

P(m) = ∫ s(x) N(m − x) dx

where N, thus far, is the uniform distribution over a disc. The same algorithm will end up at the maximum P = 1.
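With s(x) = 1 inside the region and 0 outside, and N the uniform density over a disc, P(m) is simply the probability that a point drawn from the disc passes the selection, so it can be estimated by Monte Carlo. A sketch, using an illustrative triangle whose inscribed circle has radius 1 and centre (1, 1):

```python
import numpy as np

rng = np.random.default_rng(1)

TRIANGLE = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # illustrative

def inside_triangle(pts, tri=TRIANGLE):
    """Vectorised barycentric test for membership in the triangle."""
    a, b, c = tri
    v0, v1, v2 = c - a, b - a, pts - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d02, d12 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    u = (d11 * d02 - d01 * d12) / denom
    v = (d00 * d12 - d01 * d02) / denom
    return (u >= 0) & (v >= 0) & (u + v <= 1)

def uniform_disc(centre, radius, n):
    """n points uniformly distributed over a disc (polar sampling)."""
    r = radius * np.sqrt(rng.random(n))
    th = 2.0 * np.pi * rng.random(n)
    return centre + np.column_stack([r * np.cos(th), r * np.sin(th)])

def mean_fitness(m, n=100_000):
    """Monte Carlo estimate of P(m): the fraction of pass samples."""
    return inside_triangle(uniform_disc(np.asarray(m), 1.0, n)).mean()

P_incentre = mean_fitness([1.0, 1.0])  # disc perfectly inscribed: P = 1
P_offset = mean_fitness([0.5, 0.5])    # parts of the disc fall outside: P < 1
```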

Now, allow s(x) to be a probability function of an arbitrary extension and structure and replace the uniform distribution by a Gaussian distribution N(m – x) with moment matrix M. Then we have the following theorem:

For any s(x) and for any value of P < q <= 1, there always exists a Gaussian probability density function that is adapted for maximum disorder. The necessary conditions for a local maximum are m = m* and M proportional to M*. The dual problem is also solved: P is maximized while keeping M constant.

Disorder is equivalent to entropy, information entropy and average information. Its exponential is proportional to the volume of the concentration ellipsoid of the Gaussian, which will also be maximized.
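Concretely, an n-dimensional Gaussian with moment matrix M has entropy H = ½ ln((2πe)^n det M), so exp(H) is proportional to √det M, i.e. to the volume of the concentration ellipsoid. A small numerical check of that proportionality:

```python
import numpy as np

def gaussian_entropy(M):
    """Differential entropy of an n-dimensional Gaussian with moment matrix M."""
    n = M.shape[0]
    return 0.5 * np.log((2.0 * np.pi * np.e) ** n * np.linalg.det(M))

# Doubling every standard deviation in 2 dimensions multiplies the
# ellipsoid volume by sqrt(det) ratio = 4; exp(H) grows by the same factor.
M1 = np.eye(2)
M2 = 4.0 * np.eye(2)   # standard deviations doubled -> det grows 16-fold
growth = np.exp(gaussian_entropy(M2) - gaussian_entropy(M1))
# growth == sqrt(det(M2) / det(M1)) == 4.0
```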

To my knowledge the Gaussian distribution is unique in this context. An algorithm using a Gaussian random number generator and pushing m to m* in every generation will converge to a state of equilibrium maximizing the mean fitness. A question is to what extent this is applicable to natural evolution. Necessary conditions seem to be that evolution may in some way produce Gaussian distributed quantitative characters and that it may push m to m* in every generation.
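One way such an algorithm might look in code (a minimal sketch, not Kjellström's published implementation; the triangular region, the target acceptance q and the expand/shrink factors are my illustrative choices): sample from N(m, M), move m to the centre of gravity m* of the pass samples, and rescale the pass-sample moment matrix M* so that the acceptance probability hovers near q, letting the entropy (via det M) grow as far as the region allows.

```python
import numpy as np

rng = np.random.default_rng(2)

TRIANGLE = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # illustrative region

def inside_triangle(pts, tri=TRIANGLE):
    """Vectorised barycentric test for membership in the triangle."""
    a, b, c = tri
    v0, v1, v2 = c - a, b - a, pts - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d02, d12 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    u = (d11 * d02 - d01 * d12) / denom
    v = (d00 * d12 - d01 * d02) / denom
    return (u >= 0) & (v >= 0) & (u + v <= 1)

q = 0.5                      # target probability of a pass sample (a choice)
m = np.array([0.5, 0.5])     # starting point inside the region
M = 0.01 * np.eye(2)         # small initial moment matrix
det0 = np.linalg.det(M)

for _ in range(300):
    x = rng.multivariate_normal(m, M, size=2000)
    ok = inside_triangle(x)
    P = ok.mean()                    # estimated mean fitness this generation
    passed = x[ok]
    m = passed.mean(axis=0)          # m <- m* (centre of gravity of passes)
    # M <- c * M*: expand when acceptance is comfortable, shrink otherwise
    M = np.cov(passed.T) * (1.2 if P > q else 0.83)
# m stays inside the region, P settles near an equilibrium, and det(M) --
# hence the entropy -- has grown far beyond its starting value
```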

Last edited by gregor; 08-08-2008 at 07:00 AM..
Old 08-08-2008, 08:11 AM   #2 (permalink)
 
 
Super Moderator
Location: essex ma
on what basis do you assume that theories rooted in equilibrium obtain for bio-systems?
if you can't explain that, you can't do the mapping.


but the distribution is interesting on its own--thinking about it.
so thanks.
__________________
a gramophone its corrugated trumpet silver handle
spinning dog. such faithfulness it hear

it make you sick.

-kamau brathwaite
Old 08-08-2008, 09:57 AM   #3 (permalink)
Upright
 
Location: Sweden, Stockholm
Quote:
Originally Posted by roachboy View Post
on what basis do you assume that theories rooted in equilibrium obtain for bio-systems?
if you can't explain that, you can't do the mapping.


but the distribution is interesting on its own--thinking about it.
so thanks.
Thank you, roachboy. The only answer I can give is that the equilibrium can only be reached if the function s(x) is constant. If it varies with time, the process may only strive towards a maximum - following the gradient of P - which will perhaps never be reached. Expressions for the gradients may be found in the papers of Kjellström, 1970, and Kjellström & Taxén, 1981. For references see my next post.
-----Added 8/8/2008 at 02:07:29-----
Bergström, R. M., 1969. An Entropy model of the Developing Brain. Developmental Psychobiology, 2(3): 139-152.

Bergström, M. Hjärnans resurser. Brain Books, ISBN 91-88410-07-2, Jönköping, 1992. (Swedish).

Bergström, M. Neuropedagogik. En skola för hela hjärnan. Wahlström & Widstrand, 1995. (Swedish).

Brooks, D. R. & Wiley, E. O. Evolution as Entropy, Towards a unified theory of Biology. The University of Chicago Press, 1986.

Brooks, D. R. Evolution in the Information Age: Rediscovering the Nature of the Organism. Semiosis, Evolution, Energy, Development, Volume 1, Number 1, March 2001

Cramér, H. Mathematical Methods of Statistics. Princeton, Princeton University Press, 1961.

Dawkins, R. The Selfish Gene. Oxford University Press, 1976.

Eigen, M. Steps towards life. Oxford University Press, 1992.

Gaines, Brian R. Knowledge Management in Societies of Intelligent Adaptive Agents. Journal of Intelligent Information Systems 9, 277-298 (1997).

Goldberg, D. E. Genetic Algorithms in Search, Optimization & Machine Learning. Addison-Wesley, New York, 1989.

Hamilton, WD. 1963. The evolution of altruistic behavior. American Naturalist 97:354-356

Hartl, D. L. A Primer of Population Genetics. Sinauer, Sunderland, Massachusetts, 1981.

Kandel, E. R., Schwartz, J. H., Jessel, T. M. Essentials of Neural Science and Behavior. Prentice Hall International, London, 1995.

Kjellström, G. Network Optimization by Random Variation of component values. Ericsson Technics, vol. 25, no. 3, pp. 133-151, 1969.

Kjellström, G. Optimization of electrical Networks with respect to Tolerance Costs. Ericsson Technics, no. 3, pp. 157-175, 1970.

Kjellström, G. & Taxén, L. Stochastic Optimization in System Design. IEEE Trans. on Circ. and Syst., vol. CAS-28, no. 7, July 1981.

Kjellström, G. On the Efficiency of Gaussian Adaptation. Journal of Optimization Theory and Applications, vol. 71, no. 3, Dec. 1991.

Kjellström, G. & Taxén, L. Gaussian Adaptation, an evolution-based efficient global optimizer; Computational and Applied Mathematics, In, C. Brezinski & U. Kulish (Editors), Elsevier Science Publishers B. V., pp 267-276, 1992.

Kjellström, G. Evolution as a statistical optimization algorithm. Evolutionary Theory 11:105-117 (January, 1996).

Kjellström, G. The evolution in the brain. Applied Mathematics and Computation, 98(2-3):293-300, February, 1999.

Kjellström, G. Evolution in a nutshell and some consequences concerning valuations. EVOLVE, ISBN 91-972936-1-X, Stockholm, 2002.

Levine, D. S. Introduction to Neural & Cognitive Modeling. Laurence Erlbaum Associates, Inc., Publishers, 1991.

MacLean, P. D. A Triune Concept of the Brain and Behavior. Toronto, Univ. Toronto Press, 1973.

Maynard Smith, J. 1964. Group Selection and Kin Selection, Nature 201:1145-1147.

Maynard Smith, J. Evolutionary Genetics. Oxford University Press, 1998.

Mayr, E. What Evolution is. Basic Books, New York, 2001.

Middleton, D. An Introduction to Statistical Communication Theory. McGraw-Hill, 1960.

Rechenberg, I. Evolutionsstrategie. Stuttgart: Fromann - Holzboog, 1973.

Reif, F. Fundamentals of Statistical and Thermal Physics. McGraw-Hill, 1985.

Ridley, M. Evolution. Blackwell Science, 1996.

Shannon, C. E. A Mathematical Theory of Communication, Bell Syst. Techn. J., Vol. 27, pp 379-423, (Part I), 1948.

Stehr, G. On the Performance Space Exploration of Analog Integrated Circuits. Technische Universität München, Dissertation, 2005.

Taxén, L. A Framework for the Coordination of Complex Systems’ Development. Institute of Technology, Linköping University, 2003.

Zohar, D. The quantum self : a revolutionary view of human nature and consciousness rooted in the new physics. London, Bloomsbury, 1990

Åslund, N. The fundamental theorems of information theory (Swedish). Nordisk Matematisk Tidskrift, Band 9, Oslo 1961.

Last edited by gregor; 08-08-2008 at 10:07 AM.. Reason: Automerged Doublepost
Old 08-08-2008, 10:28 AM   #4 (permalink)
 
 
Super Moderator
Location: essex ma
nice.

this should be hours of fun. thanks very much.

i've been doing a lot of research in complex dynamical systems modelling---and ontology of levels, so there's some overlap with this. hard to have a quick opinion, though, so it may take a bit to get back to the thread.

but yeah, this is great.
__________________
a gramophone its corrugated trumpet silver handle
spinning dog. such faithfulness it hear

it make you sick.

-kamau brathwaite
Old 08-08-2008, 09:43 PM   #5 (permalink)
Upright
 
Location: Sweden, Stockholm
Quote:
Originally Posted by roachboy View Post
nice.
I thought that the references should have a post of their own, but this will do. I will be away for some days, but will continue with GA later.
Old 08-15-2008, 12:01 AM   #6 (permalink)
Upright
 
Location: Sweden, Stockholm
It may be that the last step from the triangular region of acceptability to a probability function s(x) was huge for many readers, who are – perhaps – not even familiar with the Gaussian distribution and its moment matrix, M.

The figure below shows a region of acceptability enclosed by two lines. s(x) = 1 between the lines and s(x) = 0 outside.

The red cluster consists of points determined by two independently Gaussian distributed parameters x1 and x2, with variance = 1 in both parameters.

After a sufficiently large number of generations with adaptation of M according to the theorem of GA, the increase in average information (entropy, disorder, diversity etcetera) may result in the green cluster. Actually, the mean fitness, i.e. the probability of becoming a parent of new individuals in the population, is the same for both the red and the green cluster (about 65%). The effect of this adaptation is not very salient in a 2-dimensional case, but in a high-dimensional case the efficiency of the search process may be increased by many orders of magnitude.
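The red/green comparison can be sketched in code (the strip half-width 0.935 is my choice, made so that the mean fitness comes out near the 65% mentioned above for unit variance; the green cluster's spread along the strip is likewise illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Region of acceptability: the strip |x2| < HALF_WIDTH between two lines,
# so s(x) = 1 inside the strip and 0 outside.
HALF_WIDTH = 0.935   # chosen so that P(|x2| < HALF_WIDTH) is about 0.65

def mean_fitness(cov, n=200_000):
    """Fraction of N(0, cov) samples accepted by the strip region."""
    pts = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return (np.abs(pts[:, 1]) < HALF_WIDTH).mean()

red = np.eye(2)               # isotropic cluster, variance 1 in x1 and x2
green = np.diag([25.0, 1.0])  # adapted: spread out along the lines

# Both clusters are selected at the same rate (about 65%), but the green
# one has a much larger determinant (25 vs 1), i.e. far more entropy.
P_red, P_green = mean_fitness(red), mean_fitness(green)
```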

A popular-science interpretation of the increased average information is that the population carries more information in the art of survival, something not considered by "Fisher's fundamental theorem of natural selection" (see the special thread).

Thus, if the lines are pushed towards each other – for instance because of an arms race between predator and prey – the adapted process is better able to get away from the dangerous point of intersection between the lines, and extinction may hopefully be avoided.

Last edited by gregor; 08-15-2008 at 12:12 AM..
 

Tags
adaptation, average information, entropy, evolution, gaussian, gaussian adaptation, mean fitness, model






Tilted Forum Project

Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2024, vBulletin Solutions, Inc.
Search Engine Optimization by vBSEO 3.6.0 PL2
© 2002-2012 Tilted Forum Project
