Selection (genetic algorithm)

Selection is the stage of a genetic algorithm, or more generally an evolutionary algorithm, in which individual genomes are chosen from a population for later breeding (e.g., using the crossover operator). Selection mechanisms are also used to choose the candidate solutions (individuals) that form the next generation. Retaining the best individuals of a generation unchanged in the next generation is called elitism or elitist selection; it is a successful (slight) variant of the general process of constructing a new population.

A selection procedure for breeding used early on[1] may be implemented as follows:

  1. The fitness values that have been computed (fitness function) are normalized, such that the sum of all resulting fitness values equals 1.
  2. Accumulated normalized fitness values are computed: the accumulated fitness value of an individual is the sum of its own fitness value plus the fitness values of all the previous individuals; the accumulated fitness of the last individual should be 1, otherwise something went wrong in the normalization step.
  3. A random number R between 0 and 1 is chosen.
  4. The selected individual is the first one whose accumulated normalized value is greater than or equal to R.
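
The four steps above can be sketched in Python (a minimal illustration; the function name `select` and the example inputs are assumptions, not part of any particular library):

```python
import random

def select(population, fitnesses):
    """Select one individual by the four-step procedure above."""
    total = sum(fitnesses)                      # step 1: normalize so values sum to 1
    normalized = [f / total for f in fitnesses]
    accumulated = []                            # step 2: running (cumulative) sums
    running = 0.0
    for p in normalized:
        running += p
        accumulated.append(running)
    r = random.random()                         # step 3: R uniformly in [0, 1)
    for individual, acc in zip(population, accumulated):
        if acc >= r:                            # step 4: first accumulated value >= R
            return individual
    return population[-1]                       # guard against floating-point round-off
```

Fitter individuals occupy a larger slice of the interval [0, 1] and are therefore drawn more often.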

For many problems the above algorithm can be computationally demanding. A simpler and faster alternative is so-called stochastic acceptance.
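
Stochastic acceptance avoids building cumulative sums: a candidate is drawn uniformly at random and accepted with probability f_i / f_max, retrying on rejection. A minimal sketch (function name and inputs are illustrative):

```python
import random

def select_stochastic_acceptance(population, fitnesses):
    """O(1) expected-time selection once f_max is known: pick a random
    individual, accept it with probability fitness / max_fitness."""
    f_max = max(fitnesses)
    while True:
        i = random.randrange(len(population))          # uniform random candidate
        if random.random() < fitnesses[i] / f_max:     # accept in proportion to fitness
            return population[i]
```

The resulting distribution is the same fitness-proportionate one as above, but no sorting or prefix sums are required.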

If this procedure is repeated until there are enough selected individuals, this selection method is called fitness proportionate selection or roulette-wheel selection. If, instead of a single pointer spun multiple times, there are multiple equally spaced pointers on a wheel that is spun once, it is called stochastic universal sampling. Repeatedly selecting the best individual of a randomly chosen subset is tournament selection. Taking the best half, third or another proportion of the individuals is truncation selection.
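
Stochastic universal sampling can be sketched as follows (one spin, n equally spaced pointers; all names are illustrative):

```python
import random

def stochastic_universal_sampling(population, fitnesses, n):
    """Select n individuals with one spin of a wheel carrying n pointers."""
    total = sum(fitnesses)
    step = total / n                       # distance between adjacent pointers
    start = random.uniform(0, step)        # the single random "spin"
    pointers = [start + i * step for i in range(n)]
    selected = []
    cumulative = 0.0                       # fitness mass before individual `idx`
    idx = 0
    for p in pointers:
        while cumulative + fitnesses[idx] < p:   # advance to the pocket holding p
            cumulative += fitnesses[idx]
            idx += 1
        selected.append(population[idx])
    return selected
```

Because the pointers are evenly spaced, each individual is selected either floor or ceiling of its expected number of times, reducing sampling variance compared with n independent spins.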

There are other selection algorithms that do not consider all individuals for selection, but only those with a fitness value that is higher than a given (arbitrary) constant. Other algorithms select from a restricted pool where only a certain percentage of the individuals are allowed, based on fitness value.

Methods of selection (evolutionary algorithm)

The listed methods differ mainly in their selection pressure,[2] which in the rank selection described below can be set by a strategy parameter. The higher the selection pressure, the faster the population converges toward a particular solution, at the risk of not exploring the search space sufficiently. For more selection methods and further detail see [3] and [4].

Roulette wheel selection

In roulette wheel selection, the probability of choosing an individual for breeding the next generation is proportional to its fitness: the better the fitness, the higher the chance for that individual to be chosen. Choosing individuals can be depicted as spinning a roulette wheel that has as many pockets as there are individuals in the current generation, with pocket sizes proportional to these probabilities. The probability of choosing individual i is

p_i = f_i / Σ_{j=1}^{N} f_j,

where f_i is the fitness of i and N is the size of the current generation (note that in this method one individual can be drawn multiple times).
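
Because one individual may be drawn several times, fitness-proportionate drawing with replacement maps directly onto Python's standard library; the fitness values below are illustrative:

```python
import random

fitnesses = [4.0, 1.0, 3.0, 2.0]                 # f_i for each individual i
total = sum(fitnesses)
probabilities = [f / total for f in fitnesses]   # p_i = f_i / (sum over j of f_j)

# Draw 4 parents in proportion to fitness; an individual may appear repeatedly.
parents = random.choices(range(len(fitnesses)), weights=fitnesses, k=4)
```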

Rank selection

In rank selection, the selection probability does not depend directly on the fitness, but on the fitness rank of an individual within the population. This puts large fitness differences into perspective; moreover, the exact fitness values themselves do not have to be available, but only a sorting of the individuals according to quality.

Linear ranking, which goes back to Baker, is often used. It allows the selection pressure to be set by the parameter sp, which can take values between 1.0 (no selection pressure) and 2.0 (high selection pressure). The probability P for rank position R_i (with i = 1 denoting the best individual) is obtained as follows:

P(R_i) = (1/n) · (sp − (2·sp − 2) · (i − 1)/(n − 1)),   1 ≤ i ≤ n, 1 ≤ sp ≤ 2,

with P(R_i) ≥ 0 and Σ_{i=1}^{n} P(R_i) = 1.

In addition to the adjustable selection pressure, an advantage of rank-based selection is that it also gives worse individuals a chance to reproduce and thus to improve. This can be particularly helpful in applications with restrictions, since it facilitates the overcoming of a restriction in several intermediate steps, i.e. via a sequence of individuals rated poorly because of restriction violations.
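
The linear ranking probabilities can be sketched as follows (the function name is illustrative; rank 1 is taken as the best individual, consistent with the formula above):

```python
def linear_ranking_probabilities(n, sp=1.5):
    """Selection probability for each rank position i = 1..n
    (rank 1 = best), following Baker's linear ranking.
    sp = 1.0 gives no selection pressure, sp = 2.0 the maximum."""
    assert 1.0 <= sp <= 2.0
    return [(sp - (2.0 * sp - 2.0) * (i - 1) / (n - 1)) / n
            for i in range(1, n + 1)]
```

At sp = 1.0 every rank receives probability 1/n; at sp = 2.0 the worst rank receives probability 0.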

Steady state selection

In every generation, a few chromosomes with high fitness are selected for creating new offspring. Then some chromosomes with low fitness are removed and replaced by the new offspring. The rest of the population survives into the new generation.
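
This replacement scheme can be sketched as follows (the `crossover` argument stands in for a user-supplied, hypothetical operator; all names are illustrative):

```python
def steady_state_step(population, fitness, crossover, n_parents=2, n_replace=2):
    """One steady-state step: breed a few of the best, replace the worst.
    Most of the population survives unchanged into the next generation."""
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:n_parents]            # a few good individuals breed
    offspring = crossover(*parents)
    survivors = ranked[:-n_replace]         # drop the worst few
    return survivors + list(offspring)[:n_replace]
```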

Tournament selection

Tournament selection is a method of choosing an individual from a randomly drawn subset of the population. The winner of each tournament (the fittest member of the subset) is selected to perform crossover.
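
A minimal sketch of one tournament (names are illustrative; the tournament size k sets the selection pressure, with larger k meaning stronger pressure):

```python
import random

def tournament_select(population, fitness, k=3):
    """Draw k individuals uniformly at random and return the fittest."""
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)
```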

Elitist selection

Often, to get better results, strategies with partial reproduction are used. One of them is elitism, in which a small portion of the best individuals from the last generation is carried over, without any changes, to the next one.
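
Elitism can be sketched as follows (the `breed` argument stands in for whatever selection-and-variation pipeline produces one offspring; all names are illustrative):

```python
def next_generation_with_elitism(population, fitness, breed, elite_fraction=0.1):
    """Carry the best elite_fraction over unchanged; fill the rest by breeding."""
    n_elite = max(1, int(len(population) * elite_fraction))
    ranked = sorted(population, key=fitness, reverse=True)
    elites = ranked[:n_elite]               # copied without any changes
    children = [breed(population) for _ in range(len(population) - n_elite)]
    return elites + children
```

Keeping the elites unchanged guarantees that the best fitness found so far never decreases from one generation to the next.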

Boltzmann selection

In Boltzmann selection, a continuously varying temperature controls the rate of selection according to a preset schedule. The temperature starts out high, which means that the selection pressure is low. The temperature is gradually lowered, which gradually increases the selection pressure, thereby allowing the GA to narrow in more closely to the best part of the search space while maintaining the appropriate degree of diversity.[5]
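
A sketch of Boltzmann selection with a geometric cooling schedule (the schedule parameters are illustrative assumptions, not a prescribed setting):

```python
import math
import random

def boltzmann_select(population, fitnesses, temperature):
    """Select one individual with probability proportional to exp(fitness / T):
    high T is close to uniform, low T strongly favors the fittest."""
    weights = [math.exp(f / temperature) for f in fitnesses]
    return random.choices(population, weights=weights, k=1)[0]

def temperature_schedule(t0=10.0, decay=0.95, generations=100):
    """Preset schedule: the temperature decays geometrically each generation."""
    t = t0
    for _ in range(generations):
        yield t
        t *= decay
```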

Notes and References

  1. Holland, John H. (1992). Adaptation in Natural and Artificial Systems. Cambridge, Mass.: MIT Press (PhD thesis, The University of Michigan, 1975). ISBN 0-585-03844-9.
  2. Bäck, T. (1994). "Selective pressure in evolutionary algorithms: A characterization of selection mechanisms". Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence. Orlando, FL: IEEE, pp. 57–62. doi:10.1109/ICEC.1994.350042. ISBN 978-0-7803-1899-1. https://ieeexplore.ieee.org/document/350042
  3. Eiben, A.E.; Smith, J.E. (2015). "Fitness, Selection, and Population Management". Introduction to Evolutionary Computing. Natural Computing Series. Berlin, Heidelberg: Springer, pp. 79–98. doi:10.1007/978-3-662-44874-8. ISBN 978-3-662-44873-1.
  4. De Jong, Kenneth A. (2006). Evolutionary Computation: A Unified Approach. Cambridge, Mass.: MIT Press. ISBN 978-0-262-25598-1.
  5. Sivanandam, S. N.; Deepa, S. N. (2013). Principles of Soft Computing. New Delhi: Wiley. ISBN 978-1-118-54680-2.