Natural Computation Methods for Machine Learning Note 12

March 20, 2020


In this course, I learned about evolutionary computing.

Evolutionary computing (EC)

  • A population-based method
  • Used for problems where the task is to maximize some measure of performance (fitness)
  • Same family of problems as in RL, but very different methods
  • Methods inspired by genetics, natural selection, and evolution (but usually
    controlled, so it's more like breeding)


An evolutionary algorithm

  1. Create a population of individuals
    a. Individual = encoding of suggested solution to the problem
  2. Evaluate all individuals (for selection)
    a. Requires interpretation of the encoding, and evaluation of its fitness

  3. Create a new population by cloning, recombination and/or mutation of individuals
    a. Selection – fit individuals more likely to affect the new population than non-fit ones

  4. Repeat from step 2 until end criterion met
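The four steps above can be sketched as a minimal toy GA in Python. This is an illustration only, assuming a OneMax objective (maximize the number of 1-bits), fitness-proportional selection, one-point crossover, and bit-flip mutation:

```python
import random

def evolve(pop_size=20, genome_len=16, generations=50):
    # 1. Create a population of individuals (random bit strings)
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Evaluate all individuals (OneMax fitness: number of 1-bits)
        fitness = [sum(ind) for ind in pop]
        # 3. Create a new population; fitter individuals are more likely
        #    to become parents (+1 keeps every weight non-zero)
        new_pop = []
        while len(new_pop) < pop_size:
            a, b = random.choices(pop, weights=[f + 1 for f in fitness], k=2)
            cut = random.randrange(1, genome_len)        # one-point crossover
            child = [bit ^ (random.random() < 0.01)      # low-probability
                     for bit in a[:cut] + b[cut:]]       # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    # 4. The end criterion here is simply a fixed number of generations
    return max(pop, key=sum)
```

With these settings the best individual typically ends up at or near the all-ones string.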


The EC loop


Genotypes, phenotypes and fitness

A solution to the problem is encoded by an individual’s genotype (genome, chromosome), e.g. (in GA) a binary string.

Example: A route in the travelling salesman problem (TSP) can be represented by a binary matrix or vector:
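The figure is missing here, so as an illustration (the exact convention is an assumption): one common binary-matrix encoding puts a 1 in row i, column j when city i is visited at step j of the tour.

```python
# Assumed convention: matrix[city][step] = 1 iff `city` is visited at `step`.
tour = [2, 0, 3, 1]                  # visit city 2 first, then 0, 3, 1
n = len(tour)
matrix = [[0] * n for _ in range(n)]
for step, city in enumerate(tour):
    matrix[city][step] = 1

# Every row and every column sums to 1 (each city is visited exactly once);
# flattening row by row gives the equivalent binary vector.
vector = [bit for row in matrix for bit in row]
```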


A genotype’s fitness (a scalar) is evaluated by a fitness function.

  • Requires an interpretation of the genotype – the phenotype
  • It is the phenotype which is evaluated

  • The genotype-phenotype mapping may be many-to-one (and one-to-many)
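A tiny illustration of the distinction (the objective below is a made-up example): the genotype is a bit string, the phenotype is the number it decodes to, and only the phenotype is ever evaluated.

```python
def decode(genotype):
    # Genotype -> phenotype: read the bit string as an integer and
    # map it into the interval [0, 1].
    value = int("".join(map(str, genotype)), 2)
    return value / (2 ** len(genotype) - 1)

def fitness(genotype):
    # It is the phenotype that is evaluated, not the bit string itself.
    x = decode(genotype)
    return 1 - (x - 0.5) ** 2     # toy objective, maximal at x = 0.5
```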



Selection of individuals is probabilistic, based on fitness or rank.

  • Fitness selection: Select individuals with probability proportional to their
    fitness (a.k.a. Roulette wheel selection)
  • Rank selection: Select individuals with probability proportional to their
    rank (in a list, sorted w.r.t. fitness)
  • All individuals should have some (i.e. non-zero) probability of being selected
  • Allow reselection
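Both schemes can be sketched as follows; `random.choices` samples with replacement, so reselection is allowed automatically.

```python
import random

def roulette_select(pop, fitness):
    # Fitness-proportional (roulette wheel) selection.
    return random.choices(pop, weights=fitness, k=1)[0]

def rank_select(pop, fitness):
    # Probability proportional to rank in the fitness-sorted list:
    # the worst individual gets rank 1, so even it keeps a non-zero
    # probability of being selected.
    order = sorted(range(len(pop)), key=lambda i: fitness[i])
    ranks = [0] * len(pop)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return random.choices(pop, weights=ranks, k=1)[0]
```

A commonly cited motivation for rank selection is that it is less sensitive than roulette selection to a few individuals with extreme fitness values.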


Cloning

  1. Select an individual (as above)
  2. Copy the selected individual, unaltered, to the new population

Elitism: Guarantee that the most fit individual(s) are cloned (to avoid accidental
destruction of the best solutions found so far)
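A sketch of elitism, assuming the caller supplies a hypothetical `make_child` function that produces one new individual (e.g. via the operators below):

```python
def next_generation(pop, fitness, make_child, elite_count=1):
    # Copy the `elite_count` fittest individuals unchanged, so the best
    # solutions found so far cannot be destroyed by crossover/mutation,
    # then fill the rest of the new population with children.
    order = sorted(range(len(pop)), key=lambda i: fitness[i], reverse=True)
    new_pop = [list(pop[i]) for i in order[:elite_count]]
    while len(new_pop) < len(pop):
        new_pop.append(make_child())
    return new_pop
```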


Crossover

  1. Select two individuals (as above)
  2. Perform a crossover of the two genotypes, creating two new individuals
  3. Insert the new individuals in the population

Crossover examples (GA)


Many more alternatives. For example, uniform crossover – randomly select from which parent to pick each bit.
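One-point and uniform crossover on bit strings might look like this (a sketch; in uniform crossover the choice of parent is drawn independently per position):

```python
import random

def one_point_crossover(a, b):
    # Swap everything after a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def uniform_crossover(a, b):
    # For each position, randomly decide which parent each child
    # inherits that bit from.
    c1, c2 = [], []
    for x, y in zip(a, b):
        if random.random() < 0.5:
            x, y = y, x
        c1.append(x)
        c2.append(y)
    return c1, c2
```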

Note: Some genotypes may be illegal (breaking syntactic or semantic rules). This would be very probable in the TSP example above. Crossover should be defined so that such genotypes cannot be produced. (Strictly, this is not necessary: illegal genotypes can instead be repaired, or penalized by the fitness function so that selection weeds them out.)


Mutation

  • Noisy copy: some part of the copy is altered at random (e.g. in GA, by flipping bits)
  • Mutation should have very low probability
    o The prime search operator in EC is (usually) recombination/crossover, not mutation! (counter-example: EP)
  • Mutation roughly corresponds to exploration (stochastic action selection) in RL
  • Some authors recommend inverted selection for mutation
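A bit-flip mutation sketch for GA-style binary genotypes:

```python
import random

def mutate(genotype, p=0.01):
    # Noisy copy: each bit is flipped independently with a low
    # probability p; the original individual is left untouched.
    return [bit ^ (random.random() < p) for bit in genotype]
```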

Back to the TSP example

Other representations may make it easier to make crossover non-destructive.

Path representation, for example (5 7 1 6 2 4 3 8), is more intuitive. It also makes the genotype identical to the phenotype.

Many suggested crossover operators for this, for example Partially Mapped Crossover (PMX), Order Crossover (OX), Cycle Crossover (CX).

Example, OX: Pick a sub-tour (bracketed below; like two-point crossover, except that
we keep what's in between instead of swapping it), then continue by picking the
rest of the cities from the other parent, in order, excluding cities already visited:

(5 7 [1 6 2] 4 3 8) ==> (4 8 1 6 2 3 7 5)
(1 7 [5 4 8] 2 6 3) ==> (6 2 5 4 8 3 7 1)
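A sketch of OX; the cut points are optional parameters here so that the lecture example can be reproduced (the kept sub-tour spans positions 3-5, whose bold formatting was lost above):

```python
import random

def order_crossover(p1, p2, i=None, j=None):
    # Keep p1[i:j] in place; fill the remaining positions, starting
    # after j and wrapping around, with the unused cities in the order
    # they appear in p2 (also read starting after j).
    n = len(p1)
    if i is None:
        i, j = sorted(random.sample(range(n + 1), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    kept = set(p1[i:j])
    fill = []
    for k in range(n):
        city = p2[(j + k) % n]
        if city not in kept:
            fill.append(city)
    for k in range(n - (j - i)):
        child[(j + k) % n] = fill[k]
    return child
```

For example, `order_crossover([5, 7, 1, 6, 2, 4, 3, 8], [1, 7, 5, 4, 8, 2, 6, 3], i=2, j=5)` returns the first offspring shown above.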


Dong Wang

I will work as a PhD student at TU Graz in Austria. My research interests include Embedded/Edge AI, federated learning, computer vision, and IoT.