Natural Computation Methods for Machine Learning Note 12

March 20, 2020


From this course, I learned about Evolutionary Computing.

Evolutionary computing (EC)

  • A population-based method
  • Used for problems where the task is to maximize some measure of success
  • Same family of problems as in RL, but with very different methods
  • Methods inspired by genetics, natural selection, and evolution (but usually controlled, so it is more like breeding)


An evolutionary algorithm (draw loop in parallel)

  1. Create a population of individuals
    a. Individual = an encoding of a suggested solution to the problem
  2. Evaluate all individuals (for selection)
    a. Requires interpretation of the encoding, and evaluation of its fitness
  3. Create a new population by cloning, recombination and/or mutation of individuals
    a. Selection – fit individuals are more likely to affect the new population than unfit ones
  4. Repeat from step 2 until an end criterion is met
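The four steps above can be sketched as a minimal GA loop. This is an illustrative sketch on the toy OneMax problem (maximize the number of 1-bits in the genotype); the function name and all parameter values are my own, not from the notes.

```python
import random

def evolve(pop_size=20, genome_len=16, generations=50, p_mut=0.02):
    """Minimal GA sketch of the four-step loop, on the toy OneMax
    problem (fitness = number of 1-bits in the genotype)."""
    # Step 1: create a population of individuals (random bit strings)
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):  # Step 4: repeat until end criterion
        # Step 2: evaluate all individuals
        fits = [sum(g) for g in pop]
        # Step 3: new population via selection, crossover and mutation
        new_pop = [max(pop, key=sum)]  # elitism: clone the fittest as-is
        while len(new_pop) < pop_size:
            # fitness-proportionate selection (+1 keeps weights non-zero)
            a, b = random.choices(pop, weights=[f + 1 for f in fits], k=2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            # mutation: flip each bit with low probability
            child = [1 - x if random.random() < p_mut else x for x in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=sum)
```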

The EC loop (fill in details after presenting the algorithm)


Genotypes, phenotypes and fitness

A solution to the problem is encoded by an individual’s genotype (genome, chromosome), e.g. (in GA) a binary string.

Example: A route in the travelling salesman problem (TSP) can be represented by a binary matrix or vector:


A genotype’s fitness (a scalar) is evaluated by a fitness function.

  • Requires an interpretation of the genotype – the phenotype
  • It is the phenotype which is evaluated
  • The genotype–phenotype mapping may be many-to-one (and even one-to-many)
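As a toy illustration of the genotype → phenotype → fitness chain (the function names and the target value are my own, not from the notes): the genotype is a raw bit list, the phenotype is the integer it encodes, and the fitness function scores the phenotype.

```python
def phenotype(genotype):
    """Genotype -> phenotype mapping: interpret the bit list as an
    unsigned binary integer."""
    return int("".join(map(str, genotype)), 2)

def fitness(genotype, target=10):
    """Fitness (a scalar): higher is better. Note that it is the
    phenotype, not the raw bit string, that gets evaluated."""
    return -abs(phenotype(genotype) - target)
```

For example, `phenotype([1, 0, 1, 0])` is `10`, so `fitness([1, 0, 1, 0])` attains the maximal value `0`.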

Selection

Selection of individuals is probabilistic, based on fitness or rank.

  • Fitness selection: select individuals with probability proportional to their fitness (a.k.a. roulette wheel selection)
  • Rank selection: select individuals with probability proportional to their rank (in a list sorted w.r.t. fitness)
  • All individuals should have a non-zero probability of being selected
  • Reselection is allowed (the same individual may be picked more than once)
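Both schemes can be sketched as follows (a sketch assuming non-negative scalar fitness values; reselection is automatic because every call draws independently):

```python
import random

def roulette_select(pop, fits):
    """Fitness selection: probability proportional to fitness."""
    return random.choices(pop, weights=fits, k=1)[0]

def rank_select(pop, fits):
    """Rank selection: sort by fitness, then select with probability
    proportional to rank. The worst individual gets rank 1, so even
    it keeps a non-zero chance of being selected."""
    ranked = sorted(zip(pop, fits), key=lambda pf: pf[1])
    weights = list(range(1, len(pop) + 1))
    return random.choices([p for p, _ in ranked], weights=weights, k=1)[0]
```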

Cloning

  1. Select an individual (as above)
  2. Copy the selected individual, unaltered, to the new population

Elitism: guarantee that the fittest individual(s) are cloned (to avoid accidental destruction of the best solutions found so far)
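A sketch of the elitism step (the helper name is hypothetical): pick the n fittest individuals so they can be copied unchanged into the new population.

```python
def elites(pop, fits, n_elite=1):
    """Return the n_elite fittest individuals, to be cloned
    unaltered into the new population (elitism)."""
    ranked = sorted(range(len(pop)), key=lambda i: fits[i], reverse=True)
    return [pop[i] for i in ranked[:n_elite]]
```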

Recombination

  1. Select two individuals (as above)
  2. Perform a crossover of the two genotypes, creating two new genotypes/individuals
  3. Insert the new individuals into the population

Crossover examples (GA)


There are many more alternatives, for example uniform crossover – randomly select, for each bit, which parent to pick it from.

Note: some genotypes may be illegal (breaking syntactic or semantic rules); this would be very likely in the TSP example above. Crossover should be defined so that such genotypes cannot be produced. [Explain why this is not strictly necessary]
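One-point and uniform crossover for bit-string genotypes can be sketched as follows (the function names are mine):

```python
import random

def one_point_crossover(a, b):
    """Cut both parents at one random point and swap the tails,
    producing two children."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def uniform_crossover(a, b):
    """For each position, randomly decide which parent the first
    child inherits from; the second child takes the other gene."""
    c1, c2 = [], []
    for x, y in zip(a, b):
        if random.random() < 0.5:
            c1.append(x); c2.append(y)
        else:
            c1.append(y); c2.append(x)
    return c1, c2
```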

Mutation

  • Noisy copy: some part of the copy is altered at random (e.g. in GA, by flipping bits)
  • Mutation should have a very low probability
    o The prime search operator in EC is (usually) recombination/crossover, not mutation! (counter-example: EP)
  • Mutation roughly corresponds to exploration (stochastic action selection) in RL
  • Some authors recommend inverted selection for mutation
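Bit-flip mutation in GA can be sketched as a noisy copy (the default rate of 0.01 is illustrative):

```python
import random

def mutate(genotype, p_mut=0.01):
    """Noisy copy: each bit of the copy is flipped independently
    with small probability p_mut; the original is left intact."""
    return [1 - bit if random.random() < p_mut else bit for bit in genotype]
```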

Back to the TSP example

Other representations may make it easier to make crossover non-destructive.

Path representation, for example (5 7 1 6 2 4 3 8), is more intuitive. It also makes the genotype identical to the phenotype.

Many crossover operators have been suggested for this representation, for example Partially Mapped Crossover (PMX), Order Crossover (OX), and Cycle Crossover (CX).

Example, OX: pick a sub-tour (bold below; like two-point crossover, except that we keep what is in between instead of swapping it), then fill in the remaining positions with cities from the other parent, in order, excluding already-visited ones.

(5 7 1 6 2 4 3 8) ==> (4 8 1 6 2 3 7 5)
(1 7 5 4 8 2 6 3) ==> (6 2 5 4 8 3 7 1)
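The OX example above can be reproduced with a short sketch (the signature is mine; `start`/`end` delimit the kept sub-tour, here positions 2–5):

```python
def order_crossover(p1, p2, start, end):
    """Order Crossover (OX): keep p1[start:end] in place, then fill
    the remaining positions with cities taken from p2, scanning p2
    from position `end` (wrapping around) and skipping cities that
    are already in the kept segment."""
    n = len(p1)
    kept = set(p1[start:end])
    fill = [p2[(end + i) % n] for i in range(n)
            if p2[(end + i) % n] not in kept]
    child = [None] * n
    child[start:end] = p1[start:end]
    # place the fill cities after the segment, wrapping around
    for i, city in enumerate(fill):
        child[(end + i) % n] = city
    return tuple(child)
```

For instance, `order_crossover((5, 7, 1, 6, 2, 4, 3, 8), (1, 7, 5, 4, 8, 2, 6, 3), 2, 5)` returns `(4, 8, 1, 6, 2, 3, 7, 5)`, matching the first line of the example.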

Video Explanation: https://www.youtube.com/watch?v=HATPHZ6P7c4

Tags: Evolutional Computing, machine learning, Natural Computation
Last updated: March 20, 2020

Dong Wang

I am a PhD student of TU Graz in Austria. My research interests include Embedded/Edge AI, efficient machine learning, model sparsity, deep learning, computer vision, and IoT. I would like to understand the foundational problems in deep learning.



COPYRIGHT © 2013-2024 nanguoyu.com. ALL RIGHTS RESERVED.
