Evolutionary Algorithms
Lehrstuhl für Informatik 2
Gabriella Kókai: Machine Learning
Overview
➔ Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
Evolutionary Strategies
Conclusion
Motivation
For millions of years, creatures have populated the Earth
Changes in the biosphere create ever new environmental conditions
Populations had to learn to adapt to the new conditions: permanent stepwise development, little stagnancy
Organisms are optimally adapted with respect to their needs
Nature has its own laws, rules, strategies, and mechanisms
Evolution: a successful, robust mechanism that allows creatures to adapt to environmental conditions over generations
The goal of evolution is not predefined: optimisation, innovation, creativity
Selection factors: competition, food supply, enemies, climate, environment; through human beings additionally breeding
Overview
Motivation
➔ Nature as a Standard
Genetic Algorithms
Genetic Programming
Evolutionary Strategies
Conclusion
Nature as a Standard - Evolution, Genome
Lamarck's thesis (1809): adaptation, urge to perfection (driven by specific needs), spontaneous creation, heredity of acquired characteristics (somatic induction); but there is no such feedback into the genome
Darwin's thesis (1859): permanent evolution, common descent, multiplication of species, gradual change, natural selection, descent of characteristics with modification
Basic conditions: overly rich production of genetic variations, limitation of resources (competition)
Fitness: suitability; the result of multiple interactions with the selection factors of the environment
Nature as a Standard - Evolution, Genome
Gene: functional unit, a relatively short segment of DNA; information on how to build a protein molecule
Gene pool: sum of all genotype variants of a population
Genotype: all the genes (genome); generally structures that contain information, instructions defining individual characteristics
Phenotype: interpretation of the genes, expression of the genome as individual characteristics; competes with other phenotypes for reproductive success in a specific setting (basic conditions of the environment) => selection filter
Overview
Motivation
Nature as a Standard
➔ Genetic Algorithms
    Classic Evolutionary Algorithm
    Selection
    Representation of Hypothesis
    Genetic Operators
    Procedure of Evolution
    Schema Theorem
    Applications
Genetic Programming
Evolutionary Strategies
Conclusion
Genetic Algorithms
John H. Holland 1975; David E. Goldberg 1986
Goal: optimisation, "generate-and-test beam search"
Variability (heterogeneity of the characteristics, uniqueness, variety)
Differential fitness (the propagation rate depends on the ability to survive in a specific setting and to reproduce descendants)
Heritable fitness (the genome is passed on: imperfect copying, mixture from different ancestors)
Dualism genotype/phenotype
Classic Evolutionary Algorithm
Components:
    Coding, structures: representation of hypotheses and individuals
    Interpretation function π: what does the coding represent?
    Fitness function η: shall be optimised
    Termination criterion τ: is the optimum approximately reached?
    Selection function σ: which individuals determine the next population?
Algorithm:
    Initialise: generate randomly n individuals for the initial population P(0)
    Evaluate: determine η(π(s)) for all s ∈ P(0)
    t := 0 (generation 0)
    while not τ(t, P(t)):
        Selection: P' = σ(P(t), η(P(t))) (choose stochastically individuals according to their fitness)
        Crossover: create children via the recombination of parental individuals from P'
        Mutation: change randomly the representation of child individuals from P'
        Update: put n randomly picked child individuals from P' into P(t+1)
        t := t + 1 (increment generation)
        Evaluate: η(P(t))
    return the individual with the highest fitness value
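The loop above can be sketched in Python. This is a minimal illustration, not the algorithm from the slides in full generality: the OneMax fitness (count of 1-bits), the population size, the mutation rate, and the generation limit are all assumed values chosen for the example.

```python
import random

# Minimal sketch of the classic evolutionary loop; the fitness function
# (count of 1-bits, "OneMax") and all constants are illustrative assumptions.
GENOME_LEN, POP_SIZE, P_MUT, MAX_GEN = 20, 30, 0.05, 100

def fitness(s):                      # eta(pi(s)): here simply the number of 1-bits
    return sum(s)

def select(pop):                     # fitness-proportional (roulette) selection
    return random.choices(pop, weights=[fitness(s) + 1 for s in pop], k=2)

def crossover(a, b):                 # single-point recombination
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(s):                       # bit-wise mutation with probability P_MUT
    return [bit ^ (random.random() < P_MUT) for bit in s]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]                      # Initialise P(0)
    for _ in range(MAX_GEN):                              # while not tau(t, P(t))
        if max(fitness(s) for s in pop) == GENOME_LEN:    # termination criterion
            break
        children = []
        while len(children) < POP_SIZE:                   # Selection + Crossover
            a, b = select(pop)
            children.append(mutate(crossover(a, b)))      # Mutation
        pop = children                                    # Update: P(t+1)
    return max(pop, key=fitness)                          # highest-fitness individual

best = evolve()
```

The termination criterion here simply checks for the known optimum; in practice τ would test convergence or a generation budget.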
Representation of Hypothesis
Coding: representation of the parameters (hypotheses, individuals) to be optimised by structures over a discrete alphabet, mostly bit-strings
    s = (0100111101)
    s = (atggcaact) with alphabet A = {a, t, g, c}
Interpretation: mapping π from the genotypical structure space into the phenotypical characteristics and behaviour space
    Production system: s = (10 01 1 11 10 0) : IF a1=T & a2=F THEN c=T ; IF a2=T THEN c=F
    Triplet : amino acid (genetic code)
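The production-system coding above can be decoded by a small interpretation function. The 2-bit condition encoding (10 = attribute must be true, 01 = must be false, 11 = don't care) is an assumption inferred from the example, not stated explicitly on the slide.

```python
# Sketch of an interpretation function pi for the production-system coding
# above; the condition encoding (10=T, 01=F, 11=don't care) is an assumption.
COND = {"10": "T", "01": "F", "11": "*"}

def interpret(bits, n_attrs=2):
    """Map a genotype string to IF-THEN rules over a1..an and conclusion c."""
    rule_len = 2 * n_attrs + 1           # 2 bits per attribute + 1 conclusion bit
    rules = []
    for r in range(0, len(bits), rule_len):
        chunk = bits[r:r + rule_len]
        conds = [f"a{i + 1}={COND[chunk[2 * i:2 * i + 2]]}"
                 for i in range(n_attrs)
                 if COND[chunk[2 * i:2 * i + 2]] != "*"]  # skip don't-cares
        concl = "T" if chunk[-1] == "1" else "F"
        rules.append("IF " + " & ".join(conds) + f" THEN c={concl}")
    return rules
```

Applied to the slide's genotype (10 01 1 11 10 0), this yields the two rules given there.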
Selection
Best-fitting individuals shall produce the descendant population; one-sidedness shall be avoided by stochastic selection algorithms
Fitness-proportional selection (roulette algorithm): individuals are chosen proportionally to their own fitness, and thus relative to that of their competitors. Problem: super individuals may dominate too much
Rank-based selection: individuals are sorted ascendingly according to their fitness; selection is done by a roulette algorithm based on the position in this ranking
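Both schemes can be sketched as follows; the population and fitness values are made-up illustrative data, with "D" playing the super individual whose dominance rank-based selection dampens.

```python
import random

# Sketch of the two selection schemes above; the data is illustrative.
def roulette(population, fitnesses):
    """Fitness-proportional selection: pick i with probability f_i / sum(f)."""
    return random.choices(population, weights=fitnesses, k=1)[0]

def rank_based(population, fitnesses):
    """Rank-based selection: sort ascending by fitness and weight by position
    in the ranking, so a super individual gets the top rank but no runaway weight."""
    ranked = [ind for _, ind in sorted(zip(fitnesses, population))]
    ranks = range(1, len(ranked) + 1)        # 1 = worst, n = best
    return random.choices(ranked, weights=ranks, k=1)[0]

population = ["A", "B", "C", "D"]
fitnesses = [1.0, 2.0, 3.0, 100.0]           # "D" is a super individual
```

Under roulette selection "D" is picked almost always (100/106 of the probability mass); under rank-based selection it gets only the top rank weight (4/10), so the other individuals keep a real chance.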
Genetic Operators
Mutation: with mutation probability p_M, a uniformly distributed random number decides per position whether it is changed
With a discrete alphabet, a maximal mutation distance can limit the variation; this requires defining a distance measure on the alphabet
    p_M = 0.5: 0110011010 --> 0101010001 (bit-wise)
    cgeehadcdhh --> chdcgadcdfh (mutation distance 2, lexicographic)
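The two mutation variants above can be sketched as follows: bit-wise flips with probability p_M, and alphabet mutation bounded by a lexicographic mutation distance. The concrete alphabet used below is an assumption for illustration.

```python
import random

# Sketch of the mutation variants above: bit-wise flips with probability p_m,
# and bounded 'mutation distance' over a discrete alphabet (lexicographic).
def mutate_bits(s, p_m):
    """Flip each bit independently with probability p_m."""
    return "".join(str(1 - int(b)) if random.random() < p_m else b for b in s)

def mutate_alphabet(s, p_m, alphabet, max_dist):
    """Replace a symbol by one at most max_dist away in the lexicographic
    distance measure defined on the (ordered) alphabet."""
    out = []
    for ch in s:
        if random.random() < p_m:
            i = alphabet.index(ch)
            lo = max(0, i - max_dist)
            hi = min(len(alphabet) - 1, i + max_dist)
            out.append(alphabet[random.randint(lo, hi)])
        else:
            out.append(ch)
    return "".join(out)
```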
Genetic Operators (2)
Multipoint: analogous, e.g. uniform or odd-even:
    Parents:  0110011010, 1011001111
    Mask:     1212121212
    Children: 0011001111, 1110011010
Multi-recombination (more than 2 parental chromosomes): random selection of 2 parents as above, or of several parents:
    Parents: 0110011010, 1011001111, 0100000001, 1101110000
    Mask:    3311144443
    Child:   0110010001
Genetic Operators (2): Recombination (Crossover)
Crossing point(s) randomly determined or given by a fixed mask
Single-point:
    Parents:  0110011010, 1011001111
    Crossing point: 3, Mask: 1112222222
    Children: 0111001111, 1010011010
Dual-point:
    Parents:  bbafdeacca, edebacbfbb
    Crossing points: 3, 6, Mask: 1112221111
    Children: bbabacacca, edefdebfbb
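All the mask-based recombination variants above reduce to one rule: a mask digit k at position i means the child copies position i from parent k. A minimal sketch:

```python
# Sketch of mask-driven recombination: mask digit k at position i means the
# child copies position i from parent k (1-based, as in the slide examples).
def crossover(parents, mask):
    return "".join(parents[int(d) - 1][i] for i, d in enumerate(mask))

def single_point_masks(length, point):
    """The two complementary masks for single-point crossover at `point`."""
    m1 = "1" * point + "2" * (length - point)
    m2 = "2" * point + "1" * (length - point)
    return m1, m2
```

The same function reproduces the single-point, dual-point, uniform and multi-recombination examples from the slides.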
Genetic Operators (3)
Inversion: mirrored (generally: permuted) insertion of the middle part
    1010011101 --> 1010100001 (inverted)
    fgbbcdadace --> fgbbcdcdaea (permuted)
Deletion: loss of an arbitrary part
    1010011101 --> 101001 (intercalary)
    fgbbcdadace --> fgbbcda (terminal)
Duplication: duplication of an arbitrary part
    1010011101 --> 1010011011101
    fgbbcdadace --> fgbbcdadacedace
Procedure of Evolution
Example: global maximum of a (multi-modal) function
Bit-vectors: s = (s^x, s^y) with s^x = (s^{x,1}, ..., s^{x,k_x}) and s^y = (s^{y,1}, ..., s^{y,k_y})
Interpretation: as in the example; Evaluation: η(s) = f(π(s)), i.e. compute the function value at the interpreted location
5 (3) populations evolve independently of each other
Strategy parameters: population size μ, recombination partners ρ, number of created descendants λ, mutation
Variants:
    Plus selection (μ/ρ + λ): selection from parents and mutated children
    Comma selection (μ/ρ, λ): selection from mutated children only; individuals survive at most one generation
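The interpretation step for this example can be sketched as follows. The bit width per coordinate, the parameter range and the multi-modal test function are all assumptions chosen for illustration; the slides do not specify them.

```python
import math

# Sketch of the interpretation pi and evaluation eta for the function-maximum
# example: s = (s^x, s^y) is decoded into real coordinates. K, the range
# [-5, 5] and the test function f are illustrative assumptions.
K = 10                                   # bits per coordinate (assumption)

def decode(s, lo=-5.0, hi=5.0):
    """pi(s): split s into s^x, s^y and map each binary block to [lo, hi]."""
    def to_real(bits):
        return lo + int(bits, 2) * (hi - lo) / (2 ** len(bits) - 1)
    return to_real(s[:K]), to_real(s[K:])

def f(x, y):                             # a multi-modal example function
    return math.sin(3 * x) + math.cos(3 * y) - 0.1 * (x * x + y * y)

def fitness(s):                          # eta(s) = f(pi(s))
    return f(*decode(s))
```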
Schema Theorem
Schema H: word over the alphabet A ∪ {*}
Instances: all words which are equal to H at the fixed positions
Order o(H): number of fixed elements
Defining length δ(H): segment length between the outermost fixed positions
Example: A = {a, b, c}, H = (b, *, c, a, *, *); o(H) = 3, δ(H) = 4 - 1 = 3
    Instances: (b, a, c, a, a, a), (b, b, c, a, b, c), (b, c, c, a, c, a)
Premises: infinitely large population, single-point crossover, punctual mutation
Which instances survive (stay instances of the schema)? Exponential propagation if
    Selection: more than average fitness
    Recombination: short defining length
    Mutation: few fixed positions
Schema theorem (m(H, t): number of instances of H in generation t; f(H): average fitness of the instances; f̄: average population fitness; l: string length; p_c, p_m: crossover and mutation probability):
    E[m(H, t+1)] ≥ m(H, t) · (f(H) / f̄) · [1 - p_c · δ(H) / (l - 1) - o(H) · p_m]
Building blocks: gene groups responsible for the increased fitness, conglomerated as compactly as possible
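The schema notions above are directly computable; a small sketch, using "*" as the wildcard symbol:

```python
# Sketch of the schema notions above: '*' is the wildcard; the order o(H)
# counts fixed positions; the defining length delta(H) spans the outermost ones.
def order(H):
    return sum(1 for c in H if c != "*")

def defining_length(H):
    fixed = [i for i, c in enumerate(H) if c != "*"]
    return fixed[-1] - fixed[0] if fixed else 0

def is_instance(word, H):
    """A word is an instance of H if it matches H at every fixed position."""
    return len(word) == len(H) and all(h in ("*", w) for w, h in zip(word, H))

H = ("b", "*", "c", "a", "*", "*")       # the example schema from the slide
```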
Applications
Can easily be run in parallel
In combination with gradient algorithms (hill-climbing): maximum search after a rough restriction of the search space
Simulation of living cells
Production systems as an extension to expert systems
Planning optimisation (storage, production processes, ...)
Optimal game strategies
Travelling Salesman Problem: the structure contains the indices of the nodes in visiting order; to visit each node exactly once, the genetic operators are modified
Evolution of the structure of neural nets: representation organised in segments depending on the number of output neurons; codes the number of layers, hidden neurons and the corresponding weights
Overview
Motivation
Nature as a Standard
Genetic Algorithms
➔ Genetic Programming
    Representation of the Hypothesis
    Differences to GA
    Applications
Evolutionary Strategies
Conclusion
Genetic Programming
John R. Koza, 1989
Further development of the idea of genetic algorithms
Genetic creation and optimisation of computer programs for specific problem areas
    Representation of the Hypothesis
    Differences to GA
    Applications
Representation of the Hypothesis
Computer program as a tree structure (like a parse tree, LISP syntax)
Combining elements: definition of terminals and functions
    Arithmetic expressions: {PLUS2, MINUS2, MULT2, DIV2}
    Functions: {SIN1, COS1, EXP2, LOG2, ...}
    Relations, conditional statements: {LESS2, EQUAL2, IF-THEN-ELSE3, ...}
    Problem related: {TURN-LEFT, PICK-UP, MOVE-RANDOM, ...}
Tree structure:
            IF-THEN-ELSE
           /      |      \
        LESS    MULT     ADD
        /  \    /  \    /  \
       A    B  A    C  B    C
LISP syntax: ( IF-THEN-ELSE ( LESS A B ) ( MULT A C ) ( ADD B C ) )
Requirements: closed under composition; complete with respect to the problem to be solved
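The tree above can be represented and evaluated directly; a minimal sketch using nested tuples to mirror the LISP syntax (the function table is limited to the operators of this one example):

```python
# Sketch of evaluating the tree/LISP representation above; trees are nested
# tuples mirroring ( IF-THEN-ELSE ( LESS A B ) ( MULT A C ) ( ADD B C ) ).
FUNCS = {
    "ADD":  lambda a, b: a + b,
    "MULT": lambda a, b: a * b,
    "LESS": lambda a, b: a < b,
}

def evaluate(node, env):
    if isinstance(node, str):                # terminal: variable lookup
        return env[node]
    op, *args = node
    if op == "IF-THEN-ELSE":                 # special form: only one branch runs
        cond, then, other = args
        return evaluate(then if evaluate(cond, env) else other, env)
    return FUNCS[op](*(evaluate(a, env) for a in args))

tree = ("IF-THEN-ELSE", ("LESS", "A", "B"), ("MULT", "A", "C"), ("ADD", "B", "C"))
```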
Differences to GA
Recombination:
    Exchange of arbitrarily chosen sub-trees
    Random determination of the crossing points
    Even with identical terms, mostly new structure pairs
    Both children survive
Mutation:
    Substitution of a sub-tree by a newly generated sub-tree
    Random selection of a node
    Substitution by a new term which is correctly generated out of the building blocks
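On the tuple-tree representation, the sub-tree exchange can be sketched as follows; the helper names and path encoding are illustrative, not from the slides.

```python
import random

# Sketch of GP recombination on tuple-trees: crossover swaps randomly chosen
# sub-trees between two parents; both resulting children survive.
def subtrees(t, path=()):
    """Yield (path, subtree) for every node; a path is a tuple of child indices."""
    yield path, t
    if isinstance(t, tuple):
        for i, child in enumerate(t[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(t, path, new):
    """Return a copy of t with the sub-tree at `path` replaced by `new`."""
    if not path:
        return new
    i = path[0]
    return t[:i] + (replace(t[i], path[1:], new),) + t[i + 1:]

def crossover(a, b):
    """Exchange arbitrarily chosen sub-trees; both children survive."""
    pa, sa = random.choice(list(subtrees(a)))
    pb, sb = random.choice(list(subtrees(b)))
    return replace(a, pa, sb), replace(b, pb, sa)
```

Mutation follows the same pattern: pick a random node with `subtrees`, then `replace` it with a freshly grown sub-tree instead of one taken from another parent.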
Differences to GA
Recombination, mutation: context-sensitive variation
Selection: matches the algorithmic solution of the given problem; formulated as a fitness value, e.g. a distance measure for numeric problems, or the number of successfully solved / identified cases
Copy operator: copies a GP chromosome unchanged into the next generation
Each genome is modified by only a single operator: selection between operators
Extension of terms to symbolic expressions
Genetic Programming (Example)
Minimise a Boolean function of five inputs:

Input  f | Input  f | Input  f | Input  f
00000  1 | 01000  1 | 10000  0 | 11000  0
00001  0 | 01001  0 | 10001  0 | 11001  1
00010  0 | 01010  1 | 10010  1 | 11010  1
00011  0 | 01011  1 | 10011  0 | 11011  1
00100  0 | 01100  1 | 10100  0 | 11100  0
00101  0 | 01101  1 | 10101  0 | 11101  1
00110  1 | 01110  0 | 10110  0 | 11110  0
Genetic Programming (Example Results)
1000 individuals, 1000 steps (8 minutes); starting length: 127, result lengths: 71 and 57
Applications
Ant searching for food on a minimal route
Classification of groups belonging together in complex areas, e.g. intertwined spirals
Robots searching for objects and performing precisely oriented moves
Robots following walls
Random number generator with a distribution as uniform as possible
Backwards docking of a truck with its trailer
Steering a robot arm with two joints to points in a field
Design of electronic circuits for analogue filters
Overview
Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
➔ Evolutionary Strategies
    Idea, Basic Principles
    Differences to GA
    Applications
Conclusion
Evolutionary Strategies
Ingo Rechenberg, 1964 / 1994
Adaptation of the basic mechanisms of natural evolution to technical optimisation problems in the engineering sciences
Root: evolutionary experimental methods, focussed on the physical experiment
Results of the (at that time) unorthodox methods could not be analytically founded or reproduced
    Idea, basic principles
    Differences to GA
    Applications
Development, Idea
Given: experimental equipment with variable parameters
    Mechanical: changing position by pitch and angle
    Elastic: outline by bending
    Combination of segments of different sizes
Random change of the parameters in a certain area (mostly binomially distributed: small mutations preferred)
Measuring the experimental result: if it gets worse, the changes are reverted
Repeat until the optimum is found
Representation: parameters as a real-valued vector g = (p_1, ..., p_n)
Original experiment: orthogonal pipe redirection with smallest loss
Development, Idea (2)
Differences to GA
Algorithmically representable; expandable to populations / several descendants
Representation expanded by strategy parameters: g = (p_1, ..., p_n, s_1, ..., s_n)
    The s_i describe the variance controlling the mutation spread of the corresponding parameter and can be integrated into the optimum search (adaptation of the step size)
Real-valued structures: adaptation of the genetic operators
    Mutation: numeric deviation p'_i = p_i + N_i(0, s_i), with N_i(0, s_i) a Gauss-distributed random number with average 0 and variance s_i
    Recombination: discrete (randomly copied from the one or the other parent chromosome), intermediary (averaging), local (single individuals), global (whole population)
Random selection, no proportionality to the fitness
Surplus of descendants; selection of the best for the succeeding population
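The ES operators above can be sketched as follows. Note one assumption: the slides speak of s_i as a variance, while `random.gauss` takes a standard deviation; the sketch uses s_i directly as the spread.

```python
import random

# Sketch of ES mutation and recombination: an individual g = (p_1..p_n,
# s_1..s_n) carries its own strategy parameters s_i alongside its values p_i.
def mutate(p, s):
    """p'_i = p_i + N_i(0, s_i). random.gauss takes a standard deviation;
    s_i is used directly as the spread here (an assumption, see lead-in)."""
    return [pi + random.gauss(0.0, si) for pi, si in zip(p, s)]

def recombine_discrete(a, b):
    """Discrete: each position randomly copied from one parent or the other."""
    return [random.choice(pair) for pair in zip(a, b)]

def recombine_intermediary(a, b):
    """Intermediary: average of the parental values."""
    return [(x + y) / 2 for x, y in zip(a, b)]
```

The local/global variants differ only in whether the parents are single individuals or drawn anew from the whole population per position.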
Evolutionary Strategies (Example)
Fluid storage tank with changeable shape
● Fixed volume
● Minimal surface
Evolutionary Strategies (Example Results)
100 individuals 100 generations
Applications
Pressure output of a two-phase supersonic nozzle, segments with variable diameter
Flow resistance of a jointed plate, 5 joints with 51 engaging levels (0, +, -) each
Rotation body form with little flow resistance: air plane ... dolphin spindle
Minimal-weight construction of a bow bridge
Flexion of a lens for concentration on a focus
Magic square: 30x30 with magic sum 13525
Networks with minimal lengths and a given branching degree
Conclusion
Evolutionary algorithms solve optimisation problems
The standard is natural evolution, which permanently produces new and partly improved organisms that must assert themselves in their environment
The basis is biological adaptation as a learning procedure of populations of natural organisms
Hypotheses are interpreted and evaluated by a fitness function
The hypothesis space is explored by a stochastic search: selection as a fitness-proportional procedure
New hypotheses arise by recombination and mutation, similar to the chromosomes of organisms
The representation can be bit-strings / character-strings (GA), programs as term and function trees (GP), or real-valued parameter vectors (ES)
The convergence of the algorithms is mostly very good, but not guaranteed
They also work on complex problems where other algorithms have failed or are not (yet) known