
Particle Swarm Optimization: Applications in Parameterization of Classifiers

    James Blondin

    [email protected]

    Armstrong Atlantic State University


Outline

- Introduction
- Canonical PSO Algorithm
- PSO Algorithm Example
- Classifier Optimization
- Conclusion


    Particle Swarm Optimization

Particle Swarm Optimization (PSO) is a swarm-intelligence-based, approximate, nondeterministic optimization technique.


    Optimization Techniques

Optimization techniques find the parameters that provide the maximum (or minimum) value of a target function.

[Figure: example target function plotted over 0 ≤ x ≤ 5]


    Uses of Optimization

In the field of machine learning, optimization techniques can be used to find the parameters for classification algorithms such as:

- Artificial Neural Networks
- Support Vector Machines

These classification algorithms often require the user to supply certain coefficients, which often have to be found by trial and error or exhaustive search.


Canonical PSO Algorithm


    Origins of PSO

PSO was first described by James Kennedy and Russell Eberhart in 1995.

It derives from two concepts:

- The observation of the swarming habits of animals such as birds and fish
- The field of evolutionary computation (such as genetic algorithms)


PSO Concepts

- The PSO algorithm maintains multiple potential solutions at one time
- During each iteration of the algorithm, each solution is evaluated by an objective function to determine its fitness
- Each solution is represented by a particle in the fitness landscape (search space)
- The particles "fly" or "swarm" through the search space to find the maximum value returned by the objective function (a small code sketch of a particle follows below)
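To make the particle idea concrete, here is a minimal Python sketch of a one-dimensional particle; the slides give no code, so the class and field names are illustrative:

    import random

    class Particle:
        """One candidate solution moving through the search space."""
        def __init__(self, lower, upper):
            # Start at a random position in the search interval, not moving yet.
            self.position = random.uniform(lower, upper)
            self.velocity = 0.0
            # Best position (and its fitness) this particle has seen so far.
            self.best_position = self.position
            self.best_fitness = float("-inf")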


    Fitness Landscape

[Figure: particles plotted on the fitness landscape; x-axis: Solution, y-axis: Fitness]


    Canonical PSO Algorithm

    The PSO algorithm consists of just three steps:

    1. Evaluate fitness of each particle

    2. Update individual and global bests

    3. Update velocity and position of each particle

These steps are repeated until some stopping condition is met (a code sketch of this loop follows below).
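Read as code, the loop might look like the following Python sketch, reusing the illustrative Particle class from the earlier slide (the coefficient defaults and helper names are assumptions, not from the slides):

    import random

    def pso(particles, f, max_iterations, w=0.75, c1=2.0, c2=2.0):
        """Minimal PSO main loop: particles is a list of Particle objects,
        f is the objective function to maximize."""
        g_best, g_best_fitness = None, float("-inf")
        for _ in range(max_iterations):
            # 1. Evaluate fitness of each particle
            for p in particles:
                fitness = f(p.position)
                # 2. Update individual and global bests
                if fitness > p.best_fitness:
                    p.best_fitness, p.best_position = fitness, p.position
                if fitness > g_best_fitness:
                    g_best_fitness, g_best = fitness, p.position
            # 3. Update velocity and position of each particle
            #    (the update equation is detailed on the next slides)
            for p in particles:
                r1, r2 = random.random(), random.random()
                p.velocity = (w * p.velocity
                              + c1 * r1 * (p.best_position - p.position)
                              + c2 * r2 * (g_best - p.position))
                p.position += p.velocity
        return g_best, g_best_fitness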


    Velocity Update

Each particle's velocity is updated using this equation:

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

- i is the particle index
- w is the inertial coefficient
- c1, c2 are acceleration coefficients, 0 ≤ c1, c2 ≤ 2
- r1, r2 are random values (0 ≤ r1, r2 ≤ 1), regenerated on every velocity update


    Velocity Update

Each particle's velocity is updated using this equation (a worked example follows below):

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

- v_i(t) is the particle's velocity at time t
- x_i(t) is the particle's position at time t
- x̂_i(t) is the particle's individual best solution as of time t
- g(t) is the swarm's best solution as of time t
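As a quick worked example with hypothetical numbers (not from the slides): suppose w = 0.75 and c1 = c2 = 2, and at time t a particle has v_i(t) = 1.0, x_i(t) = 2.0, x̂_i(t) = 2.5, and g(t) = 4.0, with random draws r1 = 0.5 and r2 = 0.25. Then

v_i(t+1) = 0.75(1.0) + 2(0.5)(2.5 − 2.0) + 2(0.25)(4.0 − 2.0) = 0.75 + 0.5 + 1.0 = 2.25

so the particle accelerates toward the global best.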


Velocity Update: Inertia Component

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

The inertia component is the w·v_i(t) term:

- Keeps the particle moving in the same direction it was originally heading
- Inertia coefficient w is usually between 0.8 and 1.2
- Lower values speed up convergence; higher values encourage exploring the search space


Velocity Update: Cognitive Component

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

The cognitive component is the c1·r1·[x̂_i(t) − x_i(t)] term:

- Acts as the particle's memory, causing it to return to its individual best regions of the search space
- Cognitive coefficient c1 is usually close to 2
- The coefficient limits the size of the step the particle takes toward its individual best x̂_i


Velocity Update: Social Component

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

The social component is the c2·r2·[g(t) − x_i(t)] term:

- Causes the particle to move to the best regions the swarm has found so far
- Social coefficient c2 is usually close to 2
- The coefficient limits the size of the step the particle takes toward the global best g


Position Update

Each particle's position is updated using this equation:

x_i(t+1) = x_i(t) + v_i(t+1)
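Both updates fit in a few lines of numpy; a minimal sketch, with illustrative names and the coefficient values left as arguments:

    import numpy as np

    def update_particle(v, x, x_best, g_best, w=0.75, c1=2.0, c2=2.0):
        """One velocity-and-position update for a single particle; the
        arguments may be scalars or numpy arrays (one entry per dimension)."""
        r1 = np.random.uniform()   # fresh random draws on every update
        r2 = np.random.uniform()
        v_next = w * v + c1 * r1 * (x_best - x) + c2 * r2 * (g_best - x)
        x_next = x + v_next
        return v_next, x_next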


PSO Algorithm Example


PSO Algorithm Redux

Repeat until the stopping condition is met:

    1. Evaluate fitness of each particle

    2. Update individual and global bests

    3. Update velocity and position of each particle


    Fitness Evaluation (t=1)

[Figure: fitness of each particle on the landscape at t = 1]


    Update Individual / Global Bests (t=1)

[Figure: updated individual and global bests at t = 1]


Update Velocity and Position (t=1)

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

[Figure: new velocities and positions of the swarm at t = 1]


    Fitness Evaluation (t=2)

[Figure: fitness of each particle on the landscape at t = 2]


    Update Individual / Global Bests (t=2)

[Figure: updated individual and global bests at t = 2]


Update Velocity and Position (t=2)

v_i(t+1) = w·v_i(t) + c1·r1·[x̂_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)]

[Figure: velocity components for each particle at t = 2. Legend: inertia in yellow, social in cyan, cognitive in the particle's own color, total in black]


Classifier Optimization


Support Vector Machines

Support Vector Machines (SVMs) are a group of machine learning techniques used to classify data.

- Effective at classifying even non-linear datasets
- Slow to train
- When being trained, they require the specification of parameters which can greatly enhance or impede the SVM's effectiveness


Support Vector Machine Parameters

One specific type of SVM, a Cost-based Support Vector Classifier (C-SVC), requires two parameters:

- Cost parameter (C), which is typically anywhere between 2^−5 and 2^20
- Gamma parameter (γ), which is typically anywhere between 2^−20 and 2^3


Support Vector Machine Parameters

For different datasets, the optimal values for these parameters can be very different, even on the same type of C-SVC.

To find the optimal parameters, two approaches are often used:

- Random selection
- Grid search (a code sketch follows below)
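As an illustration of the grid-search baseline, here is a minimal sketch using scikit-learn's SVC with cross-validated accuracy; the slides do not name a library, so scikit-learn and the grid spacing are assumptions (the ranges follow the previous slide):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def grid_search(X, y):
        """Score a coarse log2-spaced grid of (C, gamma) pairs and
        return the best pair with its cross-validated accuracy."""
        best_score, best_params = -np.inf, None
        for log2_c in range(-5, 21, 5):           # C from 2^-5 to 2^20
            for log2_gamma in range(-20, 4, 5):   # gamma from 2^-20 up toward 2^3
                C, gamma = 2.0 ** log2_c, 2.0 ** log2_gamma
                score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
                if score > best_score:
                    best_score, best_params = score, (C, gamma)
        return best_params, best_score

Each (C, gamma) pair costs one training run, which is why the grid's size drives the training-run counts reported in the results later.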


    Financial Data


    DNA Splicing Data


Applying PSO to SVM Parameters

Alternatively, PSO can be used to parameterize SVMs, using an SVM training run as the objective function.

Implementation considerations (see the sketch below):

- The search is over two dimensions (as opposed to just one, as in the example)
- Parameters less than zero are invalid, so position updates should not move a parameter below zero
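A minimal sketch of that setup, again assuming scikit-learn (the function names and the clamping floor are illustrative):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def svm_fitness(position, X, y):
        """Objective function: cross-validated accuracy of a C-SVC
        trained with the (C, gamma) pair stored in a particle's position."""
        C, gamma = position
        return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

    def clamp_position(position, floor=1e-10):
        """Keep both parameters strictly positive after a position update."""
        return np.maximum(position, floor)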


PSO Parameters

Parameters used for the PSO algorithm (expressed as a configuration sketch below):

- Number of particles: 8
- Inertia coefficient (w): 0.75
- Cognitive coefficient (c1): 1.8
- Social coefficient (c2): 2
- Number of iterations: 10 (or stop after no improvement for 4 consecutive iterations)
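For concreteness, those settings and the stopping rule might be expressed like this (a hypothetical configuration mirroring the slide's values):

    # PSO settings from the slide, plus the early-stopping rule.
    config = {
        "num_particles": 8,
        "w": 0.75,
        "c1": 1.8,
        "c2": 2.0,
        "max_iterations": 10,
        "patience": 4,  # stop early after 4 iterations without improvement
    }

    def should_stop(iteration, stagnant_iterations, cfg=config):
        return (iteration >= cfg["max_iterations"]
                or stagnant_iterations >= cfg["patience"])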


Preliminary Results

                          DNA Splicing   Financial
  Grid search
    Num. Training Runs        110           144
    Max. Accuracy             95.7%         77.8%
  PSO
    Num. Training Runs         56            72
    Max. Accuracy             96.1%         77.7%


Conclusion


Conclusions and Future Work

Conclusions:

- Significant speedup using PSO over exhaustive search
- Additional testing needed

Future Work:

- Other PSO variants can be tried
- Need to find optimal parameters for PSO itself


Questions?