Generative NeuroEvolution for Deep Learning
Phillip Verbancsics & Josh Harguess, Space and Naval Warfare Systems Center, 2013

Neuroevolution and deep learning

Description: Mingling evolution and learning


Page 1: Neuroevolution and deep learning

Generative NeuroEvolution for Deep Learning

Phillip Verbancsics & Josh Harguess, Space and Naval Warfare Systems Center

2013

Page 2: Neuroevolution and deep learning

Introduction

• Evolution and lifetime learning combine to create the capabilities of the animal brain
  – This work investigates the role of neuroevolution in deep learning
• Neuroevolution: trains the feature extractor
• Deep learning: learns from the extracted features

Page 3: Neuroevolution and deep learning

Neuro-Evolution

• What is HyperNEAT (Hypercube-based NeuroEvolution of Augmenting Topologies)?
• Weights of the ANN are generated as a function of geometry (see the sketch below)
• Evolves both the topology and weights of a network to maximize performance
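To make the idea concrete, here is a minimal sketch of generating a connection weight from the geometric coordinates of two neurons. A real HyperNEAT CPPN is an evolved network; the fixed function and parameters below are illustrative stand-ins, not the paper's implementation.

```python
import numpy as np

def cppn_weight(x1, y1, x2, y2, params):
    """Toy CPPN: maps the coordinates of a source and target neuron
    to a connection weight. An evolved CPPN would replace this fixed
    composition of functions."""
    w1, w2, b = params                                  # illustrative "evolved" parameters
    h = np.tanh(w1 * (x1 - x2) + w2 * (y1 - y2) + b)    # geometry-sensitive hidden unit
    return np.sin(h)                                    # weight as a function of geometry

# Generate all weights between two 3x3 layers from geometry alone.
params = (0.8, -0.5, 0.1)
coords = [(x, y) for x in range(3) for y in range(3)]
W = np.array([[cppn_weight(x1, y1, x2, y2, params)
               for (x2, y2) in coords] for (x1, y1) in coords])
print(W.shape)  # (9, 9): one weight per source/target neuron pair
```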

Page 4: Neuroevolution and deep learning

Deep Learning

• An approach that lets an ANN learn multiple levels of representation, corresponding to different levels of abstraction
• A CNN learns features based upon locality (illustrated below)
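As a brief illustration of locality (a sketch, not from the paper): in a convolutional layer each output value depends only on a small patch of the input, unlike a fully connected layer where every output sees every pixel.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Minimal 2D convolution (cross-correlation form, as in most
    deep-learning libraries): each output pixel is computed from a
    local patch of the input the size of the kernel."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.random.rand(28, 28)             # MNIST-sized input
edge = np.array([[1., 0., -1.]] * 3)       # simple 3x3 edge detector
print(conv2d_valid(image, edge).shape)     # (26, 26)
```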

Page 5: Neuroevolution and deep learning

Main algorithm

Page 6: Neuroevolution and deep learning

Experimental Set-up

• MNIST data set
  – 60,000 training and 10,000 test images (28 x 28 pixels)
  – 10 classes (0-9)
• HyperNEAT in four flavors: two architectures, each in two roles
  – HyperNEAT with a traditional ANN architecture
  – HyperNEAT with a CNN architecture
  – Learning to classify images by itself: an ANN for image classification
  – Acting as a feature learner: an ANN that transforms images into features

Page 7: Neuroevolution and deep learning

Experimental Set-up

• Architecture of the HyperNEAT substrate
  – Multiple layers stacked along the z-axis
  – Each layer holds features and is described by a triple (X, Y, F)
    • F is the number of features
    • X, Y are the pixel dimensions
  – The CPPN is queried for the weight of each neuron connection (see the sketch below)
    • Each neuron is located at a particular (x, y, f, z) coordinate
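A sketch of how the substrate's weights could be enumerated, using two example layer shapes from the next slide and a fixed stand-in for the evolved CPPN: every pair of neurons in adjacent layers is identified by its (x, y, f, z) coordinates, and the CPPN is queried once per connection.

```python
import itertools
import numpy as np

def query_cppn(src, dst):
    """Stand-in for the evolved CPPN: maps the 4-D coordinates of a
    source and target neuron to a connection weight (illustrative)."""
    v = np.array(src) - np.array(dst)
    return float(np.tanh(v @ np.array([0.7, -0.3, 0.5, 0.2])))

def layer_coords(X, Y, F, z):
    """All (x, y, f, z) neuron coordinates for a layer described by (X, Y, F)."""
    return [(x, y, f, z) for x, y, f in itertools.product(range(X), range(Y), range(F))]

# Two adjacent layers of the substrate: (8, 8, 3) at z=0 and (6, 6, 8) at z=1.
src_layer = layer_coords(8, 8, 3, z=0)
dst_layer = layer_coords(6, 6, 8, z=1)
weights = {(s, d): query_cppn(s, d) for s in src_layer for d in dst_layer}
print(len(weights))  # 192 * 288 = 55296 connection weights
```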

Page 8: Neuroevolution and deep learning

Experimental Set-up

• HyperNEAT with traditional ANN architecture
  – A seven-layer neural network: 1 input, 1 output, and 5 hidden layers
  – Layer shapes: (28; 28; 1), (16; 16; 3), (8; 8; 3), (6; 6; 8), (3; 3; 8), (1; 1; 100), (1; 1; 64), and (1; 1; 10)
  – Each layer is fully connected to the adjacent layers, and each neuron has a bipolar sigmoid activation function (see the sketch below)
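A minimal forward-pass sketch of this fully connected substrate. The random weights are stand-ins for CPPN-generated ones, and the 2/(1+e^-x) - 1 form of the bipolar sigmoid is an assumption about the exact variant used.

```python
import numpy as np

def bipolar_sigmoid(x):
    """Bipolar sigmoid: squashes activations into (-1, 1)."""
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

# Layer shapes (X; Y; F) from the slide, flattened to neuron counts.
shapes = [(28, 28, 1), (16, 16, 3), (8, 8, 3), (6, 6, 8),
          (3, 3, 8), (1, 1, 100), (1, 1, 64), (1, 1, 10)]
sizes = [x * y * f for x, y, f in shapes]

rng = np.random.default_rng(0)
# Fully connected weights between adjacent layers (random here; HyperNEAT
# would generate each entry by querying the CPPN with neuron coordinates).
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes, sizes[1:])]

def forward(image_flat):
    a = image_flat
    for W in weights:
        a = bipolar_sigmoid(a @ W)
    return a  # 10 outputs, one per digit class

print(forward(rng.random(784)).shape)  # (10,)
```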

Page 9: Neuroevolution and deep learning

Experimental Set-up

• HyperNEAT with CNN architecture
  – A replication of the LeNet-5 architecture (outlined below)
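For reference, the classic LeNet-5 stages (LeCun et al., 1998) that the CNN substrate mirrors; these sizes are the standard published ones, not taken from this deck.

```python
# Classic LeNet-5 stages, which the CNN substrate replicates.
lenet5 = [
    ("input",  "32x32x1 image (MNIST digits padded from 28x28)"),
    ("C1",     "convolution, 6 feature maps, 5x5 kernels -> 28x28x6"),
    ("S2",     "subsampling (2x2 pooling) -> 14x14x6"),
    ("C3",     "convolution, 16 feature maps, 5x5 kernels -> 10x10x16"),
    ("S4",     "subsampling (2x2 pooling) -> 5x5x16"),
    ("C5",     "convolution / fully connected, 120 units"),
    ("F6",     "fully connected, 84 units"),
    ("output", "10 units, one per digit class"),
]
for name, desc in lenet5:
    print(f"{name:>6}: {desc}")
```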

Page 10: Neuroevolution and deep learning

Experimental Set-up

• To act as a feature extractor
  – HyperNEAT with traditional ANN architecture: the (1; 1; 100) layer becomes the new output layer
  – HyperNEAT with CNN architecture: the (1; 1; 120) layer becomes the new output layer
• The feature vectors are given to backpropagation (BP), which trains the modified network (sketched below)
• After evolution completes, the generation champions are evaluated on the MNIST test set (10,000 images)
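A sketch of the feature-extractor pipeline under simple assumptions: random substrate weights stand in for evolved ones, and a softmax classifier trained by gradient descent stands in for the BP-trained network.

```python
import numpy as np

rng = np.random.default_rng(1)

def extract_features(images, W_sub):
    """Truncated substrate: map flattened images to a 100-D feature layer
    (weights would come from the evolved CPPN; random here)."""
    return np.tanh(images @ W_sub)

def train_softmax(feats, labels, classes=10, lr=0.1, iters=250):
    """Simple softmax classifier trained by gradient descent, standing in
    for the backprop-trained network that consumes the features."""
    W = np.zeros((feats.shape[1], classes))
    onehot = np.eye(classes)[labels]
    for _ in range(iters):
        logits = feats @ W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * feats.T @ (p - onehot) / len(feats)
    return W

# Toy stand-ins for 300 MNIST training images (28*28 = 784 pixels).
images = rng.random((300, 784))
labels = rng.integers(0, 10, 300)
W_sub = rng.normal(0, 0.05, (784, 100))     # feature layer of width 100

feats = extract_features(images, W_sub)
W_clf = train_softmax(feats, labels)
preds = (feats @ W_clf).argmax(axis=1)
print("training accuracy:", (preds == labels).mean())
```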

Page 11: Neuroevolution and deep learning

Results

• Results were averaged over 30 runs
  – 2,500 generations each
  – Population size of 256
• The fitness score is the sum of (sketched after this list):
  – True positive rate
  – True negative rate
  – Positive predictive value
  – Negative predictive value
  – Accuracy for each class, plus the fraction correctly classified
  – Inverse of the mean squared error from the correct label output
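A sketch of how such a fitness score could be assembled from per-class confusion statistics; the exact weighting and the bounded form of the inverse-MSE term are assumptions, since the slide only lists the components.

```python
import numpy as np

def fitness(outputs, targets_onehot):
    """Sum the listed components: per-class TPR, TNR, PPV, NPV and accuracy,
    the overall fraction correct, and an inverse-MSE term.
    (How the paper combines these exactly is not specified on the slide.)"""
    preds = outputs.argmax(axis=1)
    labels = targets_onehot.argmax(axis=1)
    score = (preds == labels).mean()            # fraction correctly classified
    eps = 1e-12                                 # avoid division by zero
    for c in range(targets_onehot.shape[1]):
        tp = np.sum((preds == c) & (labels == c))
        tn = np.sum((preds != c) & (labels != c))
        fp = np.sum((preds == c) & (labels != c))
        fn = np.sum((preds != c) & (labels == c))
        score += tp / (tp + fn + eps)           # true positive rate
        score += tn / (tn + fp + eps)           # true negative rate
        score += tp / (tp + fp + eps)           # positive predictive value
        score += tn / (tn + fn + eps)           # negative predictive value
        score += (tp + tn) / len(labels)        # per-class accuracy
    mse = np.mean((outputs - targets_onehot) ** 2)
    score += 1.0 / (1.0 + mse)                  # inverse MSE (bounded form assumed)
    return score

rng = np.random.default_rng(2)
outputs = rng.random((100, 10))
targets = np.eye(10)[rng.integers(0, 10, 100)]
print(fitness(outputs, targets))
```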

Page 12: Neuroevolution and deep learning

Results

• HyperNEAT as non-feature extractor
  – Fitness is determined by applying the ANN substrate directly to the training images
• HyperNEAT as feature extractor
  – Fitness is the test performance of the BP-trained network
  – Trained for 250 BP iterations on 300 training images
  – Tested on 1,000 training images

Page 13: Neuroevolution and deep learning

Results

HyperNEAT with traditional ANN architecture

Page 14: Neuroevolution and deep learning

Results

HyperNEAT with CNN architecture

Page 15: Neuroevolution and deep learning