
Page 1: Human Guided Forests (HGF)

Su lab meeting March 30, 2012

Benjamin Good

HUMAN GUIDED FORESTS

Page 2: Human Guided Forests (HGF)

1. Motivation

2. Idea

3. Game

AGENDA

Page 3: Human Guided Forests (HGF)

Need to build biological class predictors that:

1. Have high accuracy

2. Use relatively few variables

To do this we have to use datasets that:

3. Are very noisy

4. Contain enormous numbers of variables

CHALLENGE

Page 4: Human Guided Forests (HGF)

van 't Veer et al. (2002) Nature

98 breast cancer samples:

• 34 developed metastases within 5 years, 44 did not

• 18 had BRCA1 mutations, 2 had BRCA2 mutations

• expression levels of ~25,000 genes measured

• ~5,000 genes "significantly regulated across the sample groups"

EXAMPLE: BREAST CANCER PROGNOSIS

Page 5: Human Guided Forests (HGF)

[Figure: clustered heatmap of ~5,000 genes across 98 tumors. One tumor cluster is ~70% poor-outcome, the other ~30%. Annotated gene clusters: genes that co-regulate with ER, and co-regulated genes indicating lymphocytic infiltrate.]

Page 6: Human Guided Forests (HGF)

231 genes were found to be significantly associated with disease outcome.

Using leave-one-out cross-validation, they empirically selected the 70 best individual genes to build their predictor.

Of the 78 samples in the training set, the predictor correctly classified 65 (83%).

-> Became the MammaPrint test from Agendia; still in clinical trials (MINDACT), 10 years after the original study.

METASTASIS PREDICTOR

Page 7: Human Guided Forests (HGF)

This signature does not take advantage of:

• interactions between genes (together, two variables may be much more predictive than either one alone)

• biological knowledge (the signature leaves out several known cancer predictors and does not make use of biological knowledge in any way)

WE CAN DO BETTER

Page 8: Human Guided Forests (HGF)

THERE ARE MANY MANY CHALLENGES LIKE THIS IN BIOLOGY

Page 9: Human Guided Forests (HGF)

1. Motivation

2. Idea

3. Game

AGENDA

Page 10: Human Guided Forests (HGF)

The standard signature does not take advantage of:

• interactions between genes: machine learning algorithms can find and use these, but can have problems when faced with large feature spaces

• biological knowledge: this can be used to guide the machine learning process towards meaningful features in the data, and thus reduce the chance of overfitting

WE CAN DO BETTER BY INTEGRATING MACHINE LEARNING WITH BIOLOGICAL

EXPERTISE

Page 11: Human Guided Forests (HGF)

RANDOM FOREST: MACHINE LEARNING ALGORITHM OF THE MOMENT

• In each of many iterations, a small subset of features is chosen randomly and used to build one decision tree.

• Decision trees are stored, and classifications are made by majority vote of all the trees.

• Good classifier!

• But you get a different forest every time you run it, and it faces the same generalizability challenges as any other learning algorithm.
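The procedure above can be sketched on toy data. Depth-1 stumps stand in for full decision trees, and the dataset sizes and signal are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(98, 500))        # toy stand-in: 98 samples, 500 "genes"
y = (X[:, 0] > 0).astype(int)         # outcome driven by a single feature

def fit_stump(Xs, ys):
    """Fit the best single-feature threshold split (a depth-1 tree)."""
    best = (0, 0.0, 1, 0.0)           # (feature, threshold, polarity, accuracy)
    for j in range(Xs.shape[1]):
        for thr in Xs[::7, j]:        # subsample candidate thresholds
            p = (Xs[:, j] > thr).astype(int)
            for pol in (1, 0):        # pol=0 flips the prediction
                q = p if pol else 1 - p
                acc = (q == ys).mean()
                if acc > best[3]:
                    best = (j, thr, pol, acc)
    return best[:3]

# Grow the forest: each tree sees only a small random subset of features,
# fit on a bootstrap sample of the data.
forest = []
for _ in range(50):
    feats = rng.choice(X.shape[1], size=20, replace=False)
    boot = rng.choice(len(X), size=len(X), replace=True)
    j, thr, pol = fit_stump(X[np.ix_(boot, feats)], y[boot])
    forest.append((feats[j], thr, pol))

def predict(Xn):
    """Majority vote over all trees in the forest."""
    votes = []
    for j, thr, pol in forest:
        p = (Xn[:, j] > thr).astype(int)
        votes.append(p if pol else 1 - p)
    return (np.mean(votes, axis=0) > 0.5).astype(int)

train_acc = (predict(X) == y).mean()
```

Rerunning without the fixed seed gives a different forest each time, which is the reproducibility issue noted above.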

Page 12: Human Guided Forests (HGF)

NETWORK GUIDED FOREST (NGF)

Same algorithm except each tree is constructed from a particular area of a relevant protein-protein interaction network.

1) Pick a gene randomly

2) Walk out along the network to get the other N genes used to build that tree

3) Repeat

The premise is that biologically coherent modules will give better signal than individual genes randomly grouped together.

Dutkowski J, Ideker T (2011) Protein Networks as Logic Functions in Development and Cancer. PLoS Computational Biology.
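Steps 1–2 amount to a random expansion over the interaction network. A minimal sketch on a toy adjacency list (the gene names and edges here are illustrative, not a real PPI dataset):

```python
import random

# Toy "PPI network" as an adjacency dict; nodes and edges are illustrative only.
ppi = {
    "G1": ["G2", "G3"],
    "G2": ["G1", "G4"],
    "G3": ["G1", "G4", "G5"],
    "G4": ["G2", "G3", "G6"],
    "G5": ["G3"],
    "G6": ["G4"],
}

def network_module(ppi, n, rng):
    """Pick a random seed gene, then walk outward along the network
    until n connected genes are collected (the feature set for one tree)."""
    seed = rng.choice(sorted(ppi))
    module = [seed]
    frontier = list(ppi[seed])
    while len(module) < n and frontier:
        g = frontier.pop(rng.randrange(len(frontier)))   # random neighbour
        if g not in module:
            module.append(g)
            frontier.extend(x for x in ppi.get(g, []) if x not in module)
    return module

rng = random.Random(0)
module = network_module(ppi, 4, rng)   # one 4-gene set for tree building
```

Repeating the call with different seeds yields the per-tree feature sets; by construction every gene in a module is reachable from the seed, which is what makes the module "biologically coherent" relative to a random draw.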

Page 13: Human Guided Forests (HGF)

NGF RESULTS

A) Identical performance to random forest and random network-guided forest, as assessed by 5-fold cross-validation repeated 100 times.

B) More known breast cancer genes show up in the forest

C) Similar genes selected for forests in two different training sets (different patient cohorts)

Page 14: Human Guided Forests (HGF)

HUMAN GUIDED RANDOM FOREST (HGF)

Same algorithm again, except each tree is constructed from a manually selected subset of genes (or other features).

1) Find a person

2) Let them select what they think is an optimal feature set

3) Back to step one, N times

4) Aggregate

The premise is that biological knowledge can produce better-than-random decision modules, and that not all biological knowledge is captured in interaction networks.
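The human input of step 2 plugs into the same forest machinery: each submitted feature set ("hand") defines one classifier, and step 4 is a majority vote. A sketch where the hands, gene names, and data are all invented, and a nearest-centroid rule stands in for each decision tree:

```python
import numpy as np

rng = np.random.default_rng(1)
genes = [f"gene{i}" for i in range(30)]
col = {g: i for i, g in enumerate(genes)}      # gene name -> data column
X = rng.normal(size=(60, 30))
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # toy outcome

# Hypothetical human-submitted hands (step 2), one per contribution.
hands = [["gene0", "gene1"], ["gene5", "gene9"], ["gene0", "gene2", "gene7"]]

def hand_vote(hand, X, y):
    """One classifier per hand: nearest-centroid on the hand's columns,
    fit and evaluated on the training data (a stand-in for a tree)."""
    cols = [col[g] for g in hand]
    c0 = X[y == 0][:, cols].mean(axis=0)
    c1 = X[y == 1][:, cols].mean(axis=0)
    d0 = np.linalg.norm(X[:, cols] - c0, axis=1)
    d1 = np.linalg.norm(X[:, cols] - c1, axis=1)
    return (d1 < d0).astype(int)

# Step 4: aggregate the per-hand classifiers by majority vote.
votes = np.stack([hand_vote(h, X, y) for h in hands])
pred = (votes.mean(axis=0) > 0.5).astype(int)
acc = (pred == y).mean()
```

The vote can outweigh a weak hand: here one noise hand is outvoted by the two hands that contain informative genes.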

Page 15: Human Guided Forests (HGF)

1) Find a person

2) Let them select what they think is an optimal feature set

3) back to step one, N times

• N may be large (e.g. 1,000)

• Need many knowledgeable people to work hard... for free

HGF CHALLENGES

Page 16: Human Guided Forests (HGF)

1. Motivation

2. Idea

3. Game

AGENDA

Page 17: Human Guided Forests (HGF)

COMBO! COMPONENTS

Java server
• hosts training data
• executes decision tree algorithm
• performs cross-validation tests
• logs data generated by game

Game interface(s)
• manages game events: users, moves, etc.
• two implementations so far:
  • server side (JSP) card game
  • client side (javascript) table game

Server provides features for games (e.g. gene cards); client sends groups of features ('hands') to the server for scoring.
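The server's scoring step could look like the sketch below: a submitted hand is reduced to data columns and scored by leave-one-out cross-validation. All names and data are invented, and a nearest-centroid rule stands in for the server's decision-tree scorer:

```python
import numpy as np

def score_hand(X, y, cols):
    """Leave-one-out CV accuracy for a classifier restricted to `cols`
    (nearest-centroid as a stand-in for the decision-tree scorer)."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i                   # hold out sample i
        c0 = X[mask & (y == 0)][:, cols].mean(axis=0)
        c1 = X[mask & (y == 1)][:, cols].mean(axis=0)
        d0 = np.linalg.norm(X[i, cols] - c0)
        d1 = np.linalg.norm(X[i, cols] - c1)
        correct += int(d1 < d0) == y[i]
    return correct / n

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 20))
y = rng.integers(0, 2, size=60)
X[:, 0] += 3 * y                                   # make column 0 informative

informative = score_hand(X, y, [0])                # a "good hand"
noise = score_hand(X, y, [7])                      # a "bad hand"
```

A good hand scores well above a noise hand, which is exactly the feedback signal the game needs to return to players.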

Page 18: Human Guided Forests (HGF)

COMBO CODE AND DEMO

Next steps

1. Better preprocessing of training data
  • map contigs to genes where possible; filter out clearly useless genes
  • identify individually predictive genes

2. Game
  • build domain-specific boards; let players pick their knowledge area
  • real two-player feeling with a robot partner
  • special cards: robber card, any-gene selector card
  • high scores
  • ??????????????