ECE-175 Elements of Machine Intelligence Nuno Vasconcelos ECE Department, UCSD



Page 1

ECE-175 Elements of Machine Intelligence

Nuno Vasconcelos, ECE Department, UCSD

Page 2

The course

The course will cover the most important aspects of machine learning and pattern recognition. We will cover a lot of ground; at the end of the quarter you will know how to implement many things that may seem very complicated today.

Homework: dependent on grading help; if we have it, it will count for 20%

Exams: 1 mid-term (date TBA) – 35%; 1 final – 45% (covers everything)

Page 3

Homework policies

Homework is individual. It is OK to work on problems with someone else, but you have to:
• write your own solution
• write down the names of those you collaborated with

Homework is due one week after it is issued. No exceptions. All grading will be with respect to the class average:
• do not get discouraged if a problem is hard, just put some extra effort into solving it. It is your opportunity to shine…
• if someone else is cheating, your grades will suffer. It is going to hurt you, not me or the TA

Page 4

Resources

Course web page: http://www.svcl.ucsd.edu/~courses/ece175/
• all handouts, problem sets, and code will be available there

Me:
• Nuno Vasconcelos, [email protected], EBU 1-5602
• office hours: Thursday, 6:30-7:30 PM

My assistant:
• Priscilla Haase ([email protected]), EBU1-3401, may sometimes be involved in administrative issues

Page 5

Texts

Required:
• “Introduction to Machine Learning”, Ethem Alpaydin, MIT Press, 2004

Various other good, but optional, texts:
• “Pattern Classification”, Duda, Hart, Stork, Wiley, 2001
• “The Elements of Statistical Learning”, Hastie, Tibshirani, Friedman, 2001
• “Bayesian Data Analysis”, Gelman, Carlin, Stern, Rubin, 2003

Stuff you must know inside out:
• “Linear Algebra”, Gilbert Strang, 1988
• “Fundamentals of Applied Probability”, Drake, McGraw-Hill, 1967

Page 6

The course

Why machine learning? There are many processes in the world that are ruled by deterministic equations
• e.g. f = ma; linear systems and convolution, Fourier analysis, various chemical laws
• there may be some “noise”, “error”, or “variability”, but we can live with those
• we don’t need statistical learning

Learning is needed when
• there is a need for predictions about variables in the world, Y
• that depend on factors (other variables) X
• in a way that is impossible or too difficult to derive an equation for.

Page 7

Examples

Data-mining view:
• large amounts of data that do not follow deterministic rules
• e.g. given a history of thousands of customer records and some questions that I can ask you, how do I predict whether you will pay on time?
• impossible to derive a theorem for this; it must be learned

While many associate learning with data-mining, it is by no means the only or most important application.

Signal processing view:
• signals combine in ways that depend on “hidden structure” (e.g. speech waveforms depend on language, grammar, etc.)
• signals are usually subject to significant amounts of “noise” (which sometimes means “things we do not know how to model”)

Page 8

Examples (cont’d)

Signal processing view:
• e.g. the cocktail party problem
• although there are all these people talking, I can figure everything out
• how do I build a chip to separate the speakers?
• model the hidden dependence as
  • a linear combination of independent sources
  • noise
• many other examples in the areas of wireless, communications, signal restoration, etc.

Page 9

Examples (cont’d)

Perception/AI view:
• it is a complex world; I cannot model everything in detail
• rely on probabilistic models that explicitly account for the variability
• use the laws of probability to make inferences, e.g.
  • P( burglar | alarm, no earthquake ) is high
  • P( burglar | alarm, earthquake ) is low
• a whole field studies “perception as Bayesian inference”
• perception really just confirms what you already know
• priors + observations = robust inference
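The burglar/alarm inference above can be sketched with Bayes’ rule; all the probability values below are made-up numbers chosen purely for illustration.

```python
# Illustrative Bayesian inference for the burglar/alarm example.
# All probabilities are assumed values, not taken from any real model.

def posterior_burglar(p_b, p_a_given_b, p_a_given_not_b):
    """P(burglar | alarm) by Bayes' rule."""
    p_alarm = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
    return p_a_given_b * p_b / p_alarm

# No earthquake: false alarms are rare, so the alarm strongly implies a burglar.
no_quake = posterior_burglar(p_b=0.01, p_a_given_b=0.95, p_a_given_not_b=0.001)

# Earthquake: quakes trigger the alarm too, so the alarm is weak evidence.
quake = posterior_burglar(p_b=0.01, p_a_given_b=0.95, p_a_given_not_b=0.4)

print(no_quake)  # high posterior
print(quake)     # low posterior
```

The same observation (alarm) yields very different posteriors once the prior context (earthquake or not) changes, which is the “priors + observations” point.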

Page 10

Examples (cont’d)

Communications view:
• detection problems:
  • I see Y, and know something about the statistics of the channel. What was X?
• this is the canonical detection problem that appears all over learning
• for example, face detection in computer vision: “I see pixel array Y. Is it a face?”

X → channel → Y
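The canonical detection problem can be made concrete with a toy detector: X in {0,1} is sent, the channel adds Gaussian noise, and we pick the X that best explains Y. The noise level and priors are illustrative assumptions, not part of any real channel model.

```python
import math

SIGMA = 0.5  # assumed channel noise standard deviation

def likelihood(y, x):
    """Gaussian channel model p(y | x), up to a constant factor."""
    return math.exp(-(y - x) ** 2 / (2 * SIGMA ** 2))

def detect(y, prior_1=0.5):
    """MAP detector: pick the x maximizing p(y | x) P(x)."""
    score0 = likelihood(y, 0) * (1 - prior_1)
    score1 = likelihood(y, 1) * prior_1
    return 1 if score1 > score0 else 0

print(detect(0.9))  # y close to 1 -> decide X = 1
print(detect(0.1))  # y close to 0 -> decide X = 0
```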

Page 11

Statistical learning

Goal: given a function

y = f(x),    x → f(·) → y

and a collection of example data-points, learn what the function f(·) is. This is called training.

Two major types of learning:
• unsupervised: only X is known; usually referred to as clustering
• supervised: both are known during training, only X is known at test time; usually referred to as classification or regression

Page 12

Supervised learning

X can be anything, but the type of Y dictates the type of supervised learning problem:
• Y in {0,1}: referred to as detection
• Y in {0, ..., M-1}: referred to as classification
• Y real: referred to as regression

The theory is quite similar, and the algorithms are similar most of the time. We will emphasize classification, but will talk about regression when it is particularly insightful.

Page 13

Example

Classifying fish:
• fish roll down a conveyor belt
• a camera takes a picture
• goal: is this a salmon or a sea-bass?

Q: what is X? What features do I use to distinguish between the two fish?

This is somewhat of an art-form. Frequently, the best approach is to ask experts, e.g. “obvious! use length and scale width!”

Page 14

Classification/detection

Two major types of classifiers:
• discriminant: directly recover the decision boundary that best separates the classes
• generative: fit a probability model to each class and then “analyze” the models to find the border

A lot more on this later!

Page 15

Caution

How do we know learning worked? We care about generalization, i.e. accuracy outside the training set. Models that are too powerful can lead to over-fitting:
• e.g. in regression I can always fit exactly n points with a polynomial of order n-1
• is this good? how likely is the error to be small outside the training set?
• a similar problem exists for classification

Fundamental LAW: only test set results matter!!!
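The polynomial over-fitting point can be demonstrated directly: a degree n-1 Lagrange interpolant always passes exactly through n training points, yet can be wildly wrong away from them. The data points and the assumed underlying trend (y = x) below are made up for illustration.

```python
def lagrange_fit(xs, ys):
    """Return the degree n-1 Lagrange interpolant through the n points."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return p

# Noisy samples of an assumed underlying trend y = x.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]
p = lagrange_fit(xs, ys)

train_err = max(abs(p(x) - y) for x, y in zip(xs, ys))
print(train_err)          # zero: the fit is exact on the training points
print(abs(p(5.0) - 5.0))  # outside the training set the error blows up
```

Training error zero, test error large: exactly the situation the “fundamental LAW” warns about.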

Page 16

Generalization

Good generalization requires controlling the trade-off between training and test error:
• training error large, test error large
• training error smaller, test error smaller
• training error smallest, test error largest

This trade-off is known by many names. In the generative classification world it is usually due to the bias-variance trade-off of the class models. We will look at this in detail.

Page 17

Generative learning

Each class is characterized by a probability density function (the class-conditional density). A model is adopted, e.g. a Gaussian, and the training data is used to estimate the model parameters. Overall, the process is referred to as density estimation. The simplest example would be to use histograms.
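A minimal density-estimation sketch: fit a Gaussian model to one class by maximum likelihood, i.e. the sample mean and variance. The data here is synthetic (a made-up “fish length” distribution), purely for illustration.

```python
import math
import random

def fit_gaussian(samples):
    """Maximum-likelihood Gaussian fit: sample mean and variance."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

random.seed(0)
# Synthetic class data: assumed true mean 60, standard deviation 5.
data = [random.gauss(60, 5) for _ in range(5000)]
mu, var = fit_gaussian(data)
print(mu, math.sqrt(var))  # estimates close to the true 60 and 5
```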

Page 18

Decision rules

Given class models, Bayesian decision theory provides us with optimal rules for classification. Optimal here means, for example, minimum probability of error. We will:
• study BDT in detail
• establish connections to other decision principles (e.g. linear discriminants)
• show that Bayesian decisions are usually intuitive
• derive optimal rules for a range of classifiers
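For two classes with Gaussian class-conditional densities and known priors, the minimum-probability-of-error rule simply picks the class with the larger posterior, i.e. the larger p(x|i) P(i). A sketch with illustrative, assumed parameters:

```python
import math

def gauss(x, mu, sigma):
    """Gaussian class-conditional density p(x | class)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def decide(x, params0=(0.0, 1.0), params1=(3.0, 1.0), prior1=0.5):
    """Bayesian decision rule: pick the class with the larger p(x|i) P(i)."""
    score0 = gauss(x, *params0) * (1 - prior1)
    score1 = gauss(x, *params1) * prior1
    return 1 if score1 > score0 else 0

# With equal priors and variances, the boundary falls halfway between the means.
print(decide(0.5))  # class 0
print(decide(2.5))  # class 1
```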

Page 19

Features and dimensionality

For most of what we have seen so far:
• theory is well understood
• algorithms are available
• limitations are characterized

Finding good features, however, is usually an art-form. We will survey traditional techniques:
• principal components
• linear discriminant analysis

and some more recent methods:
• independent components
• information-theoretic methods

Page 20

Discriminant learning

Instead of learning models and deriving the boundary, derive the boundary directly. There are many methods; the simplest case is the so-called hyperplane classifier:
• simply find the hyperplane that best separates the classes
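The simplest way to find such a hyperplane directly is the classic perceptron update, sketched here on a tiny, made-up linearly separable 2-D set:

```python
def perceptron(points, labels, epochs=100):
    """Perceptron: labels in {-1, +1}; returns weights (w1, w2, bias)."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            if y * (w[0] * x1 + w[1] * x2 + w[2]) <= 0:  # misclassified
                w[0] += y * x1                            # nudge the plane
                w[1] += y * x2                            # toward the point
                w[2] += y
    return w

pts = [(0, 0), (1, 0), (3, 3), (4, 2)]
ys = [-1, -1, 1, 1]
w = perceptron(pts, ys)
errors = sum(y * (w[0] * x1 + w[1] * x2 + w[2]) <= 0
             for (x1, x2), y in zip(pts, ys))
print(errors)  # 0: all training points correctly separated
```

For separable data the perceptron is guaranteed to converge, but it stops at *any* separating hyperplane, which motivates the margin-maximizing classifiers on the next slides.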

Page 21

Support vector machines

How do we do this? The most recent classifiers are called support vector machines:
• if the classes are separable,
• the best performance is obtained by maximizing the margin
• this is the distance between the plane and the closest point on each side

Page 22

Support vector machines

Since, for separable classes, the training error can be made zero by classifying each point correctly, this can be implemented by solving the optimization problem

w* = arg max_w margin(w)
s.t. x_l classified correctly, ∀ l

This is an optimization problem with n constraints, not trivial but solvable. The solution is the support-vector machine (points on the margin are “support vectors”). What if a hyperplane is not good enough?
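The quantity being maximized can be computed directly: for a given hyperplane w·x + b = 0, the margin is the distance from the closest point to the plane, |w·x + b| / ||w||. Data and hyperplane below are illustrative, not the SVM solution itself.

```python
import math

def margin(w, b, points):
    """Distance from the closest point to the hyperplane w.x + b = 0."""
    norm = math.sqrt(sum(wi * wi for wi in w))
    return min(abs(sum(wi * xi for wi, xi in zip(w, p)) + b)
               for p in points) / norm

pts = [(0, 0), (1, 0), (3, 3), (4, 2)]
# Candidate hyperplane x1 + x2 = 3.5, placed between the two groups.
print(margin((1, 1), -3.5, pts))  # distance of the closest point to the plane
```

The SVM solver searches over (w, b) for the plane that makes this number as large as possible subject to the classification constraints.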

Page 23

Kernels

The trick is to map the problem to a higher-dimensional space:
• a non-linear boundary in the original space
• becomes a hyperplane in the transformed space

This can be done efficiently by the introduction of a kernel function; the classification problem is mapped into a reproducing kernel Hilbert space. Kernels are at the core of the success of SVM classification. Most classical techniques (e.g. PCA, LDA, ICA, etc.) can be kernelized, with significant improvement.

[Figure: kernel-based feature transformation — a pattern of x’s and o’s that is not linearly separable in the original space (x1, x2) becomes separable by a hyperplane in the transformed space (x1, x2, x3, ..., xn)]
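The efficiency claim can be verified on a small case: the polynomial kernel k(x, z) = (x·z)² in 2-D equals an ordinary inner product in the 3-D feature space φ(x) = (x1², √2·x1·x2, x2²), so an algorithm that only needs inner products never has to form φ explicitly.

```python
import math

def kernel(x, z):
    """Polynomial kernel of degree 2 in the original 2-D space."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

def phi(x):
    """Explicit feature map whose inner product the kernel reproduces."""
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

x, z = (1.0, 2.0), (3.0, -1.0)
print(kernel(x, z), dot(phi(x), phi(z)))  # the two values agree
```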

Page 24

Unsupervised learning

So far, we have talked about supervised learning:
• I know the class of each point

In many problems this is not the case (e.g. image segmentation).

Page 25

Unsupervised learning

In these problems we are given X but not Y. The standard algorithms for this are iterative:
• start from a best guess
• given the Y estimates, fit class models
• given the class models, re-estimate the Ys

The procedure usually converges to an optimal solution, although not necessarily the global optimum. Performance is worse than that of a supervised classifier, but this is the best we can do anyway...
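The iterative procedure above, in its simplest form, is k-means: guess class models (here, two 1-D means), assign each point to the closest model, refit the means, and repeat. The data is synthetic, made up for illustration.

```python
import random

def kmeans_1d(xs, iters=20):
    """Two-cluster 1-D k-means: alternate label and model estimation."""
    c = [min(xs), max(xs)]  # initial guess for the two class means
    for _ in range(iters):
        groups = ([], [])
        for x in xs:                      # given the models, re-estimate Y
            groups[abs(x - c[1]) < abs(x - c[0])].append(x)
        for k in range(2):                # given the Y estimates, refit models
            if groups[k]:
                c[k] = sum(groups[k]) / len(groups[k])
    return sorted(c)

random.seed(1)
xs = ([random.gauss(0, 1) for _ in range(200)]
      + [random.gauss(10, 1) for _ in range(200)])
print(kmeans_1d(xs))  # recovered centers near the true 0 and 10
```

Note the caveat from the slide: the alternation converges, but only to a local optimum that depends on the initial guess.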

Page 26

Reasons to take the course

Statistical learning:
• tremendous amount of theory
• but things invariably go wrong
• too little data, noise, too many dimensions, training sets that do not reflect all possible variability, etc.

Good learning solutions require:
• knowledge of the domain (e.g. “these are the features to use”)
• knowledge of the available techniques, their limitations, etc.

In the absence of either of these you will fail! We will cover the basics, but will also talk about quite advanced concepts; the course is an easier scenario in which to understand them.
