Machine Learning Applied to Computer Vision
Adín Ramírez, [email protected]
2nd Semester, 2016. Adapted from J. Hays, I. Guyon, E. Sudderth, M. Johnson, and D. Hoiem.
Learning Goals
Overview
Principal idea: make predictions or decisions from data
We won't get much into the advanced details
We will see the use of machine learning methods as general tools
AI 1
What is Computer Vision?
Computer Vision and Related Fields
What is Computer Vision?
Brief History of Computer Vision
1960: interpretation of synthetic worlds
1966: Minsky assigns a student the homework of plugging a camera into the computer and making it interpret what it sees
1970: progress interpreting selected images
1980: ANNs come and go; there is a tendency towards geometry and math
1990: face recognition and statistical analysis
2000: broader recognition, several databases appear, and we start processing videos
2010: deep learning
2030: robot revolution?
https://what-if.xkcd.com/5/
Machine Learning
Machine Learning Impact
It is the major export from computing to other fields of science. Some fields that use it:
- High energy physics
- Market analysis
- Systems diagnostics
- Bio-informatics
- Text classification
- Machine vision
- ...
Machine Learning
Image Classification
[Pipeline diagram. Training: training images + training labels → image features → classifier training → trained classifier. Testing: test image → image features → trained classifier → prediction: "outside", followed by evaluation.]
Machine Learning
Examples
Scene classification
Is this a kitchen?
Image Features
Image Features
[Training pipeline diagram: training images + training labels → image features → classifier training → trained classifier.]
Image Features
General Principles of Representation
Coverage
- Make sure that all the relevant information is covered and captured
Concise
- Minimize the number of features without sacrificing coverage
Direct
- The ideal features are directly usable in prediction
Image Features
Image Representation
Templates
- Intensity
- Gradients
- etc.
Histograms
- Color
- Texture
- SIFT
- Descriptors
- etc.
Classifiers
Classifiers
[Training pipeline diagram, repeated: training images + training labels → image features → classifier training → trained classifier.]
Classifiers
Learn a Classifier
Given a set of features with their corresponding labels, learn a function f that predicts the labels from the features
y = f(x)
Classifiers
Many Classifiers
Support Vector Machines (SVM)
Neural Networks
Naïve Bayes
Bayesian Networks
Logistic Regression
Random Forests
Boosted Decision Trees
k-Nearest Neighbors
etc.
Which one is the best?
Classifiers
One way of thinking about them
Training labels identify which examples are equal or different (according to the classification problem)
Features and distance measures define a visual similarity
Classifiers try to learn the weights or parameters of the features and distance measures so that visual similarity predicts label similarity
The decision to use machine learning methods is more important than the particular method chosen
Problems
Machine Learning Problems
             Supervised Learning               Unsupervised Learning
Discrete     Classification or categorization  Clustering
Continuous   Regression                        Dimensionality reduction
Dimensionality Reduction
Machine Learning Problems
             Supervised Learning               Unsupervised Learning
Discrete     Classification or categorization  Clustering
Continuous   Regression                        Dimensionality reduction
Dimensionality Reduction
Reduction
PCA, ICA, LLE, Isomap
PCA (Principal Component Analysis) is the most common technique
It takes advantage of the correlation among the dimensions to produce a new representation in a lower-dimensional space
PCA should be used to reduce the dimensionality of data, not to discover patterns or to predict
Do not assign a semantic value to the new basis
Clustering
Machine Learning Problems
             Supervised Learning               Unsupervised Learning
Discrete     Classification or categorization  Clustering
Continuous   Regression                        Dimensionality reduction
Examples
Clustering Examples
Goal: decompose an image into its relevant similar parts
Examples
Segmentation for Efficiency
Superpixels
Felzenszwalb and Huttenlocher 2004; Shi and Malik 2001
Examples
Segmentation as Result
Examples
Types of Segmentation
Clustering
Objectives
The clustering objective is to group similar points and representations into the same token
Key challenges
- What makes two points, images, or patches similar?
- How do we compute a global grouping from the similar pairs?
Clustering
Why do we group?
Summarize the data
- Look at big volumes of data
- Compress or eliminate noise based on the data (patches)
- Represent a continuous (and probably large) vector with a cluster number
Count
- Histograms of texture, color, feature vectors (SIFT)
Segment
- Separate the images into several regions
Predict
- Images in the same cluster may share labels
Clustering
How do we generate the clusters?
k-means
- Iteratively reassign the points to the closest cluster according to the cluster's center of mass
Agglomerative Clustering
- Start with each point as its own cluster
- Iteratively merge the closest clusters
Mean-shift Clustering
- Estimate the modes of the pdf
Spectral Clustering
- Split the nodes of a graph based on the similarity weights of the edges
K-means
Clustering in a Nutshell
Goal: create clusters that minimize the variance of the given data (preserve information)

c^*, \delta^* = \arg\min_{c,\delta} \frac{1}{N} \sum_{j}^{N} \sum_{i}^{K} \delta_{ij} (c_i - x_j)^2

- c_i is the i-th cluster center
- \delta_{ij} indicates whether x_j is assigned to c_i
- x_j is the j-th datum
K-means
K-means
1. Pick k centers randomly
2. Assign each point to the closest center
3. Compute new centers
Iterate until convergence
K-means
K-means
1. Initialize the cluster centers c^0, set t = 0
2. Assign each point to the closest cluster

\delta^t = \arg\min_{\delta} \frac{1}{N} \sum_{j}^{N} \sum_{i}^{K} \delta_{ij} \left(c_i^{t-1} - x_j\right)^2

3. Update each cluster center as the mean of the points assigned to it

c_i^t = \frac{\sum_{j}^{N} \delta_{ij}^t x_j}{\sum_{j}^{N} \delta_{ij}^t}

4. Repeat 2-3 until there is no point reassignment
K-means
K-means convergence to a local minimum
K-means
Design decisions
Initialization
- Pick k random points as initial centers
- Or greedily select k points to minimize the residual
Distance measures
- Traditionally Euclidean (l2), but we can use others
Optimization
- Will converge to a local minimum
- We can do several runs
K-means
How to evaluate clusters?
Generative
- How good are the points reconstructed from the clusters?
Discriminative
- How well do the clusters correspond to the labels?
- Purity
- Note that unsupervised clustering does not try to be discriminative
K-means
How do we pick the number of clusters?
Use a validation set
Try different numbers of clusters and watch the performance
When constructing dictionaries, the more the merrier
K-means
K-means advantages and disadvantages
Advantages
- Finds cluster centers that minimize the conditional variance (good representation of the data)
- Simple and fast
- Easy to implement
Disadvantages
- Need to pick K
- Sensitive to outliers
- Prone to local minima
- Every cluster has the same parameters (e.g., the distance measure is not adaptive)
- Can be slow: each iteration is O(KNd) for N points of d dimensions
Use
- They are not used for pixel segmentation
Visual Dictionaries
Visual Dictionaries
Sample patches from a database (e.g., 128-dimensional vectors)
Generate clusters from the patches
The cluster centers are the dictionary
Assign a codeword (number) to each new patch according to the nearest cluster
Sivic et al. ICCV 2005, http://www.robots.ox.ac.uk/~vgg/publications/papers/sivic05b.pdf
Conclusions
Important points
Many classifiers exist; knowing which one we need is important
The decision to use a type of method is more important than the method itself
Consider the data, the examples, previous knowledge, etc.
Reduction problems: PCA (and friends)
Clustering problems: k-means (and friends)
Conclusions
Next class
             Supervised Learning               Unsupervised Learning
Discrete     Classification or categorization  Clustering
Continuous   Regression                        Dimensionality reduction
Homework
Homework
Install OpenCV
Get the camshift demo: https://github.com/opencv/opencv/blob/master/samples/cpp/camshiftdemo.cpp
Understand what the demo is doing
Write a one-page report
- Due in one week
Homework
Report
#include <iostream>
using namespace std;

int fib(int x) {
    if (x == 0)
        return 0;
    if (x == 1)
        return 1;
    return fib(x - 1) + fib(x - 2);
}

int main() {
    int n;
    cin >> n;
    cout << fib(n) << endl;
}
Bad example
The code reads an integer. Then calls a recursive function and computes the sum of two calls of the same function, summing them by reducing the input by one and two, respectively.
Good example
A Fibonacci number, f_n, is computed through the recursive expression

f_n = f_{n-1} + f_{n-2},

where the initial values of the sequence are f_0 = 0 and f_1 = 1. This recursive equation is implemented as such through the function fib.