

K-Nearest Neighbor Learning

Dipanjan Chakraborty


Different Learning Methods

  • Eager Learning
    Builds an explicit description of the target function over the whole training set
  • Instance-based Learning
    Learning = storing all training instances
    Classification = assigning the target function to a new instance
    Referred to as "lazy" learning (a minimal sketch follows)
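To make the contrast concrete, here is a minimal Python sketch of a lazy learner; it is not from the slides, and the class name and the 1-NN prediction rule are illustrative assumptions. Fitting only stores the data, and all work is deferred to query time.

```python
# Illustrative sketch of instance-based ("lazy") learning.
# fit() does no generalization: it only stores the training set.
# predict() defers all work to query time, here via 1-nearest neighbor.
class LazyLearner:
    def fit(self, X, y):
        self.X, self.y = list(X), list(y)   # learning = storing instances
        return self

    def predict(self, x):
        def dist(u, v):
            return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
        # classification = comparing the new instance with the stored ones
        i = min(range(len(self.X)), key=lambda i: dist(self.X[i], x))
        return self.y[i]
```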


Different Learning Methods

  Eager Learning
    Any random movement => "It's a mouse"
    "I saw a mouse!"


Instance-based Learning

  "It's very similar to a Desktop!!"


Instance-based Learning

  • K-Nearest Neighbor Algorithm
  • Weighted Regression
  • Case-based Reasoning


K-Nearest Neighbor: Features

  • All instances correspond to points in an n-dimensional Euclidean space
  • Classification is delayed until a new instance arrives
  • Classification is done by comparing the feature vectors of the different points
  • The target function may be discrete- or real-valued


1-Nearest Neighbor


3-Nearest Neighbor


K-Nearest Neighbor

  • An arbitrary instance x is represented by the feature vector
    (a_1(x), a_2(x), ..., a_n(x)), where a_i(x) denotes the i-th feature
  • Euclidean distance between two instances:
    d(x_i, x_j) = \sqrt{\sum_{r=1}^{n} (a_r(x_i) - a_r(x_j))^2}
  • For a continuous-valued target function, the prediction is the mean
    value of the k nearest training examples (see the sketch below)
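A minimal Python sketch under these definitions (the function names and the k=3 default are my own assumptions, not from the slides): a discrete target is predicted by majority vote among the k nearest neighbors, a real-valued one by their mean.

```python
import math
from collections import Counter

def euclidean(xi, xj):
    # d(x_i, x_j) = sqrt(sum over r of (a_r(x_i) - a_r(x_j))^2)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_predict(train, query, k=3, discrete=True):
    # train is a list of (feature_vector, target) pairs
    neighbors = sorted(train, key=lambda p: euclidean(p[0], query))[:k]
    targets = [t for _, t in neighbors]
    if discrete:
        return Counter(targets).most_common(1)[0][0]    # majority vote
    return sum(targets) / len(targets)                  # mean of k nearest

# e.g. knn_predict([((1, 1), "A"), ((1, 2), "A"), ((5, 5), "B")], (1, 1.5))
# -> "A"
```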


Voronoi Diagram

  • The decision surface formed by the training examples: each cell consists of the query points closer to one stored example than to any other, i.e. the 1-NN decision regions (see the sketch below)
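As a hedged illustration (the training points and the grid are invented for the example), labeling each point of a small grid with its single nearest training example traces out these Voronoi cells:

```python
import math

train = [((1, 1), "A"), ((4, 1), "B"), ((2.5, 4), "C")]

def nearest_label(q):
    # 1-NN rule: the label of the closest stored example
    return min(train, key=lambda p: math.dist(p[0], q))[1]

# Print a coarse 6x6 map of the plane; the blocks of equal letters
# are the Voronoi cells induced by the three training points.
for y in range(5, -1, -1):
    print("".join(nearest_label((x, y)) for x in range(6)))
```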


Distance-Weighted Nearest Neighbor Algorithm

  • Assign weights to the neighbors based on their distance from the query point
  • The weight may be the inverse square of the distance
  • All training points may be allowed to influence a particular instance (Shepard's method; see the sketch below)
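A hedged sketch of the distance-weighted variant (the names are illustrative): each neighbor votes with weight 1/d², and passing k=None lets every training point contribute, the global Shepard-style form.

```python
import math
from collections import defaultdict

def weighted_knn(train, query, k=None):
    # k=None lets all training points influence the query (Shepard-style);
    # an integer k restricts the vote to the k nearest neighbors.
    ranked = sorted(train, key=lambda p: math.dist(p[0], query))
    if k is not None:
        ranked = ranked[:k]
    votes = defaultdict(float)
    for x, label in ranked:
        d = math.dist(x, query)
        if d == 0:
            return label             # exact match: return its label outright
        votes[label] += 1.0 / d**2   # inverse-square distance weight
    return max(votes, key=votes.get)
```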


Remarks

  + A highly effective inductive inference method for noisy training data and complex target functions
  + The target function for the whole space may be described as a combination of less complex local approximations
  + Learning is very simple
  - Classification is time-consuming


Remarks

  - Curse of Dimensionality



Remarks

  • Efficient memory indexing (e.g. a kd-tree) to speed up retrieval of the stored training examples (see the sketch below)
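A hedged sketch of kd-tree indexing using SciPy's scipy.spatial.cKDTree (the library choice and the random data are my assumptions; the slides only name the kd-tree):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
X = rng.random((1000, 3))            # 1000 stored training instances in 3-D

tree = cKDTree(X)                    # build the index over the stored examples

query = np.array([0.5, 0.5, 0.5])
dists, idx = tree.query(query, k=3)  # retrieve the 3 nearest neighbors
print(idx, dists)                    # indices into X and their distances
```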