

Quantum Machine Learning - Nearest Neighbour Algorithms

Amrit Singhal

Indian Institute of Technology Kanpur

[email protected]

November 14, 2017


Overview

1 Introduction: Defining the problem; Various Metrics

2 Some useful submodules: Amplitude Estimation; Durr-Hoyer Minimum Finding; SWAP Test; Quantum a + 1

3 Inner Product Method: Notations; Algorithm; Coherent Amplitude Estimation

4 Hamming Distance Method: Prerequisites; Algorithm

5 Performance


Based on

This presentation is primarily based on the following two papers:

1 Quantum Algorithms for Nearest-Neighbor Methods for Supervised and Unsupervised Learning, by N. Wiebe, A. Kapoor, and K. M. Svore.

2 Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance, by Y. Ruan, X. Xue, H. Liu, J. Tan, and X. Li.


What is the Nearest Neighbour Algorithm?

Aims to classify a new piece of data (known as the test vector) by comparing it to a set of training data that has already been classified.

The test sample is then assigned to the class of the training example that has the most similar features.

The features of the data used for comparison are usually expressed as real-valued vectors.

Very accurate given a large amount of training data.


K-Nearest Neighbours?

Given a testing sample, find its k nearest neighbours based on some distance metric, and then determine its category according to the information of these neighbours.

Generally, the algorithm uses "majority voting", i.e. the testing sample is labelled with the leading category tag of its k nearest neighbours.
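For reference, a minimal classical k-NN classifier with majority voting (a NumPy sketch, purely illustrative; the quantum methods below replace the distance computation and the search for the nearest points):

```python
import numpy as np
from collections import Counter

def knn_classify(x, train_X, train_y, k=3):
    """Classical k-NN with majority voting over Euclidean distances."""
    dists = np.linalg.norm(train_X - x, axis=1)    # distance to every training vector
    nearest = np.argsort(dists)[:k]                # indices of the k closest points
    votes = Counter(train_y[i] for i in nearest)   # count the class labels among them
    return votes.most_common(1)[0][0]              # leading category tag

train_X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([0, 0, 1, 1])
print(knn_classify(np.array([0.95, 0.9]), train_X, train_y, k=3))   # -> 1
```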


Various Metrics

As we have seen, the nearest neighbour algorithm relies on the use of a distance metric between the data vectors. Depending on which metric we choose, we can get slightly different results, with varying accuracies. Some of the popular metrics are:

1 Inner Product

2 Hamming Distance

3 Euclidean Distance
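A quick classical illustration of these three metrics (NumPy; the Hamming distance here assumes the vectors have already been binarised):

```python
import numpy as np

u = np.array([1.0, 0.0, 1.0, 1.0])
v = np.array([1.0, 1.0, 0.0, 1.0])

inner = float(np.dot(u, v))                # inner product: larger means more similar
hamming = int(np.sum(u != v))              # Hamming distance on binary vectors
euclidean = float(np.linalg.norm(u - v))   # Euclidean distance

print(inner, hamming, euclidean)           # 2.0 2 1.414...
```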


Some useful submodules

There are a few quantum algorithms that are utilised in the final kNN algorithms that will be described. Let us have a look at them first.


Amplitude Estimation

Given a state $|\Psi\rangle = \mathcal{A}|0\rangle$. If $|\Psi\rangle = |\Psi_1\rangle + |\Psi_2\rangle$, written as a combination of the good and bad components of $|\Psi\rangle$ respectively, then amplitude estimation is the problem of estimating $a = \langle\Psi_1|\Psi_1\rangle$, the probability that a measurement yields a good state.


Amplitude Estimation

Theorem (Amplitude Estimation)

For any positive integers $k$ and $L$, the amplitude estimation algorithm outputs $\tilde{a}$ $(0 \le \tilde{a} \le 1)$ such that

$$|\tilde{a} - a| \le \frac{2\pi k\sqrt{a(1-a)}}{L} + \left(\frac{\pi k}{L}\right)^2$$

with probability $\ge 8/\pi^2$ when $k = 1$ and with probability $\ge 1 - 1/(2(k-1))$ for $k \ge 2$. It uses exactly $L$ iterations of Grover's algorithm.
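As a sanity check on the theorem, a short script that just evaluates the error bound (the values of $a$, $k$ and $L$ are illustrative assumptions):

```python
import math

def ae_error_bound(a, k, L):
    """Upper bound on |a_estimate - a| from the amplitude estimation theorem."""
    return 2 * math.pi * k * math.sqrt(a * (1 - a)) / L + (math.pi * k / L) ** 2

# e.g. a = 0.3, k = 1, L = 100 Grover iterations
print(ae_error_bound(0.3, 1, 100))   # about 0.03, i.e. the error shrinks as O(1/L)
```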


Durr-Hoyer Minimum Finding

Theorem

The expected number of Grover iterations needed to learn $\min\{y_i : i = 1, 2, \dots, M\}$ is bounded by

$$\frac{45}{2}\sqrt{M}$$
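The quantum routine itself is not reproduced here; the classical sketch below only mimics the threshold structure of Durr-Hoyer minimum finding, with uniform random sampling standing in for the Grover search over indices whose value lies below the current threshold, and prints the $\frac{45}{2}\sqrt{M}$ bound for comparison:

```python
import math, random

def durr_hoyer_classical_sketch(y):
    """Classical mock-up of the Durr-Hoyer loop: repeatedly jump to a random index
    whose value is below the current threshold until no such index remains."""
    threshold = random.randrange(len(y))
    while True:
        better = [i for i in range(len(y)) if y[i] < y[threshold]]  # Grover would search this set
        if not better:
            return threshold
        threshold = random.choice(better)

y = [7, 3, 9, 1, 4, 8, 2, 6]
print(y[durr_hoyer_classical_sketch(y)])   # 1, the minimum
print(45 / 2 * math.sqrt(len(y)))          # bound on the expected number of Grover iterations
```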


SWAP Gate


SWAP test

1 Start with the state $|0\rangle|\psi\rangle$, where $|\psi\rangle = |x\rangle|y\rangle$.

2 Apply a Hadamard on the first qubit to obtain $\frac{1}{\sqrt{2}}(|0\rangle + |1\rangle)|\psi\rangle$.

3 Apply the controlled-SWAP gate to obtain $\frac{1}{\sqrt{2}}\left(|0\rangle|\psi\rangle + |1\rangle F(|\psi\rangle)\right)$, where $F$ swaps the two registers.

4 Apply a Hadamard on the first qubit again to obtain $\left(\frac{I+F}{2}\right)|0\rangle|\psi\rangle + \left(\frac{I-F}{2}\right)|1\rangle|\psi\rangle = P_0|0\rangle|\psi\rangle + P_1|1\rangle|\psi\rangle$.

$$\Pr(0) = \langle\psi|P_0|\psi\rangle = \frac{1}{2}\langle xy|xy + yx\rangle = \frac{1}{2}\left(1 + |\langle x|y\rangle|^2\right)$$

So,

$$|\langle x|y\rangle|^2 = 2P(0) - 1$$
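A small NumPy statevector check of this relation, with the ancilla as qubit 0 and $|x\rangle$, $|y\rangle$ as single-qubit states (illustrative only):

```python
import numpy as np

def swap_test_p0(x, y):
    """Return P(ancilla = 0) of the SWAP test for single-qubit states x and y."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I4 = np.eye(4)
    SWAP = np.eye(4)[[0, 2, 1, 3]]                         # swaps the two data qubits
    CSWAP = np.block([[I4, np.zeros((4, 4))],
                      [np.zeros((4, 4)), SWAP]])           # controlled on the ancilla
    psi = np.kron(np.array([1.0, 0.0]), np.kron(x, y))     # |0>|x>|y>
    psi = np.kron(H, I4) @ psi                             # Hadamard on the ancilla
    psi = CSWAP @ psi                                      # controlled SWAP
    psi = np.kron(H, I4) @ psi                             # Hadamard on the ancilla again
    return float(np.sum(np.abs(psi[:4]) ** 2))             # probability that the ancilla reads 0

x = np.array([np.cos(0.3), np.sin(0.3)])
y = np.array([np.cos(1.1), np.sin(1.1)])
p0 = swap_test_p0(x, y)
print(2 * p0 - 1, abs(np.dot(x, y)) ** 2)                  # the two values agree
```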


Quantum a + 1 circuit


Notations and Assumptions

The test point is $v_0 := u$.

Training examples consist of $v_j$, for $j \in \{1, 2, \dots, M\}$; $v_{ji} = r_{ji}e^{i\phi_{ji}}$ is the $i$-th element of the $j$-th vector.

The input vectors are $d$-sparse.

$f(j, l)$ is the location of the $l$-th non-zero entry in $v_j$.

Quantum oracles are provided in the form:

$$\mathcal{O}\,|j\rangle|i\rangle|0\rangle = |j\rangle|i\rangle|v_{ji}\rangle$$

$$\mathcal{F}\,|j\rangle|l\rangle = |j\rangle|f(j, l)\rangle$$
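Classically, the two oracles are just lookups on a sparse representation of the vectors; a minimal Python sketch of the functions behind $\mathcal{O}$ and $\mathcal{F}$ (the dictionary layout and names are illustrative assumptions):

```python
# each vector v_j is stored sparsely as {index: complex value}
vectors = {
    0: {2: 0.5 + 0.5j, 7: 1.0},    # v_0 = u, the test point
    1: {1: 0.3, 7: -0.8j},         # v_1, a training vector
}

def oracle_O(j, i):
    """O |j>|i>|0> -> |j>|i>|v_ji>: return the entry v_ji (0 if absent)."""
    return vectors[j].get(i, 0)

def oracle_F(j, l):
    """F |j>|l> -> |j>|f(j, l)>: return the location of the l-th non-zero entry of v_j."""
    return sorted(vectors[j])[l]

print(oracle_F(1, 0), oracle_O(1, oracle_F(1, 0)))   # 1 0.3
```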


Inner Product Nearest Neighbour Algorithm

Lemma (1)

The state

$$\frac{1}{\sqrt{d}}\,|j\rangle\sum_{i=1}^{d}|f(j,i)\rangle\left(\sqrt{1 - \frac{r_{jf(j,i)}^2}{r_{j\max}^2}}\;e^{-i\phi_{jf(j,i)}}|0\rangle + \frac{r_{jf(j,i)}}{r_{j\max}}\,e^{i\phi_{jf(j,i)}}|1\rangle\right)$$

can be implemented using 3 oracle calls to $\mathcal{O}$ and $\mathcal{F}$.


Inner Product Nearest Neighbour Algorithm

Proof:

1 Start with the state $|j\rangle|0\rangle|0\rangle|0\rangle$.

2 Apply Hadamards ($H^{\otimes\lceil\log d\rceil}$) on the second register to obtain $\frac{1}{\sqrt{d}}\sum_{i=1}^{d}|j\rangle|i\rangle|0\rangle|0\rangle$.

3 Apply the oracle $\mathcal{F}$ to obtain $\frac{1}{\sqrt{d}}\sum_{i=1}^{d}|j\rangle|f(j,i)\rangle|0\rangle|0\rangle$.

4 Query $\mathcal{O}$ to obtain $\frac{1}{\sqrt{d}}\sum_{i=1}^{d}|j\rangle|f(j,i)\rangle\left|v_{jf(j,i)}\right\rangle|0\rangle$.

5 Apply $R_y\!\left(2\sin^{-1}\!\left(r_{jf(j,i)}/r_{j\max}\right)\right)$ on the last qubit to obtain

$$\frac{1}{\sqrt{d}}\sum_{i=1}^{d}|j\rangle|f(j,i)\rangle\left|v_{jf(j,i)}\right\rangle\left(\sqrt{1 - \frac{r_{jf(j,i)}^2}{r_{j\max}^2}}\,|0\rangle + \frac{r_{jf(j,i)}}{r_{j\max}}\,|1\rangle\right)$$

6 Apply $R_z(2\phi_{jf(j,i)})$ to the last qubit, followed by $\mathcal{O}^{\dagger}$ to clean the ancilla register, to obtain the required state.


Inner Product Nearest Neighbour Algorithm

Now, to get the inner product between two vectors $v_j$ and $v_0$, prepare the states

$$|\psi\rangle = \frac{1}{\sqrt{d}}\sum_{i=1}^{d}|f(j,i)\rangle\left(\sqrt{1 - \frac{r_{jf(j,i)}^2}{r_{j\max}^2}}\,e^{-i\phi_{jf(j,i)}}|0\rangle + \frac{r_{jf(j,i)}}{r_{j\max}}\,e^{i\phi_{jf(j,i)}}|1\rangle\right)|1\rangle$$

$$|\phi\rangle = \frac{1}{\sqrt{d}}\sum_{i=1}^{d}|f(0,i)\rangle\,|1\rangle\left(\sqrt{1 - \frac{r_{0f(0,i)}^2}{r_{0\max}^2}}\,e^{-i\phi_{0f(0,i)}}|0\rangle + \frac{r_{0f(0,i)}}{r_{0\max}}\,e^{i\phi_{0f(0,i)}}|1\rangle\right)$$

Then,

$$\langle\phi|\psi\rangle = \frac{1}{d}\sum_{i}\frac{v_{ji}\,v_{0i}^{*}}{r_{j\max}\,r_{0\max}} = \frac{\langle v_0|v_j\rangle}{d\,r_{j\max}\,r_{0\max}}$$

Now, perform the SWAP test on $|\psi\rangle$ and $|\phi\rangle$, where the probability of getting outcome 0 is $P(0)$.


Inner Product Nearest Neighbour Algorithm

Then the inner product satisfies

$$(\langle\phi|\psi\rangle)^2 = 2P(0) - 1 \;\implies\; (\langle v_0|v_j\rangle)^2 = (2P(0) - 1)\,d^2 r_{\max}^4$$

So, estimating $P(0)$ provides us with a measure of the inner product.

We could obtain the required value through statistical sampling, but that would require $O(M/\epsilon^2)$ queries.

Instead, we use amplitude estimation to do it with $O(1/\epsilon)$ scaling.


Inner Product Nearest Neighbour Algorithm

Also, to reduce the scaling with $M$ by a quadratic factor, we use the Durr-Hoyer minimum finding algorithm, which combines Grover's algorithm with exponential searching to find the smallest (or largest) element. But normal amplitude estimation ends in a measurement and is therefore irreversible. In order to apply this quadratic reduction, we need to make AE reversible. This is called coherent amplitude estimation.


Coherent Amplitude Estimation

Normal AE outputs a state $\sqrt{a}\,|y\rangle + \sqrt{1 - |a|}\,|y^{\perp}\rangle$, where $|y\rangle$ is a bit-string that encodes $P(0)$.

Consider AE as a black box. Take $\mathcal{A}$ to be a unitary that maps $|0^{\otimes n}\rangle \to \sqrt{a}\,|y\rangle + \sqrt{1 - |a|}\,|y^{\perp}\rangle$, for $1/2 \le |a_0| \le |a| \le 1$. Then, for any $\Delta > 0$, there exists an integer $k$ such that the following algorithm produces a state $|\Psi\rangle$ with

$$\left\lVert\, |\Psi\rangle - |0^{\otimes nk}\rangle|y\rangle \,\right\rVert_2 \le \sqrt{2\Delta}$$


Coherent Amplitude Estimation

Prepare $k$ copies of the state $\sqrt{a}\,|y\rangle + \sqrt{1-|a|}\,|y^{\perp}\rangle$, so we start with the state $\left(\sqrt{a}\,|y\rangle + \sqrt{1-|a|}\,|y^{\perp}\rangle\right)^{\otimes k}$.

Partition this state into two parts: $|\Psi\rangle$, the sum of states with median $y$, and $|\Phi\rangle$, the sum of states with median not equal to $y$:

$$\left(\sqrt{a}\,|y\rangle + \sqrt{1-|a|}\,|y^{\perp}\rangle\right)^{\otimes k} = A\,|\Psi\rangle + \sqrt{1-|A|^2}\,|\Phi\rangle$$

Compute the median through the unitary

$$\mathcal{M}: |y_1\rangle|y_2\rangle\cdots|y_k\rangle|0\rangle \to |y_1\rangle|y_2\rangle\cdots|y_k\rangle|\bar{y}\rangle$$

where $\bar{y}$ is the median of $y_1, \dots, y_k$. This transformation can be performed by implementing a sorting algorithm using $O(kn\log k)$ operations.


Coherent Amplitude Estimation

$$\mathcal{M}\left(\sqrt{a}\,|y\rangle + \sqrt{1-|a|}\,|y^{\perp}\rangle\right)^{\otimes k} = A\,|\Psi\rangle|y\rangle + \sqrt{1-|A|^2}\,|\Phi\rangle|y^{\perp}\rangle$$

Apply $\mathcal{A}^{\dagger\otimes k}$ to the first register:

$$\mathcal{A}^{\dagger\otimes k}\left(A\,|\Psi\rangle|y\rangle + \sqrt{1-|A|^2}\,|\Phi\rangle|y^{\perp}\rangle\right) = \mathcal{A}^{\dagger\otimes k}\left(A\,|\Psi\rangle + \sqrt{1-|A|^2}\,|\Phi\rangle\right)|y\rangle + \mathcal{A}^{\dagger\otimes k}\sqrt{1-|A|^2}\,|\Phi\rangle\left(|y^{\perp}\rangle - |y\rangle\right)$$

$$= |0^{\otimes nk}\rangle|y\rangle + \mathcal{A}^{\dagger\otimes k}\sqrt{1-|A|^2}\,|\Phi\rangle\left(|y^{\perp}\rangle - |y\rangle\right)$$

$$\implies \left\lVert \mathcal{A}^{\dagger\otimes k}\left(A\,|\Psi\rangle|y\rangle + \sqrt{1-|A|^2}\,|\Phi\rangle|y^{\perp}\rangle\right) - |0^{\otimes nk}\rangle|y\rangle \right\rVert \le \sqrt{2(1-|A|^2)}$$

So, we need $P(y^{\perp}) = 1 - |A|^2 \le \Delta$.


Coherent Amplitude Estimation

Now, in any sequence of measurements that contains more than $k/2$ $y$-outcomes, the median must be $y$.

$$P(y^{\perp}) \le P(\text{no more than } k/2\ y\text{-outcomes}) = \sum_{p=0}^{\lfloor k/2\rfloor}\binom{k}{p}\,|a|^{p}\,\big|1 - |a|\big|^{k-p}$$

Hoeffding's Inequality

For a Bernoulli distribution: $P(H(n) \le (p - \epsilon)n) \le \exp(-2\epsilon^2 n)$

Also, we know that $|a| > |a_0| > 1/2$. Thus,

$$P(y^{\perp}) \le \exp\!\left(-2k\left(|a_0| - \tfrac{1}{2}\right)^{2}\right) \le \Delta \;\implies\; k \ge \frac{\ln(1/\Delta)}{2\left(|a_0| - \tfrac{1}{2}\right)^{2}}$$
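A quick numeric check of this requirement, comparing the Hoeffding bound against the target failure probability (the values of $|a_0|$ and $\Delta$ are illustrative assumptions):

```python
import math

def copies_needed(a0, delta):
    """Smallest k with exp(-2k(|a0| - 1/2)^2) <= delta."""
    return math.ceil(math.log(1 / delta) / (2 * (a0 - 0.5) ** 2))

a0, delta = 0.81, 1e-3
k = copies_needed(a0, delta)
print(k, math.exp(-2 * k * (a0 - 0.5) ** 2) <= delta)   # 36 True
```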


From feature vector to bit-string

Well-defined hash functions are used to convert feature vectors into bit vectors or bit strings.

"Simple KNN classifiers in Hamming space are competitive with sophisticated discriminative classifiers, including SVMs and neural networks."
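One common choice of such a hash (an assumption here, not specified in the slides) is random-hyperplane hashing, which maps a real-valued feature vector to a bit string whose Hamming distances roughly track angular similarity:

```python
import numpy as np

rng = np.random.default_rng(0)

def hyperplane_hash(v, planes):
    """One bit per hyperplane: the sign of the projection of v onto it."""
    return (planes @ v > 0).astype(int)

planes = rng.standard_normal((8, 4))       # 8 hyperplanes -> 8-bit codes for 4-d features
u = np.array([1.0, 0.2, -0.5, 0.3])
v = np.array([0.9, 0.1, -0.4, 0.35])       # similar to u
w = np.array([-1.0, 0.5, 0.8, -0.2])       # dissimilar to u

hu, hv, hw = (hyperplane_hash(z, planes) for z in (u, v, w))
print(int(np.sum(hu != hv)), int(np.sum(hu != hw)))   # Hamming distance to v vs to w
```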


Setup

Given: feature vectors $|v^p\rangle$, $p \in \{1, \dots, N\}$, and corresponding classes $c^p \in \{1, \dots, l\}$.

Construct the training superposition

$$|T\rangle = \frac{1}{\sqrt{N}}\sum_{p=1}^{N}\left|v_1^p, \dots, v_n^p, c^p\right\rangle$$

You have a testing sample $|x_1, \dots, x_n\rangle$.


Algorithm

Step 1: Prepare the state

$$|\phi_1\rangle = \frac{1}{\sqrt{N}}\sum_{p=1}^{N}\left|x_1, \dots, x_n;\, v_1^p, \dots, v_n^p,\, c^p;\, 0\right\rangle$$

Step 2: Record the difference between the training and test vectors, and store the result, reversed, in the first register.

$$|\phi_2\rangle = \prod_{k} X(x_k)\,\mathrm{CNOT}(x_k, v_k^p)\,|\phi_1\rangle = \frac{1}{\sqrt{N}}\sum_{p=1}^{N}\left|d_1, \dots, d_n;\, v_1^p, \dots, v_n^p,\, c^p;\, 0\right\rangle$$

(CNOT($a$, $b$) overwrites $a$ with 0 if $a = b$ and with 1 otherwise; the subsequent $X$ flips this, so $d_k = 1$ exactly when $x_k = v_k^p$.)
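Classically, step 2 is a bitwise comparison; a sketch of what the first register holds afterwards for a single training vector (illustrative):

```python
def reversed_difference(x, v):
    """d_k = 1 if x_k == v_k else 0, i.e. X(CNOT(x_k, v_k)) applied bitwise."""
    return [1 if xk == vk else 0 for xk, vk in zip(x, v)]

x = [1, 0, 1, 1, 0]
v = [1, 1, 1, 0, 0]
d = reversed_difference(x, v)
print(d, len(x) - sum(d))   # [1, 0, 1, 0, 1] and Hamming distance 2
```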


Algorithm

Let the Hamming distance threshold value be $t$.

Now, the Hamming distance between $|x\rangle$ and $|v^p\rangle$ is $n - \sum_{i=1}^{n} d_i^p$, so the condition to test is

$$n - \sum_{i=1}^{n} d_i^p \le t$$

Suppose $2^{k-1} \le n \le 2^k$, and define $m = 2^k - n$. Then the condition becomes

$$\sum_{i=1}^{n} d_i^p + m + t \ge 2^k$$

So, set the initial value $a = m + t$. Then the condition $\mathrm{HD} \le t$ can be determined by whether the addition $\sum_{i=1}^{n} d_i^p + a$ overflows or not.
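A quick check of this equivalence; the quantum circuit performs the addition with the a + 1 module, while here it is plain integer arithmetic on illustrative values:

```python
n, t = 5, 2
k = (n - 1).bit_length()          # smallest k with 2^(k-1) <= n <= 2^k, here k = 3
m = 2 ** k - n                    # m = 3
a = m + t                         # initial register value

for matches in range(n + 1):      # matches plays the role of sum_i d_i
    hamming = n - matches
    overflow = matches + a >= 2 ** k
    assert overflow == (hamming <= t)
print("overflow agrees with HD <= t in every case")
```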




Algorithm

Step 3: Define

$$\Omega = \left\{\, p \;\middle|\; \text{the Hamming distance between } |x\rangle \text{ and } |v^p\rangle \text{ is} \le t \,\right\}$$

Then, let $U$ be the unitary such that

$$|\phi_3\rangle = U|\phi_2\rangle = \frac{1}{\sqrt{N}}\left(\sum_{p\in\Omega}\left|d_1, \dots, d_n;\, v_1^p, \dots, v_n^p,\, c^p;\, 1\right\rangle + \sum_{p\notin\Omega}\left|d_1, \dots, d_n;\, v_1^p, \dots, v_n^p,\, c^p;\, 0\right\rangle\right)$$


Algorithm

Step 4: Define $\Gamma = I \otimes |1\rangle\langle 1|$. Obtain:

$$|\phi_4\rangle = \Gamma|\phi_3\rangle = \alpha\sum_{p\in\Omega}\left|d_1, \dots, d_n;\, v_1^p, \dots, v_n^p,\, c^p;\, 1\right\rangle$$

such that $\sum_{i=1}^{|\Omega|}|\alpha|^2 = 1$, i.e. $\alpha$ is the renormalised amplitude of each component of $|\phi_4\rangle$.

$|\phi_4\rangle$ is composed of the $|v^p\rangle$ whose distances to the testing sample are no more than $t$. Measure $c^p$ alone to get the category of the test sample $|x\rangle$.
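Measuring $c^p$ on $|\phi_4\rangle$ samples a class from the surviving set $\Omega$; a classical sketch of the end-to-end decision on binarised data, with a majority vote over $\Omega$ standing in for repeated measurements (illustrative):

```python
from collections import Counter

def hamming_knn_classify(x, training, t):
    """Keep training vectors within Hamming distance t of x, then majority-vote their classes."""
    n = len(x)
    omega = [c for v, c in training
             if n - sum(1 for xi, vi in zip(x, v) if xi == vi) <= t]
    if not omega:
        return None                                   # no neighbour within the threshold
    return Counter(omega).most_common(1)[0][0]

training = [([1, 0, 1, 1], "A"), ([1, 0, 0, 1], "A"), ([0, 1, 0, 0], "B")]
print(hamming_knn_classify([1, 0, 1, 0], training, t=2))   # A
```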


Performance

The Inner Product Method runs in $O(\sqrt{N}\log N)$ time, and requires the feature vectors to be sparse.

The Hamming Distance Method works over the training-set superposition, and so its cost is independent of the number of training points.

It takes $O(n^3)$ time to execute.

In Big Data scenarios, generally $N \gg n$, and so the Hamming Distance Method outperforms the Inner Product Method in speed.




The End

Thank You!