
Linked CP Tensor Decomposition

and its Fast Implementation

for Multi-block Tensor Analysis

November 12-15, 2012

ICONIP2012

RIKEN BSI & Tokyo Institute of Technology

Tatsuya Yokota (RIKEN BSI)

Andrzej Cichocki (RIKEN BSI)

Yukihiko Yamashita (Tokyo Institute of Technology)

Group data analysis is a very important topic in signal processing research.

For example, we may have two blocks of data recorded from different subjects.

We propose a new method to extract the common and individual factors from a group of tensor data, called the “Linked Tensor Decomposition” (LTD).


Introduction

[Figure: data from Subject 1 and Subject 2]

• What is a common factor between two subjects?

• Conversely, what is an individual factor?

We can see that many things consist of [common factors] + [individual factors].

For example, everybody has a face consisting of a mouth, a nose, two eyes, two ears, and so on (common factor). But the shapes and positions of these parts differ from person to person (individual factor).

It is meaningful to extract both kinds of factors from a data set.

The goal of this work is to extract such common and individual factors from a given data set.

Concept of the LTD

[Figure: face = common part + individual part]

Introduction

Traditional models of Group Tensor Analysis

Proposed model

Method & Algorithm

Experiments


Outline

When we analyze a single block of matrix data, SVD, PCA, NMF, ICA, etc. work well, depending on the objective.

When we analyze a group of matrix data, there are two traditional approaches.


Group Tensor Data Analysis

[Figure] Single matrix: Y ≈ A Bᵀ, where Y is (I × J), A is (I × R), and Bᵀ is (R × J).

Group of matrices: stacking Y1, Y2, …, YK gives an (I × J × K) tensor, which the CP model factorizes with shared factors A (I × R), Bᵀ (R × J), and C (R × K).
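The single-matrix factorization Y ≈ A Bᵀ above can be obtained from a truncated SVD. The following NumPy sketch is illustrative only (sizes are made up, and this is not code from the talk):

```python
import numpy as np

# Sketch: rank-R factorization Y ~= A B^T via truncated SVD.
rng = np.random.default_rng(0)
I, J, R = 20, 15, 3

# Build a matrix that is exactly rank R, so the residual should vanish.
Y = rng.standard_normal((I, R)) @ rng.standard_normal((R, J))

U, s, Vt = np.linalg.svd(Y, full_matrices=False)
A = U[:, :R] * s[:R]   # (I x R) factor, singular values folded in
B = Vt[:R, :].T        # (J x R) factor, so B^T is (R x J)

err = np.linalg.norm(Y - A @ B.T)  # ~0 for an exactly rank-R matrix
```

The best rank-R approximation in the Frobenius norm is exactly this truncated SVD (Eckart-Young), which is why it serves as the baseline single-matrix method here.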

① Individual tensor decomposition (ITD): each block is factorized separately, Yk ≈ Ak Bkᵀ, k = 1, …, K.

② Simultaneous tensor decomposition (STD): all blocks share the same factor matrices A and B, Yk ≈ A D(k) Bᵀ, where D(k) is an (R × R) diagonal weighting parameter matrix.

Properties of the traditional approaches

① ITD (Individual Tensor Decomposition): analyzes individual characteristics, but does not treat the data as a group.

② STD (Simultaneous Tensor Decomposition): analyzes the common factor in a group, but has weak flexibility and cannot analyze individual characteristics.

[Figure] STD: every block is modeled as Yk ≈ A diag(ck) Bᵀ with shared A and B and block-specific weights ck. ITD: each block is modeled as Yk ≈ Ak Bkᵀ with its own factors.
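The STD slices Yk ≈ A diag(ck) Bᵀ are exactly the frontal slices of a CP model, which can be checked numerically. Shapes below are illustrative assumptions, not values from the talk:

```python
import numpy as np

# Sketch: the STD model is the CP decomposition viewed slice by slice.
rng = np.random.default_rng(1)
I, J, K, R = 8, 6, 4, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))  # row k holds the weights c_k

# Slice-wise construction: Y_k = A diag(c_k) B^T, stacked into (I, J, K).
Y = np.stack([A @ np.diag(C[k]) @ B.T for k in range(K)], axis=2)

# Equivalent CP formulation: Y_{ijk} = sum_r A_{ir} B_{jr} C_{kr}.
Y_cp = np.einsum('ir,jr,kr->ijk', A, B, C)
```

`np.allclose(Y, Y_cp)` holds, confirming that sharing A and B across blocks while varying only the diagonal weights is the same thing as a CP decomposition of the stacked tensor.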

We propose a new model that combines the merits of both.

In this work, we propose a new flexible model that unifies the ITD and STD models:

ITD + STD → “Linked Tensor Decomposition” (LTD)

The LTD can analyze common and individual factors simultaneously.

A New Flexible Model

[Figure] LTD model: Yk ≈ [AC, Ak][BC, Bk]ᵀ = AC BCᵀ + Ak Bkᵀ, k = 1, …, K,

where AC, BC are the common factors (L components) and Ak, Bk, k = 1, …, K, are the individual factors (R components each).
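The identity behind the LTD structure, [AC, Ak][BC, Bk]ᵀ = AC BCᵀ + Ak Bkᵀ, can be verified with a small NumPy sketch (all sizes here are made-up examples, not from the slides):

```python
import numpy as np

# Sketch: LTD block model as concatenated factors vs. common + individual sum.
rng = np.random.default_rng(2)
I, J, K, L, R = 8, 6, 3, 2, 1   # L common and R individual components

AC = rng.standard_normal((I, L))   # common left factor, shared by all blocks
BC = rng.standard_normal((J, L))   # common right factor
Aks, Bks, Ys = [], [], []
for k in range(K):
    Ak = rng.standard_normal((I, R))   # individual factors for block k
    Bk = rng.standard_normal((J, R))
    Aks.append(Ak)
    Bks.append(Bk)
    # Concatenated form: [AC, Ak] [BC, Bk]^T
    Ys.append(np.hstack([AC, Ak]) @ np.hstack([BC, Bk]).T)
```

Each `Ys[k]` equals `AC @ BC.T + Aks[k] @ Bks[k].T`, so the LTD is literally an ITD in which the first L components are constrained to be identical across blocks.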

Formalized Problem

LTD model: Mk = [AC, Ak][BC, Bk]ᵀ, k = 1, …, K.

The problem is to minimize ∑k ‖Yk − Mk‖²F.

Algorithm: Hierarchical Alternating Least Squares (HALS)

It requires no matrix inversion and is carried out by simple calculations only.

It iterates partial optimizations over single components (very brief description):

for r = 1, …, R: update the r-th component of every factor; end

Problem & Algorithm

[Figure] Each block model Mk = AC BCᵀ + Ak Bkᵀ can be written compactly as Mk = A(k) D(k) B(k)ᵀ, where A(k) = [AC, Ak], B(k) = [BC, Bk], and D(k) = diag(d1(k), …, dR(k)) is a diagonal weighting matrix.
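A much-simplified HALS-style solver for the LTD cost ∑k ‖Yk − AC BCᵀ − Ak Bkᵀ‖²F might look as follows. This is a reconstruction from the description above, not the authors' implementation; normalization and the diagonal weights D(k) are omitted, and all sizes in the demo are invented:

```python
import numpy as np

def hals_ltd(Ys, L, R, n_iter=200, seed=0):
    """Column-wise (HALS-style) least-squares updates for the LTD model."""
    rng = np.random.default_rng(seed)
    K = len(Ys)
    I, J = Ys[0].shape
    AC = rng.standard_normal((I, L))
    BC = rng.standard_normal((J, L))
    Ak = [rng.standard_normal((I, R)) for _ in range(K)]
    Bk = [rng.standard_normal((J, R)) for _ in range(K)]
    for _ in range(n_iter):
        for r in range(L):  # common components: pooled over all blocks
            # Residual with component r added back, i.e. all OTHER components removed.
            Er = [Ys[k] - AC @ BC.T - Ak[k] @ Bk[k].T
                  + np.outer(AC[:, r], BC[:, r]) for k in range(K)]
            AC[:, r] = sum(E @ BC[:, r] for E in Er) / (K * (BC[:, r] @ BC[:, r]) + 1e-12)
            BC[:, r] = sum(E.T @ AC[:, r] for E in Er) / (K * (AC[:, r] @ AC[:, r]) + 1e-12)
        for k in range(K):  # individual components: per block
            for r in range(R):
                E = (Ys[k] - AC @ BC.T - Ak[k] @ Bk[k].T
                     + np.outer(Ak[k][:, r], Bk[k][:, r]))
                Ak[k][:, r] = E @ Bk[k][:, r] / (Bk[k][:, r] @ Bk[k][:, r] + 1e-12)
                Bk[k][:, r] = E.T @ Ak[k][:, r] / (Ak[k][:, r] @ Ak[k][:, r] + 1e-12)
    return AC, BC, Ak, Bk

# Tiny demo on data generated from the model itself (L = R = 1).
rng = np.random.default_rng(5)
I, J, K = 8, 6, 3
AC0 = rng.standard_normal((I, 1))
BC0 = rng.standard_normal((J, 1))
Ys = [AC0 @ BC0.T + np.outer(rng.standard_normal(I), rng.standard_normal(J))
      for _ in range(K)]
AC, BC, Aks, Bks = hals_ltd(Ys, L=1, R=1)
num = sum(np.linalg.norm(Ys[k] - AC @ BC.T - Aks[k] @ Bks[k].T) ** 2 for k in range(K))
den = sum(np.linalg.norm(Y) ** 2 for Y in Ys)
rel_err = (num / den) ** 0.5
```

Each column update is a closed-form rank-one least-squares step, which is why, as the slide says, no matrix inversion is needed; the common columns are fit against residuals pooled over all K blocks, the individual columns against their own block only.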

Experiments

Toy problem: rank-1 factorization

[Figure: unknown rank-1 sources (a common factor and individual factors) generate the given data with noise; factorizing by the LTD recovers the common and individual factors.]

Toy problem: rank-3 factorization (good case)

[Figure: the same setup with rank-3 unknown sources; the LTD recovers the common and individual factors well.]

Toy problem: rank-3 factorization (bad case: the solution is not unique)

[Figure: the same setup; because the solution is not unique, the recovered factors do not match the true sources.]

Face reconstruction (denoising)

[Figure: each face image is decomposed as [common terms] + [individual terms] over a shared common basis (bases j = 1, …, 20, shown in groups of five). Original images, the images with added noise, and the LTD reconstructions are shown, with an L-rank common part and an R-rank individual part.]

Face reconstruction (denoising)

Measuring the PSNR of the results for various numbers of bases:

• ITD: the good area is narrow, so it is difficult to select the optimal number of bases.

• STD: a large number of bases gives good results, but the computational cost is very high.

• LTD: the good area is wide and lies at a small number of bases, so it is computationally efficient.

[Figure: PSNR [dB] versus the number of bases (L + R) for the ITD model (all individual bases, L = 0), the LTD model (L : R = 4 : 1), the STD model (all common bases, R = 0), and the noisy input data.]
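The PSNR used to score the reconstructions is the standard definition, 10 log10(peak² / MSE). A small sketch, assuming 8-bit images with peak value 255 (the slides do not state the image range):

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images of the same shape."""
    diff = np.asarray(original, dtype=float) - np.asarray(reconstructed, dtype=float)
    mse = np.mean(diff ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Toy check: a constant error of 10 gray levels gives MSE = 100,
# i.e. PSNR = 10 * log10(255^2 / 100) ~= 28.13 dB.
clean = np.full((4, 4), 128.0)
noisy = clean + 10.0
val = psnr(clean, noisy)
```

Higher PSNR means a reconstruction closer to the original, so in the plot described above the curve that stays high over a wide range of (L + R) is the more robust model.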

Summary & Future works

Summary

• We proposed a new tensor decomposition model, the Linked Tensor Decomposition (LTD), based on the CP decomposition.

• We developed an algorithm for the LTD.

• We conducted experiments and showed the advantages of the LTD.

Future works

• Tucker-based LTD.

• The case where the common factors are not identical but similar.

• Considering statistical independence in the LTD model.

Thank you for your attention! (Q&A)