Algorithm Development with Higher Order SVD

Algorithms Meeting, September 16, 2004
Adam Goodenough

Tensors: N-Mode Generalization of a Matrix
- Notation: T ∈ R^(I1 x I2 x ... x IN)
- In is the length of the tensor in mode (i.e. dimension) n
- A tensor has order N
- Two basic operations: unfolding, written T(n), and the mode-n product of a tensor and a matrix, written T x(n) M

Example
[Figure: a mode-3 tensor with modes of length I1, I2, and I3]

Unfolding
- Unfolding flattens an N-mode tensor into a 2-mode tensor (a matrix)
- The operation T(n) unfolds the tensor along mode n
- Each column of the resulting matrix is composed of the elements t_(i1 i2 ... in ... iN) where the index in varies and the remaining indices are held constant (for a given column)

Example - Unfolding
[Figures: mode-1, mode-2, and mode-3 unfoldings of the example tensor]
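Unfolding is easiest to pin down in code, so here is a minimal NumPy sketch (an addition to this writeup, not part of the original slides). Note that the column ordering of an unfolding is a convention that varies between authors; this version simply rotates mode n to the front and flattens the remaining modes in their original order.

    import numpy as np

    def unfold(T, n):
        # Mode-n unfolding: mode n becomes the rows; the remaining
        # modes are flattened (in their original order) into columns.
        return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)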
Tensor Matrix Product
- The operation T x(n) M denotes the product of an Nth-order tensor with a matrix (a 2nd-order tensor), applied along mode n

Example - Product
[Figure: a mode-3 tensor multiplied along one mode by a matrix]

Matrix PCA
- Two standard ways of performing Principal Component Analysis:
  - Find the eigenvalues/eigenvectors of the covariance matrix
  - Take the SVD of the raw data
- The basis vectors are weighted by the eigenvalues/singular values
- The SVD approach can be easily extended to tensors

Matrix Decomposition
[Figures: the matrix SVD, Data Matrix = U S V^T, redrawn step by step as a mode-2 tensor decomposition with mode matrices U1 = U and U2 = V]
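To make the Tensor Matrix Product slide concrete before moving on to HOSVD, here is a matching NumPy sketch (again an addition of this writeup, under the same conventions as unfold above): the mode-n product contracts mode n of T against the columns of M, so that mode's length changes from In to the number of rows of M.

    def mode_n_product(T, M, n):
        # T x(n) M: contract mode n of T with the columns of M,
        # then move the resulting mode back into position n.
        return np.moveaxis(np.tensordot(T, M, axes=(n, 1)), -1, n)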
HOSVD
- Given a tensor, T, of order N:
  T = Z x(1) U1 x(2) U2 ... x(N) UN
- Each Un is obtained from the SVD of the unfolding of T along mode n
- Z is a full tensor (i.e. not equivalent to the diagonal S matrix of the matrix SVD)

HOSVD
- The core tensor, Z, is found by:
  Z = T x(1) U1^T x(2) U2^T ... x(N) UN^T
- Z has dimensionality equivalent to T
- i.e. given T ∈ R^(I1 x I2 x ... x IN), Z ∈ R^(I1 x I2 x ... x IN) as well
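Putting the two helpers together gives a compact HOSVD sketch; this is a minimal illustration under the conventions above, not the presentation's own code. The left singular vectors of each unfolding supply the Un, and the mode products with their transposes yield the core tensor.

    def hosvd(T):
        # One orthogonal mode matrix per mode, from the SVD of each unfolding.
        U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
             for n in range(T.ndim)]
        # Core tensor: Z = T x(1) U1^T x(2) U2^T ... x(N) UN^T
        Z = T
        for n, Un in enumerate(U):
            Z = mode_n_product(Z, Un.T, n)
        return Z, U

    # Sanity check: mode products with the Un recover T exactly.
    T = np.random.rand(5, 4, 3)
    Z, U = hosvd(T)
    T_hat = Z
    for n, Un in enumerate(U):
        T_hat = mode_n_product(T_hat, Un, n)
    assert np.allclose(T, T_hat)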
Algorithm Applications
- Three general areas: prediction, detection, compression
- All algorithms work by building a training data set, varying parameters along the mode indices
Applications - Prediction
- TensorTextures (Vasilescu and Terzopoulos 2004)
Applications - Prediction
- Data set: D ∈ R^(T x I x V)
  - Rasterized texture images: T = 240 x 320 x 3 = 230,400 texels
  - Illumination orientations: I = 21
  - View orientations: V = 37
  - Total of 777 images taken (21 x 37)
[Figure: the sampled illumination and view directions, shown in the tangent/normal frame of the surface]
- PCA of this data set would perform SVD on the matrix of all observations
- This matrix is equivalent to the mode-1 unfolding of the tensor, i.e. D(texel)
- This means that the SVD of the data set is directly related to Utexel
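To tie that claim to the earlier helpers: the matrix PCA would operate on is exactly the texel-mode unfolding. A small stand-in (random data, reduced texel count, purely illustrative):

    # Small stand-in for the real 230,400 x 21 x 37 data tensor.
    D = np.random.rand(64, 21, 37)
    observations = unfold(D, 0)   # shape (64, 777): one column per image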
- It turns out that, writing the SVD as D(texel) = Utexel R, the matrix R is constructed from the Z, Uillum, and Uview terms (using the Kronecker product)
- We end up having explicit control over the eigenvalues (compression)
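Under the unfolding convention used above, that relationship reads D(texel) = Utexel Z(texel) (Uillum ⊗ Uview)^T, where ⊗ is the Kronecker product; here is a check against the stand-in data (a sketch under this writeup's conventions, not the presentation's code):

    Z, (U_texel, U_illum, U_view) = hosvd(D)

    # R is built from the core tensor and the non-texel mode matrices.
    R = unfold(Z, 0) @ np.kron(U_illum, U_view).T
    assert np.allclose(unfold(D, 0), U_texel @ R)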
Applications - Prediction
- To synthesize a texture for a novel direction, the illumination and view coefficient vectors i and v are computed based on nearest neighbors among the trained orientations
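Continuing the stand-in example, a sketch of the synthesis step: one coefficient row per non-texel mode reconstructs one image, and for a novel direction those rows would come from the nearest sampled orientations (or a blend of them). Function and variable names here are illustrative, not from the slides.

    def render(Z, U_texel, illum_row, view_row):
        # Contract the core with the full texel basis and a single
        # coefficient row each for illumination and view.
        img = mode_n_product(Z, U_texel, 0)
        img = mode_n_product(img, illum_row[np.newaxis, :], 1)
        img = mode_n_product(img, view_row[np.newaxis, :], 2)
        return img[:, 0, 0]

    # Reproduces training sample (i, v) exactly; a novel direction would
    # instead reuse the row(s) of U_illum / U_view nearest to it.
    i, v = 7, 12
    assert np.allclose(render(Z, U_texel, U_illum[i], U_view[v]), D[:, i, v])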
Applications - Prediction
- MODTRAN (MakeADB)
Applications - Detection
- TensorFaces (Vasilescu and Terzopoulos 2002)
Applications - Detection
- Vast improvement over PCA-based techniques
Applications - Detection
- Attempts on plume data
Applications - Compression
- HOSVD allows for explicit control over dimensionality reduction: each mode can be truncated to a chosen rank independently
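A sketch of that per-mode control, using the helpers defined earlier (the rank choices are illustrative):

    def compress(T, ranks):
        # Keep only the leading ranks[n] columns of each mode matrix
        # and the matching corner of the core tensor.
        Z, U = hosvd(T)
        U = [Un[:, :r] for Un, r in zip(U, ranks)]
        Z = Z[tuple(slice(r) for r in ranks)]
        return Z, U

    def reconstruct(Z, U):
        T = Z
        for n, Un in enumerate(U):
            T = mode_n_product(T, Un, n)
        return T

    # e.g. keep 20 texel, 5 illumination, and 5 view components:
    D_approx = reconstruct(*compress(D, (20, 5, 5)))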
Applications - Compression
- Compression can be done so that reconstruction results are better perceptually (though usually with worse RMSE)
[Figure: side-by-side reconstructions of compressed images, PCA vs. HOSVD]
Applications - Compression
- Can having explicit control over dimensionality reduction allow us to perform invariant-based detection algorithms better?
- Goal: use photon mapping to generate spectral images of underwater targets under varying environmental conditions, and test traditional PCA-based invariant algorithms against HOSVD-based invariant algorithms