Published on Thu Sep 12 2013

Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning

Franz J. Király


Abstract

Decomposing tensors into orthogonal factors is a well-known task in statistics, machine learning, and signal processing. We study orthogonal outer product decompositions in which the factors of the summands are required to be orthogonal across summands, by relating such an orthogonal decomposition to the singular value decompositions of the tensor's flattenings. We show that having such an orthogonal decomposition is a non-trivial assumption on a tensor, and that the decomposition is unique (up to natural symmetries) whenever it exists, in which case it can be obtained efficiently and reliably by a sequence of singular value decompositions. We demonstrate how the factoring algorithm can be applied for parameter identification in latent variable and mixture models.
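The connection between an orthogonal decomposition and the SVDs of the flattenings can be illustrated numerically. The sketch below (an assumption-laden toy, not the paper's exact algorithm; all names such as `Q`, `weights`, and `flat` are illustrative) builds a symmetric orthogonally decomposable 3-tensor, flattens it to a matrix, and recovers the orthonormal factors and weights from a single SVD of that flattening:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random orthonormal factors q_1, ..., q_n (columns of Q) and positive weights.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
weights = rng.uniform(1.0, 2.0, size=n)

# Symmetric odeco tensor: T = sum_i weights[i] * q_i (x) q_i (x) q_i
T = np.einsum('i,ai,bi,ci->abc', weights, Q, Q, Q)

# Flatten T into an n x n^2 matrix. For an odeco tensor this flattening
# equals Q @ diag(weights) @ W^T with W orthonormal, so its SVD exposes
# the factors as left singular vectors and the weights as singular values.
flat = T.reshape(n, n * n)
U, s, Vt = np.linalg.svd(flat, full_matrices=False)

# The columns of U match the q_i up to sign and ordering. The contraction
# <T, u_i (x) u_i (x) u_i> equals +/- weights[i], which fixes the signs.
signs = np.sign(np.einsum('ai,bi,ci,abc->i', U, U, U, T))
recovered = np.einsum('i,ai,bi,ci->abc', s * signs, U, U, U)

print(np.allclose(recovered, T))          # the decomposition is recovered
print(np.allclose(np.sort(s), np.sort(weights)))
```

In this symmetric rank-`n` setting one flattening suffices because its singular values are distinct with probability one; the paper's general procedure uses a sequence of SVDs.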