Published on Tue Dec 06 2016

Deterministic and Probabilistic Conditions for Finite Completability of Low-Tucker-Rank Tensor

Morteza Ashraphijuo, Vaneet Aggarwal, Xiaodong Wang

Abstract

We investigate the fundamental conditions on the sampling pattern, i.e., locations of the sampled entries, for finite completability of a low-rank tensor given some components of its Tucker rank. In order to find the deterministic necessary and sufficient conditions, we propose an algebraic geometric analysis on the Tucker manifold, which allows us to incorporate multiple rank components in the proposed analysis in contrast with the conventional geometric approaches on the Grassmannian manifold. This analysis characterizes the algebraic independence of a set of polynomials defined based on the sampling pattern, which is closely related to finite completion. Probabilistic conditions are then studied and a lower bound on the sampling probability is given, which guarantees that the proposed deterministic conditions on the sampling patterns for finite completability hold with high probability. Furthermore, using the proposed geometric approach for finite completability, we propose a sufficient condition on the sampling pattern that ensures there exists exactly one completion for the sampled tensor.
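As a concrete illustration of why the sampling pattern matters, a simple necessary (but not sufficient) condition for finite completability is that the number of observed entries be at least the dimension of the manifold of tensors with the given Tucker rank. The sketch below counts both, using a hypothetical random sampling pattern (this is an illustrative check, not the paper's algebraic-independence analysis):

```python
import numpy as np

def tucker_dof(dims, ranks):
    # Dimension of the manifold of tensors of size `dims` with Tucker
    # (multilinear) rank `ranks`: degrees of freedom of the core tensor
    # plus those of the factor matrices, modulo the gauge group.
    core = int(np.prod(ranks))
    factors = sum(r * (n - r) for n, r in zip(dims, ranks))
    return core + factors

def enough_samples(mask, dims, ranks):
    # Necessary condition only: finite completability additionally
    # requires the samples to be "spread out" in the sense of the
    # algebraic-independence conditions studied in the paper.
    return int(mask.sum()) >= tucker_dof(dims, ranks)

# Hypothetical example: a 10 x 10 x 10 tensor with Tucker rank (2, 3, 2),
# each entry observed independently with probability 0.3.
dims, ranks = (10, 10, 10), (2, 3, 2)
mask = np.random.default_rng(0).random(dims) < 0.3
print(tucker_dof(dims, ranks))  # -> 65
```

With roughly 300 of the 1000 entries observed, the count condition is easily met; the paper's contribution is characterizing *which* patterns of this size actually suffice.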

Wed Mar 22 2017
Machine Learning
Characterization of Deterministic and Probabilistic Sampling Patterns for Finite Completability of Low Tensor-Train Rank Tensor
In this paper, we analyze the fundamental conditions for low-rank tensor completion given the separation or tensor-train (TT) rank. We exploit the algebraic structure of the TT decomposition to obtain the deterministic necessary and sufficient conditions on the locations of the samples.
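The TT (separation) rank referred to here is the tuple of matrix ranks of the sequential unfoldings of the tensor. A minimal sketch, using a hypothetical `tt_ranks` helper:

```python
import numpy as np

def tt_ranks(T, tol=1e-10):
    # The k-th TT (separation) rank is the rank of the unfolding that
    # groups the first k modes as rows and the remaining modes as columns.
    dims = T.shape
    return [np.linalg.matrix_rank(T.reshape(int(np.prod(dims[:k])), -1), tol=tol)
            for k in range(1, len(dims))]

# A rank-1 tensor has every TT rank equal to 1.
a, b, c = np.arange(1., 4.), np.arange(1., 5.), np.arange(1., 3.)
T = np.einsum('i,j,k->ijk', a, b, c)
print(tt_ranks(T))  # -> [1, 1]
```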
Fri Mar 31 2017
Machine Learning
Fundamental Conditions for Low-CP-Rank Tensor Completion
We consider the problem of low canonical polyadic (CP) rank tensor completion. A completion is a tensor that agrees with the sampled entries and has the given CP rank. We show that finite completability of the sampled tensor is equivalent to having a certain number of algebraically independent polynomials.
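The count of algebraically independent polynomials mentioned here can be probed numerically: each observed entry is a polynomial in the CP factor parameters, and at a generic (random) point the number of algebraically independent polynomials among them equals the rank of their Jacobian. A sketch with the hypothetical helper `cp_jacobian_rank` (not the paper's method, which is deterministic and algebraic):

```python
import numpy as np
from itertools import product

def cp_jacobian_rank(dims, r, omega, seed=0):
    # Each observed entry T[idx] = sum_l prod_m A_m[idx[m], l] is a
    # polynomial in the factor matrices A_1, ..., A_d. We assemble one
    # gradient row per observed index; the Jacobian rank at a random
    # point generically counts the algebraically independent polynomials.
    rng = np.random.default_rng(seed)
    facs = [rng.standard_normal((n, r)) for n in dims]
    offs = np.cumsum([0] + [n * r for n in dims])
    J = np.zeros((len(omega), offs[-1]))
    for row, idx in enumerate(omega):
        for m in range(len(dims)):
            g = np.ones(r)  # product of the other factors' rows
            for m2 in range(len(dims)):
                if m2 != m:
                    g = g * facs[m2][idx[m2]]
            start = offs[m] + idx[m] * r
            J[row, start:start + r] = g
    return np.linalg.matrix_rank(J)

# Fully observed 3 x 3 x 3 tensor, CP rank 1: the rank-1 variety has
# dimension 3 + 3 + 3 - 2 = 7 (two scaling indeterminacies), so at most
# 7 of the 27 entry polynomials are algebraically independent.
omega = list(product(range(3), range(3), range(3)))
print(cp_jacobian_rank((3, 3, 3), 1, omega))  # -> 7
```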
Mon Jul 03 2017
Machine Learning
Rank Determination for Low-Rank Data Completion
In this paper, we consider the scenario where the rank is not given and we aim to approximate the unknown rank based on the locations of the sampled entries and some given completion. We consider a number of data models, including single-view matrix, multi-view matrix, CP tensor, …
Wed Jun 11 2014
Machine Learning
Provable Tensor Factorization with Missing Data
We study the problem of low-rank tensor factorization in the presence of missing data. We propose a novel alternating minimization algorithm that iteratively refines estimates of the singular vectors. We show that, under certain standard assumptions, our method can recover a three-mode tensor of rank …
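The alternating-minimization idea can be sketched with a generic rank-1 ALS completion (an illustrative baseline, not the paper's exact algorithm): fix two factor vectors, solve the least-squares problem for the third over the observed entries only, and cycle.

```python
import numpy as np

def als_rank1(T, mask, iters=100, seed=0):
    # Rank-1 completion T ~ u (x) v (x) w from the entries where mask == 1.
    # Each update is the exact least-squares solution for one factor with
    # the other two held fixed, restricted to observed entries.
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(n) for n in T.shape)
    TO = T * mask
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', TO, v, w) / np.maximum(
            np.einsum('ijk,j,k->i', mask, v**2, w**2), 1e-12)
        v = np.einsum('ijk,i,k->j', TO, u, w) / np.maximum(
            np.einsum('ijk,i,k->j', mask, u**2, w**2), 1e-12)
        w = np.einsum('ijk,i,j->k', TO, u, v) / np.maximum(
            np.einsum('ijk,i,j->k', mask, u**2, v**2), 1e-12)
    return u, v, w

# Usage on a synthetic rank-1 tensor with ~70% of entries observed.
rng = np.random.default_rng(1)
a, b, c = rng.standard_normal(8), rng.standard_normal(9), rng.standard_normal(7)
T = np.einsum('i,j,k->ijk', a, b, c)
mask = (rng.random(T.shape) < 0.7).astype(float)
u, v, w = als_rank1(T, mask, iters=200)
est = np.einsum('i,j,k->ijk', u, v, w)
```

On well-sampled noiseless rank-1 data like this, the iterates typically converge to the true tensor, which is the flavor of guarantee the paper establishes rigorously.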
Mon Mar 09 2015
Machine Learning
A Characterization of Deterministic Sampling Patterns for Low-Rank Matrix Completion
Low-rank matrix completion (LRMC) problems arise in a wide variety of applications. Finite completability is the tipping point in LRMC, as a few additional samples of a finitely completable matrix guarantee its unique completability.
Fri Dec 23 2016
Machine Learning
Spectral algorithms for tensor completion
In the tensor completion problem, one seeks to estimate a low-rank tensor based on a random sample of revealed entries. Earlier work revealed a large gap between estimation with unbounded computational resources (using, for instance, tensor nuclear norm minimization) and polynomial-time algorithms. We propose a new unfolding-based method, which outperforms naive ones.
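The unfolding idea can be sketched in a few lines: rescale the observed entries by the inverse sampling probability so the expectation matches the true tensor, matricize along one mode, and keep the top singular directions. This is a generic spectral baseline with a hypothetical `spectral_unfold` helper, not the paper's refined method:

```python
import numpy as np

def spectral_unfold(T_obs, mask, mode, rank):
    # Unbiased rescaling: E[T_obs / p] equals the true tensor entrywise
    # when each entry is revealed independently with probability p.
    p = mask.mean()
    M = np.moveaxis(T_obs / p, mode, 0).reshape(T_obs.shape[mode], -1)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    # Best rank-`rank` approximation of the mode-`mode` unfolding.
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

# With every entry observed (p = 1), a rank-1 tensor is recovered exactly.
T = np.einsum('i,j,k->ijk', np.arange(1., 5.), np.arange(1., 4.), np.arange(1., 6.))
mask = np.ones(T.shape)
est = spectral_unfold(T, mask, 0, 1)
```

With partial observations the truncated SVD only estimates the unfolding up to spectral noise, which is where the gap between naive unfoldings and the paper's algorithm arises.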