Published on Fri Oct 31 2014

Symmetric low-rank representation for subspace clustering

Jie Chen, Haixian Zhang, Hua Mao, Yongsheng Sang, Zhang Yi

Abstract

We propose a symmetric low-rank representation (SLRR) method for subspace clustering, which assumes that a data set is approximately drawn from a union of multiple subspaces. The proposed technique can reveal the membership of multiple subspaces through the self-expressiveness property of the data. In particular, the SLRR method combines a collaborative representation with low-rank matrix recovery techniques to learn a symmetric low-rank representation, which preserves the subspace structures of high-dimensional data. In contrast to the iterative singular value decompositions performed by some existing low-rank representation based algorithms, the symmetric low-rank representation in the SLRR method is obtained in closed form by solving the symmetric low-rank optimization problem. By making use of the angular information of the principal directions of the symmetric low-rank representation, an affinity graph matrix is constructed for spectral clustering. Extensive experimental results show that SLRR outperforms state-of-the-art subspace clustering algorithms.
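The affinity-construction step described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes a symmetric representation matrix Z is already available, and the function name, the eigendecomposition-based factorization, and the sharpening exponent alpha are illustrative choices.

```python
# Hedged sketch of building an affinity matrix from the angular
# information of the principal directions of a symmetric low-rank
# representation Z. All names here are illustrative assumptions.
import numpy as np

def affinity_from_symmetric_rep(Z, rank, alpha=2):
    """Affinity from pairwise angles between rows of the scaled
    principal directions of the symmetric matrix Z."""
    # Z is symmetric, so its SVD reduces to an eigendecomposition.
    vals, vecs = np.linalg.eigh(Z)
    idx = np.argsort(np.abs(vals))[::-1][:rank]    # leading components
    U = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))  # scaled principal directions
    # Normalize rows so inner products encode angles between samples.
    norms = np.linalg.norm(U, axis=1, keepdims=True)
    U = U / np.maximum(norms, 1e-12)
    # Angular similarity, sharpened by alpha; result is symmetric.
    W = np.abs(U @ U.T) ** alpha
    return W

# Toy usage: a near-block-diagonal Z with two groups of three samples.
Z = np.zeros((6, 6))
Z[:3, :3] = 0.9
Z[3:, 3:] = 0.9
W = affinity_from_symmetric_rep(Z, rank=2)
# Within-group affinities come out larger than cross-group ones,
# so W can be fed to a standard spectral clustering routine.
```

In this sketch the graph W would then be passed to a spectral clustering algorithm (e.g. normalized cuts) to recover the subspace memberships.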

Fri Mar 07 2014
Computer Vision
Subspace clustering using a symmetric low-rank representation
A low-rank representation with symmetric constraint (LRRSC) is a method for robust subspace clustering. LRRSC extends the original low-rank representation algorithm by integrating a symmetric constraint. Experimental results on benchmark databases demonstrate the effectiveness and robustness of LRRSC.
Sat Dec 21 2019
Machine Learning
Research on Clustering Performance of Sparse Subspace Clustering
Clustering is an effective tool for dealing with high-dimensional data. The sparse subspace clustering framework involves two essential steps, and both the coefficient matrix and the affinity matrix have a strong influence on performance.
Fri Aug 02 2019
Machine Learning
Large-Scale Sparse Subspace Clustering Using Landmarks
Subspace clustering methods based on expressing each data point as a linear combination of all other points in a dataset are popular unsupervised learning techniques. Existing methods incur high computational complexity on large-scale datasets, as they require solving an expensive optimization problem.
Fri Sep 29 2017
Machine Learning
A Nonlinear Orthogonal Non-Negative Matrix Factorization Approach to Subspace Clustering
The proposed NMF-based approach to subspace clustering takes into account the nonlinear nature of the manifold, as well as its intrinsic local geometry. The proposed algorithm considerably improves clustering performance compared to several recently proposed state-of-the-art methods.
Wed Oct 12 2016
Computer Vision
Subspace clustering based on low rank representation and weighted nuclear norm minimization
Subspace clustering refers to the problem of segmenting a set of data points drawn from a union of multiple linear subspaces. The low-rank representation method seeks the lowest-rank representation among all the candidates, and nuclear norm minimization is adopted to minimize the rank of the representation matrix.
Mon Oct 24 2016
Computer Vision
Laplacian regularized low rank subspace clustering
The problem of fitting a union of subspaces to a collection of data points is considered in this paper. In the traditional low-rank representation model, the dictionary, composed of the data points themselves, is corrupted by noise; the low-rank subspace clustering model addresses this problem.