Published on Tue Mar 25 2014

Spectral Sparse Representation for Clustering: Evolved from PCA, K-means, Laplacian Eigenmap, and Ratio Cut

Zhenfang Hu, Gang Pan, Yueming Wang, Zhaohui Wu

Spectral graph theory underlies a series of elementary methods and unifies them into a complete framework. The methods include PCA, K-means, Laplacian eigenmap (LE), ratio cut, and a new sparse representation method, spectral sparse representation (SSR). SSR combines the merits of both sides: its sparse codes reduce the dimensionality of the data while revealing the cluster structure.

Abstract

Dimensionality reduction, cluster analysis, and sparse representation are basic components of machine learning. However, their relationships have not yet been fully investigated. In this paper, we find that spectral graph theory underlies a series of these elementary methods and can unify them into a complete framework. The methods include PCA, K-means, Laplacian eigenmap (LE), ratio cut (Rcut), and a new sparse representation method developed by us, called spectral sparse representation (SSR). Furthermore, extended relations to conventional over-complete sparse representations (e.g., method of optimal directions, KSVD), manifold learning (e.g., kernel PCA, multidimensional scaling, Isomap, locally linear embedding), and subspace clustering (e.g., sparse subspace clustering, low-rank representation) are incorporated. We show that, under an ideal condition from spectral graph theory, PCA, K-means, LE, and Rcut are unified together; when the condition is relaxed, the unification evolves into SSR, which lies intermediate between PCA/LE and K-means/Rcut. An efficient algorithm, NSCrt, is developed to solve for the sparse codes of SSR. SSR combines the merits of both sides: its sparse codes reduce the dimensionality of the data while revealing the cluster structure. Owing to its inherent relation to cluster analysis, the codes of SSR can be used directly for clustering. Scut, a clustering approach derived from SSR, achieves state-of-the-art performance in the spectral clustering family. The one-shot solution obtained by Scut is comparable to the optimal result of K-means run many times. Experiments on various data sets demonstrate the properties and strengths of SSR, NSCrt, and Scut.
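
As a concrete anchor for the links the abstract compresses, here is a minimal, hypothetical sketch of the classical pipeline they build on: a Gaussian affinity graph, the bottom eigenvectors of its unnormalized Laplacian (the LE embedding, which is also the relaxed Rcut solution), and K-means on that embedding. The function names and the bandwidth sigma are illustrative assumptions; this is not the paper's SSR, NSCrt, or Scut.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def spectral_embed(X, k, sigma=1.0):
    """Bottom-k eigenvectors of the unnormalized graph Laplacian."""
    W = np.exp(-cdist(X, X, "sqeuclidean") / (2 * sigma**2))  # Gaussian affinities
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W                       # unnormalized Laplacian, as used by ratio cut
    _, vecs = np.linalg.eigh(L)     # eigh: L is symmetric, eigenvalues ascending
    return vecs[:, :k]              # the Laplacian-eigenmap embedding

def spectral_cluster(X, k, sigma=1.0):
    return KMeans(n_clusters=k, n_init=10).fit_predict(spectral_embed(X, k, sigma))

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
labels = spectral_cluster(X, k=2)
```

The relaxed ratio-cut solution is spanned by exactly these bottom eigenvectors, which is why the abstract can place K-means/Rcut and PCA/LE at two ends of the same spectrum.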

Fri Sep 29 2017
Machine Learning
A Nonlinear Orthogonal Non-Negative Matrix Factorization Approach to Subspace Clustering
The proposed NMF-based approach to subspace clustering takes into account the nonlinear nature of the manifold as well as its intrinsic local geometry. The algorithm considerably improves clustering performance compared to several recently proposed state-of-the-art methods.
Fri Mar 07 2014
Computer Vision
Subspace clustering using a symmetric low-rank representation
A low-rank representation with symmetric constraint (LRRSC) is a method for robust subspace clustering. LRRSC extends the original low-rank representation algorithm by integrating a symmetric constraint. Experimental results on benchmark databases demonstrate the effectiveness and robustness of LRRSC.
Wed Jan 09 2013
Machine Learning
Spectral Clustering Based on Local PCA
We propose a spectral clustering method based on local principal component analysis (PCA). After performing local PCA in selected neighborhoods, the algorithm builds a nearest-neighbor graph weighted according to a discrepancy between the principal subspaces in the neighborhoods. We evaluate the algorithm on various simulated data sets.
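
To make that construction concrete, here is a hedged sketch of its two ingredients, assuming a plain k-nearest-neighbor rule and principal angles as the subspace discrepancy; the names local_subspaces and subspace_affinity are hypothetical, not the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import cdist

def local_subspaces(X, n_neighbors=10, d=1):
    """Orthonormal basis of the top-d principal subspace of each point's neighborhood."""
    D = cdist(X, X)
    bases = []
    for i in range(len(X)):
        idx = np.argsort(D[i])[:n_neighbors]   # k nearest neighbors (incl. the point)
        N = X[idx] - X[idx].mean(axis=0)        # center the neighborhood
        _, _, Vt = np.linalg.svd(N, full_matrices=False)
        bases.append(Vt[:d].T)                  # columns span the local subspace
    return bases

def subspace_affinity(B1, B2):
    """Similarity in [0, 1] derived from principal angles between two subspaces."""
    # Singular values of B1^T B2 are the cosines of the principal angles.
    s = np.linalg.svd(B1.T @ B2, compute_uv=False)
    return float(np.prod(s))
```

Edge weights combining this affinity with spatial proximity can then be fed to the spectral clustering routine sketched after the abstract above.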
Fri May 18 2018
Machine Learning
Spectral feature scaling method for supervised dimensionality reduction
Spectral dimensionality reduction methods enable linear separations of complex data with high-dimensional features in a reduced space. These methods do not always give the desired results due to irregularities or uncertainties of the data. We consider aggressively modifying the scales of the features to obtain the desired classification.
Thu Mar 05 2015
Machine Learning
Scalable Iterative Algorithm for Robust Subspace Clustering
Subspace clustering (SC) is a popular method for dimensionality reduction of high-dimensional data. We develop a faster algorithm for SC that incorporates robustness through a non-squared objective. The proposed algorithm converges monotonically to a local minimum with approximation guarantees.
Fri Oct 31 2014
Computer Vision
Symmetric low-rank representation for subspace clustering
The SLRR method can reveal the membership of multiple subspaces through the self-expressiveness property of the data. Extensive experimental results show that it outperforms state-of-the-art subspace clustering algorithms.
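
Self-expressiveness means each sample is reconstructed as a combination of the other samples, X = XC. In the noiseless low-rank representation setting, the nuclear-norm minimizer of this problem has a known closed form, sketched here under the assumption that the data rank is given; the function name is illustrative, and this is not the SLRR algorithm itself.

```python
import numpy as np

def lrr_noiseless(X, rank):
    """Columns of X are samples; returns the minimum-nuclear-norm C with X = X @ C."""
    # Closed form: C = V V^T, with V the top right singular vectors of X
    # (the "shape interaction matrix"); rank is an assumed input, e.g.
    # number of subspaces times their dimension.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:rank].T
    return V @ V.T                  # symmetric by construction

# The entrywise-absolute affinity |C| can then be fed to spectral clustering
# to recover the subspace memberships.
```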