Published on Sun Apr 18 2021

Deep Clustering with Measure Propagation

Minhua Chen, Badrinath Jayakumar, Padmasundari Gopalakrishnan, Qiming Huang, Michael Johnston, Patrick Haffner
Abstract

Deep models have improved the state of the art in both supervised and unsupervised learning. For example, deep embedded clustering (DEC) has greatly improved unsupervised clustering performance by using stacked autoencoders for representation learning. However, one weakness of deep modeling is that the local neighborhood structure in the original space is not necessarily preserved in the latent space. To preserve local geometry, various methods using graph Laplacian regularization have been proposed in the supervised and semi-supervised learning literature (e.g., spectral clustering and label propagation). In this paper, we combine the strength of deep representation learning with measure propagation (MP), a KL-divergence-based graph regularization method originally used in the semi-supervised scenario. The main assumption of MP is that if two data points are close in the original space, they are likely to belong to the same class, as measured by the KL-divergence between their class-membership distributions. Carrying the same assumption over to the unsupervised setting, we propose our Deep Embedded Clustering Aided by Measure Propagation (DECAMP) model. We evaluate DECAMP on short text clustering tasks. On three public datasets, DECAMP performs competitively with other state-of-the-art baselines, including baselines that use additional data to generate the word embeddings used in the clustering process. For example, on the Stackoverflow dataset, DECAMP achieved a clustering accuracy of 79%, about 5% higher than all existing baselines. These empirical results suggest that DECAMP is a very effective method for unsupervised learning.
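To make the MP assumption concrete, here is a minimal sketch (not the authors' code) of a KL-based graph regularizer: for each edge (i, j) of a kNN graph built in the original space, the cluster-membership distributions of i and j are penalized for diverging. The function and variable names (mp_regularizer, edges, weights) are illustrative, not from the paper.

```python
import torch

def mp_regularizer(p, edges, weights, eps=1e-8):
    """KL-based graph regularizer over a kNN graph of the original space.

    p       : (n, k) cluster-membership probabilities (rows sum to 1)
    edges   : (m, 2) long tensor of neighbor pairs (i, j)
    weights : (m,) edge weights w_ij
    """
    pi = p[edges[:, 0]]                     # (m, k) distributions at i
    pj = p[edges[:, 1]]                     # (m, k) distributions at j
    kl = (pi * (torch.log(pi + eps) - torch.log(pj + eps))).sum(dim=1)
    return (weights * kl).sum()

# Toy usage: 4 points, 3 clusters, a 2-edge neighbor graph.
p = torch.softmax(torch.randn(4, 3), dim=1)
edges = torch.tensor([[0, 1], [2, 3]])
weights = torch.ones(2)
reg = mp_regularizer(p, edges, weights)     # added to the clustering loss
```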

Thu Nov 19 2015
Machine Learning
Unsupervised Deep Embedding for Clustering Analysis
Deep Embedded Clustering (DEC) learns feature representations and cluster assignments using deep neural networks. DEC learns a mapping from the data space to a lower-dimensional feature space in which it iteratively optimizes a clustering objective.
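For reference, DEC's iterative objective can be sketched as follows: soft assignments q come from a Student's t-kernel between embeddings and cluster centers, a sharpened target distribution p is re-derived from q, and the network minimizes KL(P||Q). This is a simplified sketch of the published formulation; tensor shapes are illustrative.

```python
import torch

def soft_assign(z, mu, alpha=1.0):
    # Student's t-kernel similarity between embeddings and cluster centers.
    d2 = torch.cdist(z, mu) ** 2                      # (n, k) squared distances
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened targets: emphasize high-confidence assignments.
    w = q ** 2 / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)

z = torch.randn(8, 10)          # batch of latent embeddings
mu = torch.randn(3, 10)         # 3 cluster centers (learned jointly)
q = soft_assign(z, mu)
p = target_distribution(q).detach()
kl_loss = (p * (p.log() - q.log())).sum()   # KL(P || Q), minimized by SGD
```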
Thu Jan 04 2018
Machine Learning
SpectralNet: Spectral Clustering using Deep Neural Networks
Spectral clustering is a leading and popular technique in unsupervised data analysis. Two of its major limitations are scalability and generalization of the spectral embedding. We introduce a deep learning approach to spectral clustering that overcomes these shortcomings.
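A hedged sketch of the core idea: a network's batch output is orthogonalized (here via a QR decomposition, a simplification of the paper's Cholesky-based output layer) and trained to minimize graph-weighted squared distances between outputs, avoiding the trivial collapsed solution.

```python
import torch

def spectral_loss(y, W):
    # y: (n, d) network outputs for one batch; W: (n, n) affinity matrix.
    q, _ = torch.linalg.qr(y)               # orthogonalize the batch outputs
    y = q * (y.shape[0] ** 0.5)             # rescale so y.T @ y ≈ n * I
    d2 = torch.cdist(y, y) ** 2
    return (W * d2).sum() / (2 * W.sum())

y = torch.randn(16, 4, requires_grad=True)
W = torch.rand(16, 16)
W = (W + W.T) / 2                           # symmetric, nonnegative affinities
loss = spectral_loss(y, W)                  # backprop trains the embedding net
```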
Thu Jun 07 2018
Machine Learning
Semi-Supervised Learning via Compact Latent Space Clustering
We present a novel cost function for semi-supervised learning of neural networks. The key idea is to create a graph over embeddings of unlabeled samples of a training batch. We then devise a cost function based on Markov chains on the graph.
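A rough sketch of the paper's key construction, under stated assumptions (cosine-similarity graph, row-softmax transition matrix): a Markov chain is defined over the embeddings of one training batch, and multi-step transition probabilities then feed the cost function.

```python
import torch
import torch.nn.functional as F

z = F.normalize(torch.randn(8, 16), dim=1)  # embeddings of one batch
sim = z @ z.T                               # pairwise cosine similarities
sim.fill_diagonal_(float('-inf'))           # forbid self-transitions
T = torch.softmax(sim, dim=1)               # one-step transition matrix
T2 = T @ T                                  # two-step transition probabilities
```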
Wed Feb 13 2019
Machine Learning
Deep Divergence-Based Approach to Clustering
A promising direction in deep learning research is to learn representations and discover cluster structure in unlabeled data. We propose a novel loss function that incorporates geometric regularization constraints. The proposed network achieves performance competitive with other state-of-the-art methods.
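The divergence used in this line of work is the Cauchy-Schwarz divergence between cluster assignment distributions. A minimal sketch of one ingredient, the kernel-based CS similarity between two clusters' soft assignments, follows; names, shapes, and the kernel choice are illustrative, not the paper's exact loss.

```python
import torch

def cs_similarity(ai, aj, K):
    # ai, aj: (n,) soft memberships of n points in two clusters.
    # K: (n, n) kernel (similarity) matrix over hidden representations.
    num = ai @ K @ aj
    den = torch.sqrt((ai @ K @ ai) * (aj @ K @ aj))
    return num / den                        # small when clusters are separated

n = 32
A = torch.softmax(torch.randn(n, 3), dim=1) # soft assignments to 3 clusters
h = torch.randn(n, 8)                       # hidden representations
K = torch.exp(-torch.cdist(h, h) ** 2)      # Gaussian kernel matrix
term = cs_similarity(A[:, 0], A[:, 1], K)   # minimized to push clusters apart
```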
Tue Dec 15 2020
Artificial Intelligence
Objective-Based Hierarchical Clustering of Deep Embedding Vectors
The study includes datasets with millions of entries and embedding dimensions up to 2048. The B2SAT&C algorithm achieves a constant-factor approximation for the popular Moseley-Wang / Cohen-Addad et al. objective.
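For context, the Moseley-Wang revenue of a hierarchical clustering tree is the sum over pairs of w_ij * (n - |leaves(lca(i, j))|). Below is a hedged sketch of evaluating this objective on a SciPy linkage tree; it is an illustration only and not the paper's B2SAT&C algorithm.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist, squareform

def moseley_wang(W, Z):
    # W: (n, n) symmetric similarity matrix; Z: (n-1, 4) SciPy linkage matrix.
    # Each merge fixes the LCA of all pairs split across its two children.
    n = W.shape[0]
    members = {i: [i] for i in range(n)}        # leaves under each live node
    total = 0.0
    for t, row in enumerate(Z):
        A, B = members.pop(int(row[0])), members.pop(int(row[1]))
        total += W[np.ix_(A, B)].sum() * (n - len(A) - len(B))
        members[n + t] = A + B                  # new internal node
    return total

X = np.random.rand(20, 8)                       # e.g., embedding vectors
W = squareform(np.exp(-pdist(X)))               # pairwise similarities
score = moseley_wang(W, linkage(X, method='average'))
```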
Fri Dec 25 2020
Machine Learning
Learning Robust Representation for Clustering through Locality Preserving Variational Discriminative Network
Recent deep-learning-based methods focus on learning clustering-oriented representations. VaDE (Variational Deep Embedding) suffers from two problems: 1) it is fragile to input noise; and 2) it ignores locality information between neighboring data points. We propose a joint learning framework that improves VaDE.
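A minimal sketch of a locality-preserving penalty in the spirit of this fix: latent codes of input-space k-nearest neighbors are pulled together. The paper's exact joint objective differs, and the neighbor indices here are assumed precomputed.

```python
import torch

def locality_loss(z, nbr_idx):
    # z: (n, d) latent codes; nbr_idx: (n, k) indices of each point's kNNs.
    diffs = z.unsqueeze(1) - z[nbr_idx]          # (n, k, d) code differences
    return diffs.pow(2).sum(dim=2).mean()

z = torch.randn(8, 16, requires_grad=True)
nbr_idx = torch.randint(0, 8, (8, 2))            # placeholder kNN indices
penalty = locality_loss(z, nbr_idx)              # added to the VaDE-style loss
```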