Published on Fri Aug 31 2018

Graph reduction with spectral and cut guarantees

Andreas Loukas

Abstract

Can one reduce the size of a graph without significantly altering its basic properties? The graph reduction problem is hereby approached from the perspective of restricted spectral approximation, a modification of the spectral similarity measure used for graph sparsification. This choice is motivated by the observation that restricted approximation carries strong spectral and cut guarantees, and that it implies approximation results for unsupervised learning problems relying on spectral embeddings. The paper then focuses on coarsening---the most common type of graph reduction. Sufficient conditions are derived for a small graph to approximate a larger one in the sense of restricted similarity. These findings give rise to nearly-linear algorithms that, compared to both standard and advanced graph reduction methods, find coarse graphs of improved quality, often by a large margin, without sacrificing speed.
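As a rough sketch of the coarsening setup the abstract refers to, the snippet below merges consecutive node pairs of a small path graph using an averaging matrix P and checks that the coarse Laplacian P L Pᵀ keeps the basic spectral structure. This is an illustrative toy, not the paper's nearly-linear algorithm or its restricted-similarity construction; the pairwise grouping and the averaging convention for P are assumptions made for the example.

```python
import numpy as np

def path_laplacian(n):
    # Combinatorial Laplacian of an unweighted path graph on n nodes.
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

n = 8
L = path_laplacian(n)

# Partition nodes into contiguous pairs (an arbitrary choice for this toy)
# and build a coarsening matrix P whose rows average each merged group.
groups = [[2 * i, 2 * i + 1] for i in range(n // 2)]
P = np.zeros((len(groups), n))
for r, g in enumerate(groups):
    P[r, g] = 1.0 / len(g)

# Coarse Laplacian via the projection P L P^T (up to the chosen weighting).
Lc = P @ L @ P.T

# A reasonable coarsening preserves the bottom of the spectrum: both
# Laplacians should have a zero eigenvalue, and their smallest nontrivial
# eigenvalues should be of comparable size.
ev_fine = np.sort(np.linalg.eigvalsh(L))
ev_coarse = np.sort(np.linalg.eigvalsh(Lc))
print("fine:  ", ev_fine[:3])
print("coarse:", ev_coarse[:3])
```

In the paper's terminology, restricted spectral approximation asks that quantities like these agree on a subspace of interest (e.g. the span of the first few eigenvectors), rather than over all vectors as in full spectral similarity.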

Related papers

Wed Feb 21 2018
Machine Learning
Spectrally approximating large graphs with smaller graphs
How does coarsening affect the spectrum of a general graph? We provide conditions under which the principal eigenvalues and eigenspaces of the coarsened and original graph Laplacians are close.
Sat Nov 23 2019
Machine Learning
GRASPEL: Graph Spectral Learning at Scale
For the first time, we present a highly scalable spectral approach (GRASPEL) for learning large graphs from data. By restricting the precision matrix to be a graph Laplacian, our approach aims to estimate ultra-sparse (tree-like) weighted graphs.
Thu Jan 21 2016
Machine Learning
Incremental Spectral Sparsification for Large-Scale Graph-Based Semi-Supervised Learning
The harmonic function solution performs well in many semi-supervised learning (SSL) tasks, but it is known to scale poorly with the number of samples. Sparse-HFS is an efficient edge-sparsification algorithm for SSL.
Mon Apr 22 2019
Machine Learning
A Unified Framework for Structured Graph Learning via Spectral Constraints
Graph learning from data represents a canonical problem that has received substantial attention in the literature. Learning a graph with a specific structure is essential for interpretability and identification of the relationships among data. In general, structured graph learning is an NP-hard combinatorial problem.
Tue Jan 29 2019
Machine Learning
Approximating Spectral Clustering via Sampling: a Review
Spectral clustering refers to a family of unsupervised learning algorithms that compute a spectral embedding of the original data. We focus on methods that come with a theoretical control on the clustering performance and incorporate some form of sampling in their operation.
Fri Feb 05 2021
Machine Learning
Matrix Decomposition on Graphs: A Functional View
We propose a functional view of matrix decomposition problems on graphs. Our framework is based on the key idea that using a reduced basis to represent functions on the product space is sufficient to recover low-rank structure.