Published on Wed Jul 20 2016

Doubly Accelerated Methods for Faster CCA and Generalized Eigendecomposition

Zeyuan Allen-Zhu, Yuanzhi Li

We study $k$-GenEV, the problem of finding the top $k$ generalized eigenvectors, and $k$-CCA, the problem of finding the top $k$ vectors in canonical-correlation analysis. We propose algorithms to solve the two problems with running times linearly dependent on the input size and on $k$.

Abstract

We study $k$-GenEV, the problem of finding the top $k$ generalized eigenvectors, and $k$-CCA, the problem of finding the top $k$ vectors in canonical-correlation analysis. We propose algorithms to solve the two problems with running times linearly dependent on the input size and on $k$. Furthermore, our algorithms are DOUBLY-ACCELERATED: our running times depend only on the square root of the matrix condition number, and on the square root of the eigengap. This is the first such result for both $k$-GenEV and $k$-CCA. We also provide the first gap-free results, which provide running times that depend on $1/\sqrt{\epsilon}$ rather than on the eigengap.
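For context, the $k$-GenEV problem the abstract refers to asks for the top $k$ solutions of $A v = \lambda B v$ for symmetric $A$ and positive-definite $B$. The sketch below is a dense $O(n^3)$ reference solver, not the accelerated iterative algorithms the paper proposes; the matrix sizes and random inputs are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

# k-GenEV: find the top-k generalized eigenvectors, i.e. the top-k
# solutions of A v = lambda B v, with A symmetric and B positive-definite.
rng = np.random.default_rng(0)
n, k = 6, 2
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                         # symmetric
M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)               # symmetric positive-definite

# Dense reference solver (O(n^3), exact); the paper's point is avoiding
# this cost with iterative, doubly-accelerated methods.
vals, vecs = eigh(A, B, subset_by_index=[n - k, n - 1])

# Each returned column v satisfies A v = lambda B v and is B-normalized.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * B @ v)
    assert np.isclose(v @ B @ v, 1.0)
```

The cost of this dense solve is what motivates input-size-linear algorithms: for large sparse $A$, $B$, forming an exact decomposition is infeasible.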

Wed Apr 13 2016
Machine Learning
Efficient Algorithms for Large-scale Generalized Eigenvector Computation and Canonical Correlation Analysis
This paper considers the problem of canonical-correlation analysis (CCA) and, more broadly, the generalized eigenvector problem for a pair of symmetric matrices. We provide simple iterative algorithms, with improved runtimes, for solving these problems.
Thu Oct 29 2015
Machine Learning
The Singular Value Decomposition, Applications and Beyond
The singular value decomposition (SVD) is not only a classical theory in matrix computation and analysis, but also a powerful tool in machine learning and modern data analysis. In this tutorial we first study the basic notion of SVD and then show the central role of SVD in…
Tue Nov 02 2010
Machine Learning
A Very Fast Algorithm for Matrix Factorization
High-dimensional data are prevalent across many application areas, generating an ever-increasing demand for methods of dimension reduction. Our algorithm uses a gradient-based approach that can be used with an arbitrary loss function, provided the latter is differentiable.
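A gradient-based low-rank factorization of the kind this snippet describes can be sketched as follows. The squared Frobenius loss is an illustrative stand-in for the arbitrary differentiable loss the abstract mentions, and the dimensions, step size, and iteration count are assumptions, not the paper's settings.

```python
import numpy as np

# Sketch: factor X ≈ U V by gradient descent on 0.5 * ||U V - X||_F^2.
# Any differentiable loss works the same way: only the gradient of the
# loss with respect to the reconstruction U V changes.
rng = np.random.default_rng(2)
m, n, r = 30, 20, 3
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r target

U = 0.1 * rng.standard_normal((m, r))   # small random init
V = 0.1 * rng.standard_normal((r, n))
lr = 0.01
for _ in range(2000):
    R = U @ V - X        # residual = gradient of the loss w.r.t. (U V)
    U -= lr * R @ V.T    # chain rule: dL/dU = R V^T
    V -= lr * U.T @ R    # chain rule: dL/dV = U^T R

err = np.linalg.norm(U @ V - X) / np.linalg.norm(X)  # relative error
```

Swapping in another differentiable loss only changes the line computing `R`, which is the flexibility the abstract highlights.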
Wed Mar 20 2019
Machine Learning
Noisy Accelerated Power Method for Eigenproblems with Applications
This paper introduces an efficient algorithm for finding the dominant generalized eigenvectors of a pair of symmetric matrices. Unlike classic techniques, our algorithm is designed to decompose the overall problem into a series of subproblems that only need to be solved approximately.
Sun Nov 13 2016
Machine Learning
Accelerated Variance Reduced Block Coordinate Descent
Algorithms with fast convergence, a small number of data accesses, and low per-iteration complexity are particularly favorable in the big-data era. Existing algorithms lack at least one of these qualities and thus are not efficient at handling such big-data challenges.
Fri Jun 28 2019
Machine Learning
Tutorial: Complexity analysis of Singular Value Decomposition and its variants
We compare the regular Singular Value Decomposition (SVD), truncated SVD, the Krylov method, and randomized PCA in terms of time and space complexity. We also discuss the relationship between Principal Component Analysis and SVD.
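The PCA–SVD relationship this snippet mentions can be checked numerically: the principal axes of a centered data matrix are its right singular vectors, and the explained variances are $\sigma_i^2/(m-1)$. A minimal verification, with hypothetical dimensions:

```python
import numpy as np

# PCA via covariance eigendecomposition vs. PCA via SVD of centered data.
rng = np.random.default_rng(1)
m, n = 200, 5
X = rng.standard_normal((m, n)) @ rng.standard_normal((n, n))
Xc = X - X.mean(axis=0)                      # center each column

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Eigendecomposition of the sample covariance yields the same axes.
cov = Xc.T @ Xc / (m - 1)
evals, evecs = np.linalg.eigh(cov)           # ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]   # sort descending

assert np.allclose(evals, s**2 / (m - 1))    # variances = sigma^2/(m-1)
# Principal axes equal the right singular vectors, up to sign.
assert np.allclose(np.abs(evecs), np.abs(Vt.T))
```

This equivalence is why the SVD route is usually preferred in practice: it avoids forming the covariance matrix and is numerically more stable.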