Published on Sun Nov 25 2018

Recovery guarantees for polynomial approximation from dependent data with outliers

Lam Si Tung Ho, Hayden Schaeffer, Giang Tran, Rachel Ward

Learning non-linear systems from noisy, limited, and/or dependent data is an important task across various scientific fields. In this work, we study the problem of learning nonlinear functions from corrupted and dependent data. The learning problem is recast as a sparse robust linear regression problem.

Abstract

Learning non-linear systems from noisy, limited, and/or dependent data is an important task across various scientific fields including statistics, engineering, computer science, and mathematics. In general, this learning task is ill-posed; however, additional information about the structure of the data or the behavior of the unknown function can make the task well-posed. In this work, we study the problem of learning nonlinear functions from corrupted and dependent data. The learning problem is recast as a sparse robust linear regression problem where we incorporate both the unknown coefficients and the corruptions in a basis pursuit framework. The main contribution of our paper is to provide a reconstruction guarantee for the associated ℓ1-optimization problem where the sampling matrix is formed from dependent data. Specifically, we prove that the sampling matrix satisfies the null space property and the stable null space property, provided that the data is compact and satisfies a suitable concentration inequality. We show that our recovery results are applicable to various types of dependent data such as exponentially strongly α-mixing data, geometrically C-mixing data, and uniformly ergodic Markov chains. Our theoretical results are verified via several numerical simulations.
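To make the recast concrete, here is a minimal sketch of the basis pursuit formulation described above, written with cvxpy on toy data. The monomial dictionary, the equal weighting of the two ℓ1 terms, and all dimensions are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: recover sparse polynomial coefficients c and sparse corruptions e
# from observations y = Phi c + e by solving
#     min ||c||_1 + ||e||_1   subject to   Phi c + e = y.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, deg = 200, 5
x = rng.uniform(-1.0, 1.0, size=n)                    # sample locations
Phi = np.vander(x, deg + 1, increasing=True)          # monomial dictionary
c_true = np.array([0.0, 2.0, 0.0, -1.0, 0.0, 0.0])    # sparse ground truth
y = Phi @ c_true
y[rng.choice(n, size=5, replace=False)] += 10.0       # a few gross outliers

c = cp.Variable(deg + 1)                              # unknown coefficients
e = cp.Variable(n)                                    # unknown corruptions
problem = cp.Problem(cp.Minimize(cp.norm1(c) + cp.norm1(e)),
                     [Phi @ c + e == y])
problem.solve()
print("estimated coefficients:", np.round(c.value, 3))
```

With only a handful of corruptions, the recovered c should match c_true up to solver tolerance, which is the kind of behavior the paper's recovery guarantees address.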

Related Papers

Fri Sep 16 2011
Machine Learning
High-dimensional regression with noisy and missing data: Provable guarantees with nonconvexity
Many applications involve noisy and/or missing data, possibly with dependence as well. We prove that a simple algorithm based on projected gradient descent converges in polynomial time to a small neighborhood of the set of all global minimizers.
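As a rough illustration of that algorithmic template (not the authors' exact estimator), here is a minimal projected gradient descent sketch for an ℓ1-constrained quadratic loss. The inputs gram and corr stand in for whatever (possibly noise-corrected) second-moment statistics one forms from the data, and step and radius are hypothetical tuning parameters.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1 ball of the given radius."""
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]                      # sorted magnitudes, descending
    cumsum = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > cumsum - radius)[0][-1]
    theta = (cumsum[rho] - radius) / (rho + 1.0)      # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient_descent(gram, corr, radius, step=0.01, iters=1000):
    """Minimize 0.5 b'(gram)b - corr'b over the l1 ball {||b||_1 <= radius}."""
    beta = np.zeros(gram.shape[0])
    for _ in range(iters):
        grad = gram @ beta - corr                     # gradient of the quadratic loss
        beta = project_l1_ball(beta - step * grad, radius)
    return beta
```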
Mon Aug 08 2016
Machine Learning
Sampling Requirements and Accelerated Schemes for Sparse Linear Regression with Orthogonal Least-Squares
The Accelerated Orthogonal Least-Squares (AOLS) algorithm improves on the performance of the well-known Orthogonal Least-Squares (OLS) algorithm. While OLS greedily selects columns that correspond to non-zero components of the sparse vector, AOLS employs a novel procedure that speeds up the search.
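For reference, a minimal sketch of the plain OLS selection rule the snippet alludes to: at each step, add the column whose inclusion most reduces the least-squares residual. The acceleration that distinguishes AOLS is not shown here.

```python
import numpy as np

def ols_select(A, y, k):
    """Greedy Orthogonal Least-Squares: choose k columns of A to explain y.
    At each step, add the column giving the smallest least-squares residual."""
    support = []
    for _ in range(k):
        best_j, best_sse = None, np.inf
        for j in range(A.shape[1]):
            if j in support:
                continue
            cols = A[:, support + [j]]                     # trial support
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            sse = np.sum((y - cols @ coef) ** 2)           # residual energy
            if sse < best_sse:
                best_sse, best_j = sse, j
        support.append(best_j)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    return support, coef
```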
Wed Jan 15 2020
Machine Learning
Bridging Convex and Nonconvex Optimization in Robust PCA: Noise, Outliers, and Missing Data
This paper delivers improved theoretical guarantees for the convex programming approach to low-rank matrix estimation. When the unknown matrix is well-conditioned and of constant rank, we demonstrate that a principled convex program achieves near-optimal statistical accuracy.
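The canonical convex program for this kind of decomposition is the nuclear-norm-plus-ℓ1 relaxation; a minimal cvxpy sketch follows, with lam a hypothetical tuning parameter (the paper concerns guarantees for programs of this type, not this toy code).

```python
import cvxpy as cp

def robust_pca(M, lam):
    """Split M into low-rank L plus sparse S via convex relaxation:
       min ||L||_* + lam * sum|S_ij|   subject to   L + S = M."""
    L = cp.Variable(M.shape)
    S = cp.Variable(M.shape)
    problem = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.abs(S))),
                         [L + S == M])
    problem.solve()
    return L.value, S.value
```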
Fri Aug 22 2014
Machine Learning
Nonconvex Statistical Optimization: Minimax-Optimal Sparse PCA in Polynomial Time
Sparse principal component analysis (PCA) involves nonconvex optimization. To address this issue, one popular approach is convex relaxation. However, such an approach may produce suboptimal estimators due to the relaxation effect. To optimally estimate the sparse principal subspace, we propose a two-stage computational framework.
Sat Nov 14 2015
Machine Learning
Sparse Nonlinear Regression: Parameter Estimation and Asymptotic Inference
We study parameter estimation and asymptotic inference for sparse nonlinear regression. We prove that every stationary point of the objective enjoys an optimal rate of convergence. We provide an efficient algorithm that provably converges to a stationary point.
Sat Feb 18 2012
Machine Learning
Robust computation of linear models by convex relaxation
The paper describes a convex optimization problem that can reliably fit a low-dimensional model to data contaminated with outliers. This approach parameterizes linear subspaces using orthogonal projectors. The paper provides an efficient algorithm for solving the problem.
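One concrete reading of "parameterizing subspaces by orthogonal projectors" is to relax the set of rank-d projectors to its convex hull {P : 0 ⪯ P ⪯ I, tr P = d} and minimize the summed point-to-subspace residuals. The sketch below follows that idea and is an assumption about the formulation, not a transcription of the paper's program.

```python
import cvxpy as cp
import numpy as np

def fit_robust_subspace(X, d):
    """Fit a d-dimensional subspace to the rows of X (n points in R^D) by
    relaxing orthogonal projectors to {P : 0 << P << I, trace(P) = d}."""
    n, D = X.shape
    P = cp.Variable((D, D), symmetric=True)
    residuals = cp.norm(X - X @ P, 2, axis=1)   # ||x_i - P x_i|| per row
    problem = cp.Problem(cp.Minimize(cp.sum(residuals)),
                         [P >> 0, np.eye(D) - P >> 0, cp.trace(P) == d])
    problem.solve()
    return P.value   # round via its top-d eigenvectors to get a true projector
```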