Published on Thu Aug 29 2019

Estimation of a function of low local dimensionality by deep neural networks

Michael Kohler, Adam Krzyzak, Sophie Langer

Abstract

Deep neural networks (DNNs) achieve impressive results for complicated tasks like object detection on images and speech recognition. Motivated by this practical success, there is now a strong interest in showing good theoretical properties of DNNs. A key challenge is to understand their performance: for which tasks do DNNs perform well, and when do they fail? The aim of this paper is to contribute to the current statistical theory of DNNs. We apply DNNs to high-dimensional data and show that least squares regression estimates based on DNNs are able to achieve dimensionality reduction in case the regression function has locally low dimensionality. Consequently, the rate of convergence of the estimate does not depend on the input dimension d, but on the local dimension d*, and the DNNs are able to circumvent the curse of dimensionality in case d* is much smaller than d. In our simulation study we provide numerical experiments to support our theoretical result and we compare our estimate with other conventional nonparametric regression estimates. The performance of our estimates is also validated in experiments with real data.
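
To make the setting concrete, here is a minimal scikit-learn sketch of the kind of comparison the simulation study describes: a least squares DNN estimate against a conventional nonparametric estimate on a regression function whose local dimension d* = 1 is much smaller than its input dimension d = 10. The specific function, network size, and competitor here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
d, n = 10, 2000                          # ambient input dimension, sample size
X = rng.uniform(-1, 1, size=(n, d))
# A regression function of low local dimensionality: on each half-space it
# depends on a single coordinate, so d* = 1 even though d = 10.
y = np.where(X[:, 0] > 0, np.sin(5 * X[:, 1]), np.cos(5 * X[:, 2]))
y = y + 0.1 * rng.standard_normal(n)

Xtr, Xte, ytr, yte = X[:1500], X[1500:], y[:1500], y[1500:]

# Least squares DNN estimate (MLPRegressor minimizes squared loss) versus a
# conventional nonparametric estimate (k-nearest neighbors).
dnn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                   random_state=0).fit(Xtr, ytr)
knn = KNeighborsRegressor(n_neighbors=5).fit(Xtr, ytr)

print("DNN  test MSE:", mean_squared_error(yte, dnn.predict(Xte)))
print("k-NN test MSE:", mean_squared_error(yte, knn.predict(Xte)))
```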

Wed May 29 2019
Machine Learning
Intrinsic dimension of data representations in deep neural networks
Deep neural networks progressively transform their inputs across multiple layers. What are the geometrical properties of the representations learned by these networks? We study the intrinsic dimensionality of data representations.
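
The intrinsic dimensionality of a representation can be estimated from nearest-neighbor distance ratios alone; below is a minimal sketch of the TwoNN maximum-likelihood estimator (Facco et al., 2017), a common choice for this kind of analysis. Applying it to each layer's activations would trace how the intrinsic dimension changes across the network; the toy dataset here is an assumption for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def twonn_dimension(X):
    # Distances to the two nearest neighbors of each point (column 0 is self).
    dist, _ = NearestNeighbors(n_neighbors=3).fit(X).kneighbors(X)
    mu = dist[:, 2] / dist[:, 1]
    # Maximum-likelihood estimate: d = N / sum(log mu_i).
    return len(X) / np.sum(np.log(mu))

# 3-D data lying on a 1-D curve: the estimate should be close to 1.
t = np.random.rand(2000)
X = np.c_[np.cos(4 * t), np.sin(4 * t), t] + 1e-3 * np.random.randn(2000, 3)
print(twonn_dimension(X))
```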
Mon Feb 15 2016
Neural Networks
Efficient Representation of Low-Dimensional Manifolds using Deep Networks
Deep neural networks can efficiently extract the intrinsic, low-dimensional coordinates of data. The first two layers of a deep network can exactly embed points lying on a monotonic chain. Remarkably, the network can do this using an almost optimal number of parameters.
Mon Aug 05 2019
Machine Learning
Nonparametric Regression on Low-Dimensional Manifolds using Deep ReLU Networks: Function Approximation and Statistical Recovery
Real-world data often exhibit low-dimensional geometric structures and can be viewed as samples near a low-dimensional manifold. This paper studies nonparametric regression of Hölder functions on low-dimensional manifolds using deep ReLU networks.
Tue Nov 21 2017
Machine Learning
Sparse-Input Neural Networks for High-dimensional Nonparametric Regression and Classification
Neural networks are usually not the tool of choice for nonparametric high-dimensional problems. We propose fitting a neural network with a sparse group lasso penalty on the first-layer input weights. This results in a neural net that only uses a small subset of the original features.
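
The penalty itself is simple to state: each input feature's column of first-layer weights forms one group, and the sum of group norms is added to the loss. The PyTorch sketch below is a simplified subgradient version with assumed dimensions and penalty strength; obtaining exact zeros, as the sparse group lasso in the paper does, would require a proximal update rather than plain gradient steps.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Hypothetical setup: 100 input features, of which only two are relevant.
net = nn.Sequential(nn.Linear(100, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam = 1e-2                               # assumed penalty strength, tuned in practice

X = torch.randn(512, 100)
y = torch.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2   # target uses only features 0 and 1

for step in range(2000):
    opt.zero_grad()
    mse = ((net(X).squeeze() - y) ** 2).mean()
    W = net[0].weight                    # first-layer weights, shape (32, 100)
    group_lasso = W.norm(dim=0).sum()    # one group per input feature (column)
    (mse + lam * group_lasso).backward()
    opt.step()

# Feature relevance = column norm of the first-layer weights; the penalty
# drives the columns of irrelevant features toward zero.
relevance = net[0].weight.norm(dim=0)
print("top features:", torch.topk(relevance, 5).indices.tolist())
```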
Tue Jul 20 2021
Machine Learning
Estimation of a regression function on a manifold by fully connected deep neural networks
Estimation of a regression function from independent and identically distributed data is considered. The L2 error, with integration with respect to the distribution of the predictor variable, is used as the error criterion.
Thu Nov 16 2017
Computer Vision
LDMNet: Low Dimensional Manifold Regularized Neural Networks
Deep neural networks have proved very successful on archetypal tasks for which large training sets are available, but when training data are scarce, their performance suffers from overfitting. We propose a new framework, the Low-Dimensional-Manifold-regularized Neural Network (LDMNet), which focuses on the geometry of both the input and output features.