Published on Sun Dec 06 2020

Fourier-domain Variational Formulation and Its Well-posedness for Supervised Learning

Tao Luo, Zheng Ma, Zhiwei Wang, Zhi-Qin John Xu, Yaoyu Zhang

A supervised learning problem is to find a function in a hypothesis function space given values on isolated data points. Inspired by the frequency principle in neural networks, we propose a Fourier-domain variational formulation for the problem. This formulation circumvents the difficulty of imposing constraints on isolated data points.

Abstract

A supervised learning problem is to find a function in a hypothesis function space given values on isolated data points. Inspired by the frequency principle in neural networks, we propose a Fourier-domain variational formulation for the supervised learning problem. This formulation circumvents the difficulty of imposing the constraints of given values on isolated data points in continuum modelling. Under a necessary and sufficient condition within our unified framework, we establish the well-posedness of the Fourier-domain variational problem by identifying a critical exponent that depends on the data dimension. In practice, a neural network is a convenient way to implement our formulation, which automatically satisfies the well-posedness condition.
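To make the shape of such a formulation concrete, here is a hedged sketch in LaTeX; the weight \(\Gamma\) and the power-law form below are illustrative choices consistent with the abstract, not necessarily the exact ones used in the paper:

\[
\min_{h}\ \int_{\mathbb{R}^{d}} \Gamma(\xi)^{-1}\,\bigl|\widehat{h}(\xi)\bigr|^{2}\,\mathrm{d}\xi
\quad\text{subject to}\quad h(x_i)=y_i,\quad i=1,\dots,n,
\]

where \(\widehat{h}\) is the Fourier transform of \(h\) and \(\Gamma(\xi)\) decays at high frequency, so low-frequency functions are preferred, in the spirit of the frequency principle. For a power-law weight \(\Gamma(\xi)\sim(1+|\xi|^{2})^{-s}\), the objective is the squared Sobolev norm \(\|h\|_{H^{s}(\mathbb{R}^{d})}^{2}\), and pointwise constraints on isolated data are meaningful exactly when \(H^{s}\) embeds into continuous functions, i.e. \(s>d/2\); a critical exponent of this kind, depending on the data dimension \(d\), is what governs well-posedness.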

Fri May 28 2021
Machine Learning
Galerkin Neural Networks: A Framework for Approximating Variational Equations with Error Control
Sat Sep 30 2017
Machine Learning
The Deep Ritz method: A deep learning-based numerical algorithm for solving variational problems
The Deep Ritz Method is naturally nonlinear and adaptive, and it has the potential to work in rather high dimensions. The framework is quite simple and fits well with the stochastic gradient descent method used in deep learning.
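As an illustration of how a variational energy pairs with stochastic gradient descent, here is a minimal PyTorch-style sketch of the Deep Ritz idea for the Poisson problem -Δu = 1 on the unit square, with zero Dirichlet boundary data enforced by a penalty; the network size, sampling scheme, and penalty weight are illustrative assumptions, not the paper's code.

import torch
import torch.nn as nn

# Trial function u_theta: a small fully connected network.
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def boundary_points(m):
    # Uniform samples on the boundary of [0,1]^2: clamp one coordinate per point.
    pts = torch.rand(m, 2)
    side = torch.randint(0, 4, (m,))
    pts[side == 0, 0] = 0.0
    pts[side == 1, 0] = 1.0
    pts[side == 2, 1] = 0.0
    pts[side == 3, 1] = 1.0
    return pts

for step in range(5000):
    x = torch.rand(256, 2, requires_grad=True)   # Monte Carlo points for the energy integral
    u = net(x)
    (grad_u,) = torch.autograd.grad(u.sum(), x, create_graph=True)
    # Ritz energy for -Δu = 1: ∫ (|∇u|²/2 - u) dx, approximated by the sample mean.
    energy = (0.5 * grad_u.pow(2).sum(dim=1, keepdim=True) - u).mean()
    penalty = net(boundary_points(128)).pow(2).mean()  # soft u = 0 on the boundary
    loss = energy + 500.0 * penalty
    opt.zero_grad()
    loss.backward()
    opt.step()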
Tue May 07 2019
Machine Learning
Variational training of neural network approximations of solution maps for physical models
The solve-training framework uses a neural network as the ansatz of the solution map and trains the network variationally via loss functions derived from the underlying physical models. The solve-training framework thereby avoids the expensive data preparation of the traditional supervised training procedure.
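To make the variational training of a solution map concrete, here is a toy sketch: a network is trained to map right-hand sides b of a fixed linear system Ax = b to solutions x, using only the residual norm of Ax - b as the loss, with no precomputed solutions. The system (a 1D discrete Laplacian) and all hyperparameters are illustrative assumptions, not the authors' setup.

import torch
import torch.nn as nn

n = 16
# Fixed "physical model": a symmetric tridiagonal 1D Laplacian A.
A = (2.0 * torch.eye(n)
     - torch.diag(torch.ones(n - 1), 1)
     - torch.diag(torch.ones(n - 1), -1))

# Ansatz for the solution map b -> x.
net = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, n))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    b = torch.randn(64, n)          # sampled right-hand sides; no labeled solutions needed
    x = net(b)                      # predicted solutions for the batch
    residual = x @ A - b            # row-wise A x - b (A is symmetric)
    loss = residual.pow(2).mean()   # variational, physics-based loss
    opt.zero_grad()
    loss.backward()
    opt.step()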
Mon Dec 30 2019
Machine Learning
Machine Learning from a Continuous Viewpoint
We present a continuous formulation of machine learning as a problem in the calculus of variations and differential-integral equations. We also present examples of new models, such as the flow-based random feature model, and new algorithms that arise naturally from this continuous formulation.
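One representative instance of such a continuous formulation, sketched here with illustrative notation rather than the paper's exact equations, models a two-layer network as an integral over features and training as a variational problem over the feature distribution:

\[
f(x;\rho)=\int a\,\sigma(w^{\top}x)\,\rho(\mathrm{d}a,\mathrm{d}w),
\qquad
\min_{\rho}\ R(\rho)=\mathbb{E}_{(x,y)}\bigl[(f(x;\rho)-y)^{2}\bigr],
\]

with gradient-descent training passing, in the continuum limit, to a gradient flow of \(R\) over the distribution \(\rho\).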
Wed Mar 11 2020
Neural Networks
hp-VPINNs: Variational Physics-Informed Neural Networks With Domain Decomposition
Thu May 27 2021
Machine Learning
Neural Network Training Using ℓ1-Regularization and Bi-fidelity Data
Neural networks have become popular for surrogate modeling. As these networks are over-parameterized, training often requires a large amount of data. To prevent overfitting and improve generalization, regularization based on the parameters of the network is applied.
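Parameter-based regularization of this kind amounts to adding a penalty on the network weights to the data-fitting loss. A minimal PyTorch-style sketch with an ℓ1 penalty follows; the architecture, the coefficient lam, and the random stand-in data are illustrative assumptions, not the paper's setup.

import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam = 1e-4  # regularization strength (illustrative)

x, y = torch.randn(128, 4), torch.randn(128, 1)  # stand-in for training data

for step in range(1000):
    mse = (net(x) - y).pow(2).mean()                   # data-fitting term
    l1 = sum(p.abs().sum() for p in net.parameters())  # ℓ1 penalty promotes sparse weights
    loss = mse + lam * l1
    opt.zero_grad()
    loss.backward()
    opt.step()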