Published on Mon Nov 26 2018

Frequency Principle in Deep Learning with General Loss Functions and Its Potential Application

Zhi-Qin John Xu

Abstract

Previous studies have shown that deep neural networks (DNNs) with common settings often capture target functions from low to high frequency, which is called the Frequency Principle (F-Principle). It has also been shown that the F-Principle can provide an understanding of the often observed good generalization ability of DNNs. However, previous studies focused on the mean squared error loss, while various loss functions are used in practice. In this work, we show that the F-Principle holds for general loss functions (e.g., mean squared error, cross entropy, etc.). In addition, the F-Principle of DNNs may be applied to develop numerical schemes for problems that benefit from fast convergence of low frequencies. As an example of this potential usage, we apply DNNs to solving differential equations, where conventional methods (e.g., the Jacobi method) are usually slow because they converge from high to low frequency.

Tue May 25 2021
Machine Learning
An Upper Limit of Decaying Rate with Respect to Frequency in Deep Neural Network
Deep neural network (DNN) usually learns the target function from low to high frequency, which is called frequency principle or spectral bias. Below the upper limit of the decaying rate, the DNN interpolates the training data by a function with a certain regularity.
Mon Jan 04 2021
Machine Learning
Frequency Principle in Deep Learning Beyond Gradient-descent-based Training
The Frequency Principle (F-Principle) sheds light on the strengths and weaknesses of DNNs. Previous works examine the F-Principle in gradient-descent-based training. It remains unclear whether gradient-descent-based training is a necessary condition for the F-Principle.
Thu Oct 15 2020
Machine Learning
On the exact computation of linear frequency principle dynamics and its generalization
Recent works show an intriguing phenomenon, the Frequency Principle (F-Principle): deep neural networks (DNNs) fit the target function from low to high frequency during training. This paper provides insight into the training and generalization behavior of DNNs in complex tasks.
Sat Jan 19 2019
Machine Learning
Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks
We study the training process of Deep Neural Networks (DNNs) from the Fourier analysis perspective. We demonstrate that DNNs often fit target functions from low to high frequencies on high-dimensional benchmark datasets such as MNIST/CIFAR10.
Fri Jun 21 2019
Machine Learning
Theory of the Frequency Principle for General Deep Neural Networks
Deep Neural Networks (DNNs) tend to learn a target function from low to high frequencies during the training. The F-Principle has been very useful in providing both qualitative and quantitative understandings of DNNs.
Thu May 20 2021
Machine Learning
Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions
Kronecker neural networks (KNNs) form a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which provides an efficient way of constructing a very wide network while keeping the number of parameters low.