Published on Tue Oct 08 2019

Identifying nonlinear dynamical systems with multiple time scales and long-range dependencies

Dominik Schmidt, Georgia Koppe, Zahra Monfared, Max Beutelspacher, Daniel Durstewitz

Abstract

A main theoretical interest in biology and physics is to identify the nonlinear dynamical system (DS) that generated observed time series. Recurrent Neural Networks (RNNs) are, in principle, powerful enough to approximate any underlying DS, but in their vanilla form suffer from the exploding vs. vanishing gradients problem. Previous attempts to alleviate this problem resulted either in more complicated, mathematically less tractable RNN architectures, or strongly limited the dynamical expressiveness of the RNN. Here we address this issue by suggesting a simple regularization scheme for vanilla RNNs with ReLU activation which enables them to solve long-range dependency problems and express slow time scales, while retaining a simple mathematical structure which makes their DS properties partly analytically accessible. We prove two theorems that establish a tight connection between the regularized RNN dynamics and its gradients, illustrate on DS benchmarks that our regularization approach strongly eases the reconstruction of DS which harbor widely differing time scales, and show that our method is also on par with other long-range architectures like LSTMs on several tasks.
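
The abstract does not spell out the regularizer. As a rough, non-authoritative sketch, the code below assumes a piecewise-linear RNN of the form z_t = A z_{t-1} + W ReLU(z_{t-1}) + h and an L2 penalty that pulls a subset of units toward a line-attractor-like configuration (diagonal terms A_ii toward 1, their recurrent weights and biases toward 0), which is one way a "simple regularization scheme" could let those units express very slow time scales. All names, shapes, and weightings are illustrative assumptions, not taken from the paper.

```python
import torch

def relu_rnn_step(z, A_diag, W, h):
    """One step of a hypothetical piecewise-linear RNN:
    z_t = A z_{t-1} + W ReLU(z_{t-1}) + h, with A taken to be diagonal."""
    return A_diag * z + torch.relu(z) @ W.T + h

def slow_unit_penalty(A_diag, W, h, n_reg, tau=1e-2):
    """L2 penalty pulling the first n_reg units toward A_ii = 1, W_ij = 0, h_i = 0,
    i.e. toward a line-attractor-like regime that can carry slow time scales.
    The weighting tau and the targeted parameters are illustrative assumptions."""
    reg = ((A_diag[:n_reg] - 1.0) ** 2).sum()
    reg = reg + (W[:n_reg, :] ** 2).sum()
    reg = reg + (h[:n_reg] ** 2).sum()
    return tau * reg

# Toy usage: total loss = (dummy) prediction error + regularizer.
M, n_reg = 16, 8
A_diag = torch.nn.Parameter(torch.full((M,), 0.5))
W = torch.nn.Parameter(0.1 * torch.randn(M, M))
h = torch.nn.Parameter(torch.zeros(M))

z = torch.randn(M)
z_next = relu_rnn_step(z, A_diag, W, h)
loss = z_next.pow(2).mean() + slow_unit_penalty(A_diag, W, h, n_reg)
loss.backward()  # gradients flow through both the dynamics and the penalty
```

Intuitively, units sitting near a line attractor neither contract nor expand the state along that direction, which would tie slow dynamics to well-behaved gradient flow; this is consistent with, but only a reading of, the "tight connection between the regularized RNN dynamics and its gradients" the abstract alludes to.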

Tue Feb 26 2019
Machine Learning
AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks
Recurrent neural networks have gained widespread use in modeling sequential data. Learning long-term dependencies using these models remains difficult due to exploding or vanishing gradients. Viewing the recurrence as a discretized dynamical system, a special form of recurrent network called the AntisymmetricRNN is proposed under this framework.
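
The snippet does not state the update rule. A commonly quoted form of the AntisymmetricRNN step is h_t = h_{t-1} + eps * tanh((W - W^T - gamma*I) h_{t-1} + V x_t + b), where the (near-)antisymmetric recurrent matrix keeps the Jacobian's eigenvalues close to the imaginary axis and the dynamics close to marginal stability. The NumPy sketch below illustrates that step with made-up sizes and constants; it is not code from the paper.

```python
import numpy as np

def antisymmetric_rnn_step(h, x, W, V, b, eps=0.1, gamma=0.01):
    """One forward-Euler step with a (near-)antisymmetric recurrent matrix:
    W - W.T is antisymmetric, and gamma adds a small diffusion term for stability."""
    M = W - W.T - gamma * np.eye(W.shape[0])
    return h + eps * np.tanh(M @ h + V @ x + b)

rng = np.random.default_rng(0)
d_h, d_x = 8, 3                                   # illustrative sizes
W = 0.1 * rng.standard_normal((d_h, d_h))
V = 0.1 * rng.standard_normal((d_h, d_x))
b = np.zeros(d_h)

h = np.zeros(d_h)
for t in range(100):                              # drive the cell with random inputs
    h = antisymmetric_rnn_step(h, rng.standard_normal(d_x), W, V, b)
print(np.linalg.norm(h))                          # inspect the hidden-state norm
```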
Tue Mar 09 2021
Machine Learning
UnICORNN: A recurrent model for learning very long time dependencies
Designing recurrent neural networks (RNNs) that learn very long time dependencies is challenging on account of the exploding and vanishing gradient problem. To overcome this, we propose a structure-preserving discretization of a Hamiltonian system of second-order ordinary differential equations.
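
The snippet only names the idea: discretize a second-order Hamiltonian ODE in a structure-preserving way. The toy below shows what "structure preserving" buys in the simplest setting, a symplectic (semi-implicit) Euler step for y'' = -dV/dy applied to a harmonic oscillator, whose energy stays bounded instead of drifting as it would under plain explicit Euler. It illustrates the discretization principle only, not the UnICORNN architecture itself.

```python
import numpy as np

def symplectic_euler_step(y, z, grad_V, dt):
    """Semi-implicit (symplectic) Euler for y'' = -dV/dy, written as y' = z, z' = -dV/dy:
    update the velocity with the old position, then the position with the new velocity."""
    z_new = z - dt * grad_V(y)
    y_new = y + dt * z_new
    return y_new, z_new

# Harmonic oscillator: V(y) = 0.5 * y^2, energy E = 0.5 * (y^2 + z^2).
grad_V = lambda y: y
y, z, dt = 1.0, 0.0, 0.1
energies = []
for _ in range(1000):
    y, z = symplectic_euler_step(y, z, grad_V, dt)
    energies.append(0.5 * (y ** 2 + z ** 2))
print(min(energies), max(energies))   # energy oscillates in a narrow band, no drift
```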
Wed Oct 26 2016
Machine Learning
Recurrent switching linear dynamical systems
Insight into systems with complex, nonlinear dynamics can be gained by decomposing their time series into segments that are each explained by simpler dynamic units. These "recurrent" switching linear dynamical systems provide further insight by discovering the conditions under which each unit is deployed, something that traditional SLDS models fail to do.
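
As a rough illustration of the "recurrent" switching idea (and not code from the paper), the toy simulation below picks the active dynamic unit at each step from the previous continuous state via a linear hyperplane, then applies that unit's linear dynamics. In a classical SLDS the switches would instead follow a state-independent Markov chain, which is why it cannot discover the conditions under which each unit is deployed. All matrices and the switching rule are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two simple linear dynamic units (illustrative values): a decay and a spiral.
A = [np.array([[0.95, 0.0], [0.0, 0.95]]),
     np.array([[0.9, -0.4], [0.4, 0.9]])]
b = [np.array([0.0, 0.0]), np.array([0.5, 0.0])]

# "Recurrent" switching: the active unit depends on the previous continuous state
# through the hyperplane r . x + r0 > 0 (hypothetical switching rule).
r, r0 = np.array([1.0, 0.0]), -0.5

x = np.array([2.0, 0.0])
states, modes = [], []
for t in range(200):
    k = int(r @ x + r0 > 0)                       # unit chosen from the previous state
    x = A[k] @ x + b[k] + 0.01 * rng.standard_normal(2)
    states.append(x.copy())
    modes.append(k)
print(sorted(set(modes)))                         # both units get used along the way
```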
Mon Jun 08 2020
Machine Learning
Learning Long-Term Dependencies in Irregularly-Sampled Time Series
Fri Apr 09 2021
Machine Learning
DeepSITH: Efficient Learning via Decomposition of What and When Across Time Scales
Fri Dec 29 2017
Neural Networks
Recent Advances in Recurrent Neural Networks
Recurrent neural networks (RNNs) are capable of learning features and long-term dependencies from sequential and time-series data. A well-trained RNN can, in principle, model any dynamical system. The fundamentals and recent advances are explained, and the open research challenges are introduced.