Published on Thu May 23 2019

Neural ODEs with stochastic vector field mixtures

Niall Twomey, Michał Kozłowski, Raúl Santos-Rodríguez

Abstract

It was recently shown that neural ordinary differential equation models cannot solve fundamental and seemingly straightforward tasks even with high-capacity vector field representations. This paper introduces two other fundamental tasks to the set that baseline methods cannot solve, and proposes mixtures of stochastic vector fields as a model class that is capable of solving these essential problems. Dynamic vector field selection is of critical importance for our model, and our approach is to propagate component uncertainty over the integration interval with a technique based on forward filtering. We also formalise several loss functions that encourage desirable properties on the trajectory paths, and of particular interest are those that directly encourage fewer expected function evaluations. Experimentally, we demonstrate that our model class is capable of capturing the natural dynamics of human behaviour, a notoriously volatile application area. Baseline approaches cannot adequately model this problem.
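
As a rough illustration of the model class described above, the sketch below implements a convex mixture of learned vector fields, dy/dt = sum_k w_k(y) f_k(y), integrated with the torchdiffeq solver, together with a generic path regulariser that penalises the vector-field norm along the trajectory (one common way to encourage fewer solver evaluations). All names and sizes (MixtureVectorField, n_components, and so on) are illustrative assumptions, and the state-dependent softmax gate is only a crude stand-in for the forward-filtered component uncertainty described in the paper.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class MixtureVectorField(nn.Module):
    """dy/dt = sum_k w_k(y) * f_k(y): a convex combination of K learned vector fields."""

    def __init__(self, dim, n_components=3, hidden=64):
        super().__init__()
        # One small MLP per mixture component.
        self.fields = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
            for _ in range(n_components)
        ])
        # Gating network over components; a state-dependent softmax is used here
        # as a crude approximation of the forward-filtered component uncertainty.
        self.gate = nn.Linear(dim, n_components)

    def forward(self, t, y):
        weights = torch.softmax(self.gate(y), dim=-1)               # (..., K)
        fields = torch.stack([f(y) for f in self.fields], dim=-1)   # (..., dim, K)
        return torch.einsum("...dk,...k->...d", fields, weights)    # (..., dim)


if __name__ == "__main__":
    torch.manual_seed(0)
    vf = MixtureVectorField(dim=2)
    y0 = torch.randn(8, 2)            # batch of initial states
    t = torch.linspace(0.0, 1.0, 20)  # integration interval
    traj = odeint(vf, y0, t)          # trajectory, shape (20, 8, 2)
    # Generic path regulariser: penalising the vector-field norm along the
    # trajectory encourages "straighter" dynamics and hence fewer function
    # evaluations (not necessarily the exact losses proposed in the paper).
    reg = (vf(t, traj) ** 2).mean()
    print(traj.shape, reg.item())
```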

Related papers

Sun Dec 06 2020
Machine Learning
Estimating Vector Fields from Noisy Time Series
There has been a surge of recent interest in learning differential equation models from time series, but methods in this area typically cannot cope with highly noisy data. To deal with this, we describe a suitable neural network architecture and propose an alternating minimization scheme.

Thu May 16 2019
Machine Learning
Vector Field Neural Networks
This work begins by establishing a mathematical formalization of the different geometrical interpretations of Neural Networks. From this starting point, a new interpretation is explored, using the idea of implicit vector fields moving data as particles in a flow. A new architecture, Vector Field Neural Networks (VFNN), is introduced.

Tue Jun 19 2018
Machine Learning
Neural Ordinary Differential Equations
We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is then computed using a black-box differential equation solver.
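
For readers unfamiliar with this recipe, a minimal sketch using torchdiffeq (the solver package released alongside this paper) is given below; the ODEFunc architecture and dimensions are illustrative placeholders, not the models used in the paper.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint  # backpropagates via the adjoint method


class ODEFunc(nn.Module):
    """Neural network parameterising the derivative of the hidden state."""

    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, t, h):
        return self.net(h)


func = ODEFunc()
h0 = torch.randn(16, 2)        # initial hidden state (plays the role of the input layer)
t = torch.tensor([0.0, 1.0])   # integrate from t = 0 to t = 1
h1 = odeint(func, h0, t)[-1]   # "output layer" = hidden state at the final time
```
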
Sun Sep 20 2020
Neural Networks
TorchDyn: A Neural Differential Equations Library
Continuous-depth learning has recently emerged as a novel perspective on deep learning. It improves performance in tasks related to dynamical systems and density estimation. Unlocking the full potential of such models requires a different set of software tools. TorchDyn is a library dedicated to continuous-depth learning.
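
A minimal usage sketch follows; it assumes the NeuralODE wrapper exposed by recent TorchDyn releases (torchdyn.core.NeuralODE), and since the import path and call signature have changed across versions it should be read as an approximation rather than the definitive API.

```python
# Sketch of continuous-depth modelling with TorchDyn; assumes torchdyn.core.NeuralODE
# from recent releases -- consult the library docs, as the API has evolved.
import torch
import torch.nn as nn
from torchdyn.core import NeuralODE

# Any nn.Module mapping states to derivatives can serve as the vector field.
vector_field = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))
model = NeuralODE(vector_field, solver="dopri5", sensitivity="adjoint")

x0 = torch.randn(32, 2)                # batch of initial conditions
t_span = torch.linspace(0.0, 1.0, 10)  # evaluation times
t_eval, trajectory = model(x0, t_span) # trajectory: (len(t_span), 32, 2)
```
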
Thu Jun 10 2021
Machine Learning
Compositional Modeling of Nonlinear Dynamical Systems with ODE-based Random Features
Modelling phenomena present in highly nonlinear dynamical systems is a challenging task. We present a novel, domain-agnostic approach to tackling this problem. We use compositions of physics-informed random features, derived from ordinary differential equations.

Tue Mar 23 2021
Machine Learning
Neural ODE Processes
Neural ODE Processes (NDPs) are a new class of stochastic processes determined by a distribution over Neural ODEs. NDPs can successfully capture the dynamics of low-dimensional systems from just a few data points.