Published on Sun Jun 23 2019

Inferring Latent Dimension of Linear Dynamical System with Minimum Description Length

Yang Li

Abstract

Time-invariant linear dynamical systems arise in many real-world applications, and their usefulness is widely acknowledged. A practical limitation of this model is that its latent dimension, which has a large impact on model capability, must be specified manually. It can be shown that a lower-order model class is fully nested within a higher-order class, so the corresponding likelihood is nondecreasing in the latent dimension. Hence, criteria built on the likelihood alone are not appropriate for model selection. This paper addresses the issue and proposes a criterion for linear dynamical systems based on the principle of minimum description length. The latent structure, which was omitted in previous work, is explicitly considered in this newly proposed criterion. Our work extends the principle of minimum description length and demonstrates its effectiveness in model training tasks. Experiments on both univariate and multivariate sequences confirm the good performance of the proposed method.
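To make the model-selection idea concrete, here is a minimal sketch of a generic two-part MDL score for choosing the latent dimension of an LDS. This is an illustration of the principle only, not the paper's exact criterion (which additionally encodes the latent structure); the parameter counting assumes a standard LDS parameterization, and the negative log-likelihood values are hypothetical.

```python
import math

def lds_param_count(k, p):
    # Free parameters of a time-invariant LDS with latent dim k, obs dim p:
    # transition A (k*k), emission C (p*k), symmetric process noise Q
    # (k(k+1)/2), symmetric observation noise R (p(p+1)/2), initial mean (k).
    return k * k + p * k + k * (k + 1) // 2 + p * (p + 1) // 2 + k

def mdl_score(neg_log_lik, k, p, T):
    # Two-part code length: data cost plus (log T)/2 nats per parameter.
    return neg_log_lik + 0.5 * lds_param_count(k, p) * math.log(T)

# Hypothetical negative log-likelihoods for T = 500 observations of a
# 4-dimensional sequence. They decrease with k, since lower-order model
# classes are nested in higher-order ones, but the description-length
# penalty eventually dominates.
T, p = 500, 4
nll = {1: 3200.0, 2: 2900.0, 3: 2850.0, 4: 2840.0, 5: 2835.0}
best_k = min(nll, key=lambda k: mdl_score(nll[k], k, p, T))
```

With these illustrative values, the likelihood alone would always favor the largest `k`, while the MDL score selects an intermediate dimension where the improvement in fit no longer pays for the extra parameters.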

Wed Sep 25 2019
Machine Learning
The Dynamical Gaussian Process Latent Variable Model in the Longitudinal Scenario
Dynamical Gaussian Process Latent Variable Models provide an elegant non-parametric framework for learning low-dimensional representations of high-dimensional time series. We demonstrate the use of this approach on synthetic data as well as a human motion-capture data set.
Tue Jun 16 2020
Machine Learning
Learning Dynamics Models with Stable Invariant Sets
Invariance and stability are essential notions in the study of dynamical systems. It is of great interest to learn a dynamics model with a stable invariant set. We propose defining such a set as a primitive shape in a latent space.
Wed Feb 21 2018
Machine Learning
Nonparametric Bayesian Sparse Graph Linear Dynamical Systems
A nonparametric Bayesian sparse graph linear dynamical system (SGLDS) is proposed to model sequentially observed multivariate data. SGLDS uses the Bernoulli-Poisson link together with a gamma process to generate an infinite dimensional sparse random graph to model state transitions.
Wed Aug 21 2019
Machine Learning
Data-driven model reduction, Wiener projections, and the Koopman-Mori-Zwanzig formalism
Model reduction methods aim to describe complex dynamic phenomena using only relevant dynamical variables. In the absence of special dynamical features such as scale separation or symmetries, the time evolution of these variables typically exhibits memory effects.
Thu May 23 2019
Machine Learning
Unsupervised Discovery of Temporal Structure in Noisy Data with Dynamical Components Analysis
Linear dimensionality reduction methods are commonly used to extract low-dimensional structure from high-dimensional data. We introduce Dynamical Components Analysis (DCA), which discovers a subspace of high-dimensional time series data with maximal predictive information. DCA robustly extracts dynamical structure in noisy data.
Mon Mar 01 2021
Machine Learning
Operator inference of non-Markovian terms for learning reduced models from partially observed state trajectories
This work introduces a non-intrusive model reduction approach for learning reduced models from partially observed state trajectories of high-dimensional dynamical systems. The proposed approach compensates for the loss of information due to the partially observed states by constructing non-Markovian reduced models.