An augmented sequential MCMC procedure for particle based learning in dynamical systems.
- Author
- Javvad ur Rehman, Muhammad; Dass, Sarat C.; and Asirvadam, Vijanth S.
- Subjects
- *MARKOV chain Monte Carlo, *STATISTICAL sampling, *DYNAMICAL systems, *NONLINEAR statistical models, *NONLINEAR dynamical systems, *PARAMETERS (Statistics), *INSTRUCTIONAL systems
- Abstract
Highlights: • Bayesian parameter learning for non-linear and non-Gaussian dynamical systems. • Impulsive noise, chaotic, and numerically discretized systems are considered. • Applicable when sufficient statistics do not exist for such systems. • Overdispersion caused by introducing artificial parameter evolution is avoided. Abstract: Dynamical systems elicited via state space models consist of two components that evolve over time: a state equation and a measurement equation. This paper addresses Bayesian inference of unknown parameters, or parameter learning, for such systems. Particle-based parameter learning methods form a well-known class of procedures for obtaining inference in state space models, in which a collection of particles represents the posterior distributions of the parameters. However, particle-based learning procedures require the availability of sufficient statistics and tractable posterior distributions of the parameters based on these statistics for sampling, which is not always the case. We address the problem of particle-based learning when sufficient statistics and tractable distributions for sampling are not available. An augmented sequential Markov chain Monte Carlo (ASMCMC) algorithm is developed for obtaining the posterior distribution of the unknown parameters. We provide three guiding examples of nonlinear dynamical systems for which sufficient statistics and tractable distributions for sampling are not available, and illustrate the proposed ASMCMC methodology on these examples using simulated data. [ABSTRACT FROM AUTHOR]
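The state-space setup described in the abstract (a state equation and a measurement equation evolving over time, with particles representing posterior beliefs) can be sketched with a minimal bootstrap particle filter. This is a generic, illustrative sketch only: the specific model (x_t = 0.5·x_{t−1} + noise, y_t = x_t²/20 + noise), noise scales, and function names are assumptions, and it is not the authors' ASMCMC parameter-learning procedure.

```python
import math
import random

def simulate(T, seed=0):
    """Simulate a toy nonlinear state-space model (illustrative choice)."""
    rng = random.Random(seed)
    x, xs, ys = 0.0, [], []
    for _ in range(T):
        x = 0.5 * x + rng.gauss(0.0, 1.0)        # state equation
        y = x * x / 20.0 + rng.gauss(0.0, 0.5)   # measurement equation
        xs.append(x)
        ys.append(y)
    return xs, ys

def bootstrap_filter(ys, n_particles=500, seed=1):
    """Bootstrap particle filter: propagate, weight, resample."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    filtered_means = []
    for y in ys:
        # Propagate each particle through the state equation.
        particles = [0.5 * p + rng.gauss(0.0, 1.0) for p in particles]
        # Weight by the Gaussian measurement likelihood (sd = 0.5).
        w = [math.exp(-0.5 * ((y - p * p / 20.0) / 0.5) ** 2)
             for p in particles]
        total = sum(w) or 1e-300
        w = [wi / total for wi in w]
        filtered_means.append(sum(wi * p for wi, p in zip(w, particles)))
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=w, k=n_particles)
    return filtered_means

xs, ys = simulate(50)
est = bootstrap_filter(ys)
print(len(est))  # one filtered state estimate per observation
```

In the paper's setting the unknown model parameters would also be sampled, which is where the ASMCMC augmentation enters when no sufficient statistics exist; the sketch above fixes the parameters and only tracks the latent state.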
- Published
- 2019