Outline

Lecturer: Stéphane Derrode, Professor at Ecole Centrale de Lyon, Département Mathématiques et Informatique.

The objective of the course is to present several hidden Markov models for the analysis of time series. In particular, we will focus on the optimal classification of data in a Bayesian decision setting, and on optimal filtering.

1- Classification: Hidden Markov chain (HMC) and beyond

This first part will focus on hidden Markov models with discrete states and the main algorithms for time series classification: Bayesian decision rules, Baum’s forward-backward procedure, the Viterbi algorithm, and parameter learning based on the “Expectation-Maximization” (EM) principle.
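To fix ideas, here is a minimal sketch (in Python/NumPy) of the normalized forward-backward recursions in the spirit of Devijver’s formulation; the variable names (pi, A, B) and array shapes are illustrative assumptions, not the course’s notation.

    import numpy as np

    def forward_backward(pi, A, B):
        """Normalized forward-backward recursions for a discrete-state HMC.
        pi : (K,)   initial state distribution
        A  : (K, K) transition matrix, A[i, j] = P(x_{t+1} = j | x_t = i)
        B  : (T, K) observation likelihoods, B[t, k] = p(y_t | x_t = k)
        Returns gamma[t, k] = P(x_t = k | y_1, ..., y_T).
        """
        T, K = B.shape
        alpha = np.zeros((T, K))   # filtered state probabilities
        beta = np.ones((T, K))     # normalized backward variables
        c = np.zeros(T)            # normalization constants

        alpha[0] = pi * B[0]
        c[0] = alpha[0].sum()
        alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[t]
            c[t] = alpha[t].sum()
            alpha[t] /= c[t]

        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[t + 1] * beta[t + 1]) / c[t + 1]

        return alpha * beta        # posterior marginals, each row sums to 1

The Maximum Posterior Mode (MPM) classification then selects, at each time, the state maximizing this posterior marginal, whereas the Viterbi algorithm maximizes the joint posterior of the whole state sequence.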

Then, to make the link with the current state of research on these models, I intend to introduce the recent “pairwise and triplet Markov chain” models, which are strict extensions of the HMC model and to which the previous algorithms can be extended.

2- Filtering: From Kalman to particle filters

The second part will extend this Markovian perspective to the filtering and smoothing of time series, starting with linear Gaussian dynamic systems (Kalman filter/smoother). I will also discuss the algorithms proposed in the literature to address the numerical robustness problems related to the computation of the covariance matrices.
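As a reference point for this part, below is a minimal Kalman filter sketch for a time-invariant linear Gaussian model; the matrix names (F, Q, H, R) are illustrative, and the Joseph-form covariance update is shown as one classical answer to the numerical robustness issue mentioned above.

    import numpy as np

    def kalman_filter(y, F, Q, H, R, x0, P0):
        """Kalman filter for x_t = F x_{t-1} + w_t, y_t = H x_t + v_t,
        with w_t ~ N(0, Q) and v_t ~ N(0, R). y has shape (T, d_y).
        Returns the filtered means and covariances."""
        T, d_x = y.shape[0], x0.shape[0]
        x, P = x0.copy(), P0.copy()
        means, covs = np.zeros((T, d_x)), np.zeros((T, d_x, d_x))
        for t in range(T):
            # Prediction step
            x = F @ x
            P = F @ P @ F.T + Q
            # Correction step
            S = H @ P @ H.T + R                  # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ (y[t] - H @ x)
            I_KH = np.eye(d_x) - K @ H
            P = I_KH @ P @ I_KH.T + K @ R @ K.T  # Joseph form: keeps P symmetric positive semi-definite
            means[t], covs[t] = x, P
        return means, covs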

Then, I intend to introduce solutions for working with non-linear and/or non-Gaussian dynamic systems. The course will present the principle of MCMC (Markov Chain Monte Carlo) methods, and in particular the particle filtering technique. Depending on the progress of the course, I may build on work I have recently done on some extensions of these models (e.g. jump Markov models).
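For illustration, here is a minimal sketch of the bootstrap (sampling-importance-resampling) particle filter; the three user-supplied functions are assumptions standing in for the actual state-space model of interest.

    import numpy as np

    def bootstrap_particle_filter(y, sample_x0, propagate, likelihood, n=1000, rng=None):
        """Bootstrap particle filter for a generic state-space model.
        sample_x0(n, rng)   -> (n, d_x) initial particles
        propagate(x, rng)   -> (n, d_x) particles drawn from p(x_t | x_{t-1})
        likelihood(y_t, x)  -> (n,)     values of p(y_t | x_t) for each particle
        Returns the weighted-mean state estimate at each time step."""
        rng = rng or np.random.default_rng(0)
        x = sample_x0(n, rng)
        estimates = np.zeros((len(y), x.shape[1]))
        for t in range(len(y)):
            x = propagate(x, rng)                # propagate particles through the dynamics
            w = likelihood(y[t], x)              # weight by the observation likelihood
            w = w / w.sum()
            estimates[t] = w @ x                 # posterior mean estimate
            x = x[rng.choice(n, size=n, p=w)]    # multinomial resampling
        return estimates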

References

  • P. A. Devijver, Baum’s forward-backward algorithm revisited, Pattern Recognition Letters, Vol. 3(6), pp. 369-373, 1985.
  • L. R. Rabiner, A tutorial on hidden Markov models and selected applications in speech recognition, Proceedings of the IEEE, pp. 257-286, 1989.
  • L. E. Baum, T. Petrie, G. Soules, and N. Weiss, A Maximization Technique Occurring in the Statistical Analysis of Probabilistic Functions of Markov Chains, The Annals of Mathematical Statistics, Vol. 41(1), pp. 164-171, 1970.
  • S. Derrode, and W. Pieczynski, Signal and image segmentation using pairwise Markov chains, IEEE Trans. on Signal Processing, Vol. 52(9), pp. 2477-2489, 2004.
  • F. Desbouvries, and W. Pieczynski, Triplet Markov Models and Kalman filtering, Comptes Rendus de l’Académie des Sciences – Mathématique, Série I, Vol. 336(8), pp. 667-670, 2003.
  • O. Cappé, E. Moulines and T. Rydén, Inference in Hidden Markov Models, Springer, 2005.
  • A. Doucet and A. Johansen, A Tutorial on Particle Filtering and Smoothing: Fifteen Years Later, in The Oxford Handbook of Nonlinear Filtering (D. Crisan and B. Rozovskii, Eds.), Oxford Univ. Press, London, U.K., 2011.
  • I. Gorynin, S. Derrode, E. Monfrini, and W. Pieczynski, Fast smoothing in switching approximations of non-linear and non-Gaussian models, Computational Statistics and Data Analysis, Vol. 114, pp. 38–46, 2017.

Prerequisites

Ideally, students should have a working knowledge of basic probability, as well as basic linear algebra and calculus. In any case, notes will be provided during the course to refresh these notions.