This text provides an introduction to hidden Markov models (HMMs) for the dynamical systems community. It is a valuable text for third- or fourth-year undergraduates studying engineering, mathematics, or science whose coursework includes probability, linear algebra, and differential equations. The book presents algorithms for using HMMs, and it explains the derivation of those algorithms. It presents Kalman filtering as the extension to a continuous state space of a basic HMM algorithm. The book concludes with an application to biomedical signals. This text is distinctive for providing essential introductory material as well as presenting enough of the theory behind the basic algorithms so that the reader can use it as a guide to developing their own variants.
Cons: it demands quite a lot on the dynamics/ODE/PDE and numerical-solutions side if you have only a little training in those areas. In the end I put in some effort and managed to grasp the material, but Fraser's commentary was of virtually no help.
Pros: it helped me clear up a few misconceptions of mine about how HMMs in practice instantiate and simplify a nondeterministic model, by adapting it to the class of cases where the underlying state cannot be measured directly and one only sees observations y(t) produced by some observation function G(x(t)) tied to a state-space model; the observations never feed back into the dynamics, so in practice they do not affect the distribution over future states given the current state. At the very least one profits, above all, from the nice Python code and the basic algorithms. Very good parts are the descriptions of the Viterbi algorithm for finding the most likely state sequence, and of the expectation-maximization algorithms for parameter estimation and the other discrete-state HMM material covered in chapter 2 (see the sketch below). The real-life examples framed as classification tasks are good; the material on nonlinear, non-Gaussian processes is possibly good too, but _noticeably_ harder to follow.
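To make the Viterbi step concrete, here is a minimal sketch (my own, not the book's code) of decoding the most likely state sequence for a small discrete HMM; the transition matrix A, emission matrix B, initial distribution pi, and observation sequence are made-up illustrative values, not taken from the text.

```python
import numpy as np

def viterbi(obs, A, B, pi):
    """Most likely state sequence for a discrete HMM (log-space Viterbi)."""
    n_states = A.shape[0]
    T = len(obs)
    log_A, log_B, log_pi = np.log(A), np.log(B), np.log(pi)

    delta = np.zeros((T, n_states))           # best log-prob of any path ending in state j at time t
    psi = np.zeros((T, n_states), dtype=int)  # back-pointers to the previous state on that path

    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # scores[i, j]: end in i at t-1, move to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]

    # Backtrack from the best final state
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Illustrative two-state, two-symbol example (values are hypothetical)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # state-transition probabilities
B = np.array([[0.7, 0.3],
              [0.1, 0.9]])           # emission probabilities P(y | x)
pi = np.array([0.5, 0.5])            # initial state distribution
obs = [0, 0, 1, 1, 1, 0]             # observed symbol indices

print(viterbi(obs, A, B, pi))        # most likely hidden state sequence
```

Working in log space, as above, avoids the numerical underflow that a naive product of probabilities runs into on long observation sequences; the book's own code and notation may differ.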