A Gentle Introduction to Dynestyx
This multi-part tutorial is a conceptual complement to the Quickstart. We start from basic Bayesian inference in NumPyro, then show how dynestyx extends it to sample and infer dynamical systems—with a clear separation between what the model is (parameters + dynamics + observations) and how we simulate or compute likelihoods (simulators and filters).
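To make the separation of concerns concrete before diving in, here is a minimal, library-agnostic sketch in plain Python. All names here (`Model`, `DiscreteSimulator`, `step`, `observe`) are illustrative only and are not the dynestyx API; the tutorial parts below introduce the real components.

```python
import random

class Model:
    """What the model *is*: parameters, dynamics, and an observation model."""
    def __init__(self, phi, sigma):
        self.phi, self.sigma = phi, sigma          # parameters

    def step(self, x, rng):
        # dynamics: x_t -> x_{t+1} with Gaussian process noise
        return self.phi * x + rng.gauss(0.0, self.sigma)

    def observe(self, x, rng):
        # observation: noisy measurement of the latent state
        return x + rng.gauss(0.0, 1.0)

class DiscreteSimulator:
    """*How* we simulate: rolls any such model forward in time."""
    def run(self, model, x0, n_steps, seed=0):
        rng = random.Random(seed)
        xs, ys, x = [], [], x0
        for _ in range(n_steps):
            x = model.step(x, rng)
            xs.append(x)
            ys.append(model.observe(x, rng))
        return xs, ys

xs, ys = DiscreteSimulator().run(Model(phi=0.9, sigma=0.5), x0=0.0, n_steps=10)
```

The point of the split is that the same `Model` could equally be handed to a filter that computes a likelihood instead of a simulator that generates trajectories; Parts 3 and 4 develop that idea.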
Contents
- Part 1: NumPyro and the Bayesian workflow — Linear regression with NUTS and ArviZ; generating data with `Predictive`; prior and posterior predictive checks (the Bayesian workflow).
- Part 2: Dynestyx — discrete-time dynamical systems — What dynestyx is; separation of concerns; a first model (discrete-time stochastic volatility); context and handlers; generating data with `DiscreteTimeSimulator`; NUTS + simulator for $p(x, \theta \mid \text{data})$.
- Part 3: Filtering and the marginal log-likelihood — Computing the MLL with the particle filter at $\theta_{\text{mean}}$ and $\theta_{\text{true}}$; switching to the Taylor KF; canonical reference placeholder.
- Part 4: Filtering + NUTS — pseudomarginal inference — Using the filter's MLL inside NUTS to sample only the parameters; when this is "overkill" vs. valuable (with links to be filled).
- Part 5: SVI and warming up NUTS — SVI as a fast alternative; using SVI to initialize NUTS; execution times; rapid prototyping.
- Part 6a: Continuous-time dynamical systems (SDEs) — `ContinuousTimeStateEvolution` (drift, diffusion coefficient, diffusion covariance); `SDESimulator`; Lorenz 63 with partial observations; `LinearGaussianObservation` and $H$; NUTS + EnKF; full-observations deep dive link.
- Part 6b: Continuous-time dynamical systems (ODEs) — `ContinuousTimeStateEvolution` (drift and no diffusion); `ODESimulator`. Probabilistic numerics coming soon!
- Part 7: Hidden Markov Models (HMMs) — Working with categorical state-space models (filtering and Bayesian parameter estimation).