Filters¶
One of the principal functions of a dynamical systems inference engine is filtering, i.e., computing the distribution \(p(x_t \mid y_{1:t}, \theta)\). As a by-product of computing the filtering distribution, we also obtain an estimate of the marginal likelihood \(p(y_{1:T} \mid \theta)\), which is used for parameter inference/system identification. To tell dynestyx that a dynamical system should be processed via a filtering algorithm, we use the Filter class.
Filter dataclass ¶
Bases: BaseLogFactorAdder
Performs Bayesian filtering to compute the filtering distribution \(p(x_t \mid y_{1:t})\) and the log marginal likelihood \(\log p(y_{1:T})\).
A Filter object should be used as a context manager around a call to a model containing a dsx.sample(...) statement, conditioning the dynamical model on observations via a filtering algorithm. The filtering algorithm is selected and dispatched according to the filter_config argument. The resulting marginal log-likelihood is added as a NumPyro factor, allowing for downstream parameter inference.
Examples:
>>> def model(obs_times=None, obs_values=None):
...     dynamics = DynamicalModel(...)
...     return dsx.sample("f", dynamics, obs_times=obs_times, obs_values=obs_values)
>>> def filtered_model(t, y):
...     with Filter(filter_config=KFConfig()):
...         return model(obs_times=t, obs_values=y)
What this does¶
Filtering is the recursive (potentially approximate) computation of the filtering distribution \(p(x_t \mid y_{1:t})\). It also yields the marginal likelihood via the prediction-error decomposition:

\[
p(y_{1:T} \mid \theta) = \prod_{t=1}^{T} p(y_t \mid y_{1:t-1}, \theta),
\]

which in turn can be used to compute the posterior distribution over the parameters, \(p(\theta \mid y_{1:T})\).
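The decomposition above can be checked numerically for a scalar linear-Gaussian model, where each predictive term \(p(y_t \mid y_{1:t-1})\) is Gaussian and computable in closed form by a Kalman filter. The sketch below is standalone (it does not use dynestyx), and all names in it are illustrative:

```python
import math

def kalman_loglik(ys, a=0.9, q=0.1, r=0.2, m0=0.0, p0=1.0):
    """Scalar Kalman filter for x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r),
    with prior x_1 ~ N(m0, p0).

    Returns log p(y_{1:T}), accumulated as sum_t log p(y_t | y_{1:t-1}).
    """
    m, p = m0, p0  # predictive moments of x_t given y_{1:t-1}
    loglik = 0.0
    for y in ys:
        s = p + r  # predictive variance of y_t | y_{1:t-1}
        loglik += -0.5 * (math.log(2 * math.pi * s) + (y - m) ** 2 / s)
        k = p / s  # Kalman gain
        m, p = m + k * (y - m), (1 - k) * p  # update with y_t
        m, p = a * m, a * a * p + q          # predict x_{t+1}
    return loglik
```

For T = 2 this agrees exactly with the density of the jointly Gaussian pair \((y_1, y_2)\) evaluated directly, since the filter's per-step predictive densities are the exact factors of the joint.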
Available Filter Configurations¶
There are several different filters available in dynestyx, each with its own strengths and weaknesses.
What filters are applicable to a given model depends heavily on any special structure of the model (for example, linear and/or Gaussian observations).
For a summary table of all config classes and when to use them, see
Available filter configurations.
Defaults¶
If filter_config=None, the defaults are:

- ContinuousTimeEnKFConfig() for continuous-time models, and
- EKFConfig(filter_source="cuthbert") for discrete-time models.
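The defaults can also be made explicit by passing the corresponding config directly. The sketch below reuses the model from the Examples section; it is illustrative only and assumes the config classes named in this section:

```python
# Sketch: spelling out the default filter choice (or overriding it).
# `model` is the dsx model defined in the Examples section above.

def filtered_model_discrete(t, y):
    # Equivalent to filter_config=None for a discrete-time model:
    with Filter(filter_config=EKFConfig(filter_source="cuthbert")):
        return model(obs_times=t, obs_values=y)

def filtered_model_continuous(t, y):
    # Equivalent to filter_config=None for a continuous-time model:
    with Filter(filter_config=ContinuousTimeEnKFConfig()):
        return model(obs_times=t, obs_values=y)
```

Passing the config explicitly makes the algorithm choice visible in the model code and gives access to its hyperparameters.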
Notes

- If your latent state is discrete (an HMM), you must use HMMConfig.
- What gets recorded to the trace (means/covariances, particles/weights, etc.) depends on filter_config.record_* and the backend implementation.
Attributes:

| Name | Type | Description |
|---|---|---|
| filter_config | BaseFilterConfig \| None | Selects the filtering algorithm and its hyperparameters. If None, a default is chosen based on the model type (see Defaults). |