MCMC Configurations

MCMCInference is configured via MCMC config dataclasses, which specify the sampler family, the backend source, and algorithm hyperparameters.

BaseMCMCConfig dataclass

Shared configuration options inherited by all MCMC configs.

You do not instantiate this class directly; use one of the concrete subclasses (NUTSConfig, HMCConfig, SGLDConfig, MALAConfig, AdjustedMCLMCDynamicConfig).

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `num_samples` | `int` | Number of post-warmup samples to return. |
| `num_warmup` | `int` | Number of warmup/burn-in transitions. |
| `num_chains` | `int` | Number of Markov chains to run in parallel. |
| `mcmc_source` | `MCMCSource` | Backend library used for inference. Supported values are `"numpyro"` and `"blackjax"`. |
| `init_strategy` | `callable` | NumPyro initialization strategy used when constructing unconstrained initial parameters. |

NUTSConfig dataclass

Bases: BaseMCMCConfig

No-U-Turn Sampler (NUTS) configuration.
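
As an illustration of how these configs fit together, here is a minimal sketch that mirrors the documented fields with plain Python dataclasses. The field names come from this page; the default values and the exact class definitions are assumptions, and the real library classes may carry additional fields.

```python
from dataclasses import dataclass

# Illustrative mirror of the documented fields; defaults are assumed,
# not taken from the library.
@dataclass
class BaseMCMCConfig:
    num_samples: int = 1000       # post-warmup samples to return
    num_warmup: int = 500         # warmup/burn-in transitions
    num_chains: int = 4           # Markov chains run in parallel
    mcmc_source: str = "numpyro"  # "numpyro" or "blackjax"

@dataclass
class NUTSConfig(BaseMCMCConfig):
    # NUTS adapts its trajectory length, so no extra leapfrog
    # hyperparameters are documented beyond the shared ones.
    pass

config = NUTSConfig(num_samples=2000, num_chains=2, mcmc_source="blackjax")
```

Because NUTS tunes its own trajectory length during warmup, the shared fields are typically all that needs setting for a NUTS run.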

HMCConfig dataclass

Bases: BaseMCMCConfig

Hamiltonian Monte Carlo (HMC) configuration.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `step_size` | `float` | Integrator step size used by the leapfrog solver. |
| `num_steps` | `int` | Number of leapfrog steps per HMC proposal. |
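
To make the two hyperparameters concrete, the sketch below implements a generic leapfrog trajectory parameterized by `step_size` and `num_steps` and checks that it approximately conserves the Hamiltonian on a standard-normal target. This is a standard textbook integrator, not the library's internal implementation.

```python
def leapfrog(q, p, step_size, num_steps, grad_U):
    """One HMC trajectory: num_steps leapfrog steps of size step_size.

    Generic sketch of the integrator the config parameterizes.
    """
    p = p - 0.5 * step_size * grad_U(q)   # initial half step for momentum
    for _ in range(num_steps - 1):
        q = q + step_size * p             # full step for position
        p = p - step_size * grad_U(q)     # full step for momentum
    q = q + step_size * p
    p = p - 0.5 * step_size * grad_U(q)   # final half step for momentum
    return q, p

# Standard normal target: U(q) = q**2 / 2, so grad_U(q) = q.
grad_U = lambda q: q
q1, p1 = leapfrog(q=1.0, p=0.5, step_size=0.1, num_steps=20, grad_U=grad_U)

# A well-tuned leapfrog trajectory approximately conserves
# H(q, p) = U(q) + p**2 / 2, which keeps acceptance rates high.
H0 = 1.0 ** 2 / 2 + 0.5 ** 2 / 2
H1 = q1 ** 2 / 2 + p1 ** 2 / 2
```

Smaller `step_size` reduces the energy error per step but requires more `num_steps` to cover the same trajectory length, which is the trade-off these two fields expose.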

SGLDConfig dataclass

Bases: BaseMCMCConfig

Stochastic Gradient Langevin Dynamics (SGLD) configuration.

SGLD performs first-order Langevin updates using noisy gradients and injected Gaussian noise. In this implementation, gradients are computed on the full dataset (no minibatching), so the method behaves as full-batch Langevin dynamics with an annealed step schedule.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `step_size` | `float` | Base learning rate used in the SGLD schedule. This should generally be small. |
| `schedule_power` | `float` | Power in the polynomial decay schedule \(\epsilon_t = \text{step\_size} \cdot t^{-\text{schedule\_power}}\). Values in \((0.5, 1.0]\) are common for asymptotic convergence. |
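
The decay schedule above can be evaluated directly. The snippet below is a sketch with illustrative values (the specific `step_size` and `schedule_power` are assumptions, chosen only to show the shape of the schedule):

```python
# Step sizes from the polynomial decay schedule documented above:
# eps_t = step_size * t ** (-schedule_power), for steps t = 1, 2, ...
step_size = 1e-4       # illustrative base learning rate
schedule_power = 0.7   # in (0.5, 1.0], per the guidance above

eps = [step_size * t ** (-schedule_power) for t in range(1, 6)]
# The first step uses the base step size; later steps shrink monotonically.
```

Powers in \((0.5, 1.0]\) make the step sizes summable in the sense required by the classical Robbins–Monro conditions, which is why that range is recommended for asymptotic convergence.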