Currently pgmpy provides two sampling classes: one implementing algorithms specific to Bayesian networks (`BayesianModel` in pgmpy), namely forward sampling, rejection sampling, and likelihood-weighted sampling; and Gibbs sampling, a Markov chain Monte Carlo (MCMC) algorithm that generates samples from both Bayesian networks and Markov models. Since pgmpy plans to support continuous random variables, it needs inference algorithms (including sampling) that can work with such variables. This proposal deals with adding two more sampling algorithms to pgmpy, namely:
- Hamiltonian/Hybrid Monte Carlo (HMC): A Markov chain Monte Carlo algorithm that proposes future states in the Markov chain by simulating the dynamics of a physical system, rather than by drawing from a proposal distribution.
- No-U-Turn Sampler (NUTS): An extension of Hamiltonian Monte Carlo that eliminates the need to hand-tune the number of leapfrog steps L (a parameter crucial to good performance in HMC).
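To make the HMC idea above concrete, here is a minimal sketch of the algorithm: a momentum variable is resampled each iteration, Hamiltonian dynamics are simulated with a leapfrog integrator for L steps, and the proposal is accepted or rejected with a Metropolis correction. The function `hmc_sample` and its signature are illustrative assumptions, not pgmpy's eventual API.

```python
import numpy as np

def hmc_sample(grad_log_p, log_p, theta0, n_samples, epsilon=0.1, L=20, seed=0):
    """Sketch of Hamiltonian Monte Carlo (hypothetical interface, not pgmpy's).

    grad_log_p: gradient of the log target density
    log_p:      log target density (up to an additive constant)
    epsilon, L: leapfrog step size and number of leapfrog steps
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    samples = []
    for _ in range(n_samples):
        # Resample an auxiliary momentum from a standard normal.
        r = rng.standard_normal(theta.shape)
        theta_new, r_new = theta.copy(), r.copy()
        # Leapfrog integration of the Hamiltonian dynamics for L steps.
        r_new = r_new + 0.5 * epsilon * grad_log_p(theta_new)
        for i in range(L):
            theta_new = theta_new + epsilon * r_new
            if i < L - 1:
                r_new = r_new + epsilon * grad_log_p(theta_new)
        r_new = r_new + 0.5 * epsilon * grad_log_p(theta_new)
        # Metropolis correction using the Hamiltonian H = -log p + kinetic energy.
        h_old = -log_p(theta) + 0.5 * (r @ r)
        h_new = -log_p(theta_new) + 0.5 * (r_new @ r_new)
        if rng.random() < np.exp(h_old - h_new):
            theta = theta_new
        samples.append(theta.copy())
    return np.array(samples)

# Example target: a standard 1-D normal, log p(x) = -x^2/2 up to a constant.
samples = hmc_sample(lambda x: -x, lambda x: -0.5 * (x @ x),
                     theta0=np.array([0.0]), n_samples=2000)
print(samples.mean(), samples.std())
```

The role of L is visible in the leapfrog loop: too few steps produce random-walk behavior, while too many waste gradient evaluations or double back on the trajectory. NUTS sidesteps this by growing the trajectory until it starts to turn back on itself, which is what removes L from the tuning burden.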