This post by Mike gives a feel for the content of our report on probabilistic programming by introducing the algorithms and technology that make it possible.
Probabilistic programming enables us to construct and fit Bayesian models in code. Bayesian inference is a principled way to draw conclusions from incomplete or imperfect data, by interpreting that data in light of prior knowledge expressed as probabilities.
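To make that concrete, here is a minimal sketch of what "a Bayesian model in code" can look like, using PyMC3 (one of several probabilistic programming libraries; the data, model, and variable names are illustrative assumptions, not taken from the report):

```python
import numpy as np
import pymc3 as pm

# Hypothetical observations: did each of ten visitors convert? (1 = yes)
clicks = np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0])

with pm.Model():
    # Prior knowledge about the conversion rate (here: uniform over [0, 1])
    rate = pm.Beta("rate", alpha=1, beta=1)
    # Likelihood: each observation is a coin flip with that rate
    pm.Bernoulli("obs", p=rate, observed=clicks)
    # Fit the model by drawing posterior samples
    trace = pm.sample(2000, tune=1000)

print(trace["rate"].mean())  # posterior mean of the conversion rate
```

The point is that the modeler writes down priors and a likelihood, and the library handles inference.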
The problem is that, until recently, the algorithms that make Bayesian methods tractable for real product and business problems have been difficult to implement and computationally expensive to run.
Until recently, the main alternative to this naive approach was Markov chain Monte Carlo (MCMC) sampling, of which Metropolis-Hastings and Gibbs sampling are well-known examples. The post then continues with explanations of Hamiltonian Monte Carlo and the No-U-Turn Sampler, Variational Inference, and Automatic Differentiation.
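As a rough illustration of the MCMC idea (a sketch for this post, not code from the report), here is a bare-bones Metropolis-Hastings sampler in plain NumPy; the target density and tuning constants are arbitrary choices for the example:

```python
import numpy as np

def metropolis_hastings(log_prob, n_samples=5000, step_size=0.5, seed=0):
    """Sample from an unnormalized 1-D distribution given its log density."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = 0.0                                        # arbitrary starting point
    for i in range(n_samples):
        proposal = x + step_size * rng.normal()    # random-walk proposal
        # Accept with probability min(1, p(proposal) / p(x))
        if np.log(rng.uniform()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples[i] = x
    return samples

# Example target: a standard normal, specified by its log density up to a constant
draws = metropolis_hastings(lambda x: -0.5 * x**2)
print(draws.mean(), draws.std())  # should be roughly 0 and 1
```

Samplers like this are simple but can be slow to explore complex posteriors, which is what motivates the more sophisticated algorithms the post goes on to describe.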
[Read More]