Data Science at Home

MCMC with full conditionals: A New Podcast Episode

March 02, 2016

Markov Chain Monte Carlo with full conditional calculations
At some point, most statistical problems require sampling, which consists of generating observations from a specific distribution.
Prior knowledge and the likelihood are the essential components of Bayesian theory for modelling many real-world problems. Bayesian statistics therefore rests on the fundamental task of sampling from a distribution, however complicated it may be, and then computing summary statistics such as the mean and variance to describe the observations that represent our assumptions (the model).
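As a concrete illustration of sampling with full conditionals (a Gibbs sampler, the technique the episode title refers to), here is a minimal sketch for a standard bivariate normal target with correlation rho, where both full conditionals are known in closed form. The function name and parameters are mine, chosen for the example:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    The full conditionals are known in closed form:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    so each step draws one coordinate conditional on the other.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0            # arbitrary starting point
    sd = np.sqrt(1 - rho**2)   # conditional standard deviation
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        x = rng.normal(rho * y, sd)  # draw x from its full conditional
        y = rng.normal(rho * x, sd)  # draw y from its full conditional
        samples[i] = x, y
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
# Summary statistics of the draws describe the model:
print(samples.mean(axis=0))          # both means should be near 0
print(np.corrcoef(samples.T)[0, 1])  # sample correlation should be near 0.8
```

Once the chain has run, the mean, variance, and correlation of the draws are exactly the kind of summary statistics mentioned above.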

In this episode, we learn how to do this.

In addition, I explain how Hamiltonian Monte Carlo sampling works and why we should all use it whenever we can.
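To give a flavour of what Hamiltonian Monte Carlo does, here is a minimal sketch for a one-dimensional standard normal target: momentum is resampled, the position is evolved with leapfrog integration of Hamiltonian dynamics, and the proposal is accepted or rejected based on the change in total energy. The function name and tuning values (step size, number of leapfrog steps) are mine, for illustration only:

```python
import numpy as np

def hmc_standard_normal(n_samples=2000, step=0.2, n_leapfrog=20, seed=1):
    """Minimal HMC for a 1-D standard normal target.

    Potential energy U(q) = q^2 / 2 (negative log density),
    so the gradient is dU/dq = q. Momentum p ~ N(0, 1).
    """
    rng = np.random.default_rng(seed)
    grad_U = lambda q: q
    q = 0.0
    samples = np.empty(n_samples)
    n_accept = 0
    for i in range(n_samples):
        p = rng.normal()           # resample momentum each iteration
        q_new, p_new = q, p
        # Leapfrog integration: half step for momentum, full steps
        # alternating position/momentum, final half step for momentum.
        p_new -= 0.5 * step * grad_U(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step * p_new
            p_new -= step * grad_U(q_new)
        q_new += step * p_new
        p_new -= 0.5 * step * grad_U(q_new)
        # Metropolis accept/reject on the total energy H = U + K.
        H_old = 0.5 * q**2 + 0.5 * p**2
        H_new = 0.5 * q_new**2 + 0.5 * p_new**2
        if rng.random() < np.exp(H_old - H_new):
            q = q_new
            n_accept += 1
        samples[i] = q
    return samples, n_accept / n_samples

draws, accept_rate = hmc_standard_normal()
print(draws.mean(), draws.std(), accept_rate)
```

Because the leapfrog integrator nearly conserves the energy H, the acceptance rate stays high while the trajectories travel far through the distribution, which is the main reason HMC mixes so well compared with random-walk proposals.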
Following the show notes makes it easier to understand some of the mathematical formulas, which are better read than listened to.

Enjoy the show!