Reading: AIAMA 14.5
We saw last time that exact inference is computationally very difficult. One important difference between logic and probability is that it makes sense to approximate a probability. This opens up methods for approximating the posterior even for very complex networks. We will look at one of the two basic approaches, sampling (variational methods being the other), and two classic sampling algorithms: likelihood weighting and Gibbs sampling.
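To make the idea of approximating a posterior concrete, here is a minimal sketch (using a toy two-node network of my own, not one from the reading): we estimate P(Rain | WetGrass) by drawing samples from the joint distribution and counting, rather than computing the posterior exactly.

```python
import random

# Toy network Rain -> WetGrass (hypothetical numbers for illustration).
P_RAIN = 0.2                            # P(Rain = true)
P_WET_GIVEN = {True: 0.9, False: 0.1}   # P(WetGrass = true | Rain)

def sample():
    """Draw one joint sample (rain, wet) by sampling each node in order."""
    rain = random.random() < P_RAIN
    wet = random.random() < P_WET_GIVEN[rain]
    return rain, wet

def estimate_posterior(n=100_000):
    """Approximate P(Rain = true | WetGrass = true) by counting samples
    in which the evidence (WetGrass = true) holds."""
    rain_and_wet = wet_total = 0
    for _ in range(n):
        rain, wet = sample()
        if wet:
            wet_total += 1
            rain_and_wet += rain
    return rain_and_wet / wet_total

print(estimate_posterior())
```

The exact answer here is 0.18 / 0.26 ≈ 0.692, and the estimate converges to it as the sample count grows. This is plain rejection-style sampling; likelihood weighting and Gibbs sampling are refinements that avoid wasting samples that do not match the evidence.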
Questions you should be able to answer after the reading are: