2.4 Summary
In this chapter, we covered some of the fundamental concepts and methods behind Bayesian inference. First, we reviewed Bayes’ theorem and the fundamentals of probability theory, which allowed us to formalize the concept of uncertainty and to see how it applies to the predictions of ML models. Next, we introduced sampling and an important class of algorithms: Markov Chain Monte Carlo (MCMC) methods. Lastly, we covered Gaussian processes and illustrated the crucial concept of well-calibrated uncertainty. These key topics provide the necessary foundation for the content that follows; however, we encourage you to explore the recommended reading materials for a more comprehensive treatment of the topics introduced in this chapter.
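As a compact reminder of the sampling ideas above, the sketch below implements random-walk Metropolis, one of the simplest MCMC algorithms, targeting a toy standard normal distribution. The function and parameter names (`metropolis_hastings`, `log_target`, `step`) are our own illustration, not code introduced earlier in the chapter.

```python
import numpy as np

def log_target(x):
    # Unnormalized log-density of a standard normal: our toy target.
    return -0.5 * x**2

def metropolis_hastings(log_p, n_samples=20000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
        samples[i] = x
    return samples

samples = metropolis_hastings(log_target)
print(samples.mean(), samples.std())  # should approach 0 and 1
```

With enough samples, the empirical mean and standard deviation of the chain approximate those of the target distribution, which is exactly the property that makes MCMC useful for the posterior distributions discussed in this chapter.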
In the next chapter, we will see how DNNs have changed the landscape of machine learning over the last decade, exploring the tremendous advantages offered by deep learning, and the motivation behind the development of BDL methods.