9.6 Further reading
The following reading recommendations are provided for those who wish to learn more about the recent methods presented in this chapter. They give great insight into current challenges in the field, looking beyond Bayesian neural networks to scalable Bayesian inference more generally:
Deep Ensemble Bayesian Active Learning, Pop and Fulop: This paper demonstrates the advantages of combining deep ensembles with MC dropout to produce better-calibrated uncertainty estimates, as evidenced by applying the method to active learning tasks.
Uncertainty in Neural Networks: Approximately Bayesian Ensembling, Pearce et al.: This paper introduces a simple and effective method for improving the performance of deep ensembles. The authors show that by promoting diversity through a simple adaptation of the loss function, the ensemble produces better-calibrated uncertainty estimates.
Sparse Gaussian Processes Using Pseudo-Inputs, Snelson and Ghahramani: This paper...