1.10 Summary

We began our Bayesian journey with a very brief discussion of statistical modeling, probabilities, conditional probabilities, random variables, probability distributions, and Bayes' theorem. We then used the coin-flipping problem as an excuse to introduce basic aspects of Bayesian modeling and data analysis. We used this classic toy example to convey some of the most important ideas of Bayesian statistics, such as using probability distributions to build models and represent uncertainties. We tried to demystify the use of priors and put them on an equal footing with other elements that are part of the modeling process, such as the likelihood, or even more meta-questions, such as why we are trying to solve a particular problem in the first place.
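
To make the coin-flipping example concrete, here is a minimal sketch of the conjugate Beta-Binomial update using SciPy. The flat Beta(1, 1) prior and the 3-heads-out-of-10 data are illustrative placeholders, not figures taken from the chapter:

import numpy as np
from scipy import stats

# Illustrative data: 10 coin flips, 3 of them heads (placeholder values)
heads, tails = 3, 7

# Prior: a flat Beta(1, 1) over theta, the probability of heads
prior = stats.beta(1, 1)

# Conjugate update: a Beta prior with a Binomial likelihood gives a Beta posterior
posterior = stats.beta(1 + heads, 1 + tails)

print(f"Posterior mean of theta: {posterior.mean():.3f}")
print(f"94% equal-tailed credible interval: {posterior.ppf([0.03, 0.97]).round(3)}")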

We ended the chapter by discussing the interpretation and communication of the results of a Bayesian analysis. We assume there is a true distribution that in general is unknown (and in principle also unknowable), from which we get a finite sample, whether from an experiment, a survey, an observation, or a simulation. To learn something about the true distribution, given that we have only observed a sample, we build a probabilistic model. A probabilistic model has two basic ingredients: a prior and a likelihood. Using the model and the sample, we perform Bayesian inference and obtain a posterior distribution; this distribution encapsulates all the information about a problem, given our model and data. From a Bayesian perspective, the posterior distribution is the main object of interest and everything else is derived from it, including predictions in the form of a posterior predictive distribution. As the posterior distribution (and any other quantity derived from it) is a consequence of the model and data, the usefulness of Bayesian inferences is restricted by the quality of models and data. Finally, we briefly summarized the main aspects of doing Bayesian data analysis. Throughout the rest of this book, we will revisit these ideas to absorb them and use them as the scaffold of more advanced concepts.
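
As a rough sketch of how predictions follow from the posterior, the lines below draw values of theta from the Beta posterior in the previous snippet and then simulate new coin flips with each draw; the number of draws and the number of new flips are arbitrary choices for illustration:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Posterior over theta from the illustrative Beta-Binomial update above
posterior = stats.beta(1 + 3, 1 + 7)

# Posterior predictive: for each posterior draw of theta, simulate 10 new flips
theta_draws = posterior.rvs(size=1000, random_state=rng)
new_heads = rng.binomial(n=10, p=theta_draws)

# The spread of new_heads reflects both uncertainty about theta and sampling noise
print(f"Expected heads in 10 new flips: {new_heads.mean():.2f}")
print(f"Spread (standard deviation): {new_heads.std():.2f}")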

In the next chapter, we will introduce PyMC, a Python library for Bayesian modeling and probabilistic machine learning. We will also use more features from ArviZ, a Python library for the exploratory analysis of Bayesian models, and PreliZ, a Python library for prior elicitation.
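
As a preview, a minimal PyMC version of the same coin-flipping model might look like the sketch below; the data, seed, and sampler settings are placeholders, and the details are explained in the next chapter:

import numpy as np
import arviz as az
import pymc as pm

# Placeholder data: 1 = heads, 0 = tails
data = np.array([0, 1, 0, 0, 1, 0, 0, 1, 0, 0])

with pm.Model() as coin_model:
    # Prior over the probability of heads
    theta = pm.Beta("theta", alpha=1, beta=1)
    # Likelihood of the observed flips
    y = pm.Bernoulli("y", p=theta, observed=data)
    # Approximate the posterior by sampling
    idata = pm.sample(1000, random_seed=42)

# ArviZ summarizes the posterior (mean, HDI, diagnostics)
print(az.summary(idata))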
