# Building Probabilistic Graphical Models with Python


- Stretch the limits of machine learning by learning how graphical models provide insight into particular problems, especially in high-dimensional areas such as image processing and NLP
- Solve real-world problems using Python libraries to run inferences using graphical models
- A practical, step-by-step guide that introduces readers to representation, inference, and learning using Python libraries best suited to each task

### Book Details

**Language:** English

**Paperback:** 172 pages [ 235mm x 191mm ]

**Release Date:** June 2014

**ISBN:** 1783289007

**ISBN 13:** 9781783289004

**Author(s):** Kiran R Karkera

**Topics and Technologies:** All Books, Other, Open Source

## Table of Contents

Preface

Chapter 1: Probability

Chapter 2: Directed Graphical Models

Chapter 3: Undirected Graphical Models

Chapter 4: Structure Learning

Chapter 5: Parameter Learning

Chapter 6: Exact Inference Using Graphical Models

Chapter 7: Approximate Inference Methods

Appendix: References

Index

- Chapter 1: Probability
- The theory of probability
- Goals of probabilistic inference
- Conditional probability
- The chain rule
- The Bayes rule
- Interpretations of probability
- Random variables
- Marginal distribution
- Joint distribution
- Independence
- Conditional independence
- Types of queries
- Probability queries
- MAP queries

- Summary
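The conditional probability and Bayes rule topics listed above can be illustrated in a few lines of Python. This is a toy medical-test example with made-up numbers, not code from the book:

```python
# Bayes' rule on a toy diagnosis problem (all probabilities are illustrative).
p_d = 0.01               # prior: P(disease)
p_pos_given_d = 0.9      # likelihood: P(positive test | disease)
p_pos_given_not_d = 0.05 # false positive rate: P(positive test | no disease)

# Marginal probability of a positive test, by the law of total probability.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Posterior via Bayes' rule: P(disease | positive test).
p_d_given_pos = p_pos_given_d * p_d / p_pos
```

Even with a 90% sensitive test, the posterior stays small because the prior is small, which is exactly the kind of probability query the chapter works through.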

- Chapter 2: Directed Graphical Models
- Graph terminology
- Python digression

- Independence and independent parameters
- The Bayes network
- The chain rule

- Reasoning patterns
- Causal reasoning
- Evidential reasoning
- Inter-causal reasoning

- D-separation
- The D-separation example
- Blocking and unblocking a V-structure

- Factorization and I-maps
- The Naive Bayes model
- The Naive Bayes example

- Summary
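The Naive Bayes model covered at the end of the chapter factorizes the joint distribution as a class prior times per-feature likelihoods, assuming features are conditionally independent given the class. A minimal sketch (the word probabilities and class names are invented for illustration):

```python
# Naive Bayes scoring for a two-class, two-word toy problem.
priors = {"spam": 0.3, "ham": 0.7}
likelihoods = {
    ("spam", "free"): 0.8, ("spam", "meeting"): 0.1,
    ("ham", "free"): 0.1,  ("ham", "meeting"): 0.6,
}

def posterior(words):
    """P(class | words) under the naive (conditional independence) assumption."""
    scores = {}
    for c, p in priors.items():
        for w in words:
            p *= likelihoods[(c, w)]  # multiply per-word likelihoods
        scores[c] = p
    z = sum(scores.values())          # normalize over classes
    return {c: s / z for c, s in scores.items()}

post = posterior(["free"])
```

Seeing "free" flips the decision toward "spam" despite the larger "ham" prior, a small example of the evidential reasoning pattern discussed earlier in the chapter.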


- Chapter 3: Undirected Graphical Models
- Pairwise Markov networks
- The Gibbs distribution
- An induced Markov network
- Factorization
- Flow of influence
- Active trail and separation
- Structured prediction
- Problem of correlated features
- The CRF representation
- The CRF example

- The factorization-independence tango
- Summary

- Chapter 4: Structure Learning
- The structure learning landscape
- Constraint-based structure learning
- Part I
- Part II
- Part III
- Summary of constraint-based approaches

- Score-based learning
- The likelihood score
- The Bayesian information criterion score
- The Bayesian score
- Summary of score-based learning

- Summary

- Chapter 5: Parameter Learning
- The likelihood function
- Parameter learning example using MLE
- MLE for Bayesian networks
- Bayesian parameter learning example using MLE
- Data fragmentation
- Effects of data fragmentation on parameter estimation
- Bayesian parameter estimation
- An example of Bayesian methods for parameter learning

- Bayesian estimation for the Bayesian network
- Example of Bayesian estimation
- Summary
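The contrast between MLE and Bayesian parameter estimation outlined above fits in a few lines for a single Bernoulli parameter. The coin-flip data here is made up, not the book's example:

```python
# MLE vs. Bayesian estimation of a coin's bias from toy flip data.
data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 1 = heads, 0 = tails
n, heads = len(data), sum(data)

# Maximum likelihood estimate: simply the empirical frequency of heads.
theta_mle = heads / n

# Bayesian estimate with a uniform Beta(1, 1) prior: the posterior is
# Beta(heads + 1, tails + 1), whose mean is (heads + 1) / (n + 2).
theta_bayes = (heads + 1) / (n + 2)
```

The prior's pseudo-counts pull the Bayesian estimate toward 0.5, which is what makes it better behaved than the MLE when data fragmentation leaves few samples per parameter.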

- Chapter 6: Exact Inference Using Graphical Models
- Complexity of inference
- Real-world issues

- Using the Variable Elimination algorithm
- Marginalizing factors that are not relevant
- Factor reduction to filter evidence
- Shortcomings of the brute-force approach
- Using the Variable Elimination approach

- Complexity of Variable Elimination
- Graph perspective
- Learning the induced width from the graph structure

- The tree algorithm
- The four stages of the junction tree algorithm
- Using the junction tree algorithm for inference
- Stage 1.1 – moralization
- Stage 1.2 – triangulation
- Stage 1.3 – building the join tree
- Stage 2 – initializing potentials
- Stage 3 – message passing

- Summary
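The Variable Elimination steps listed above can be sketched on a tiny chain network A → B → C, where each elimination sums a variable out of a product of factors. The CPT values here are invented for illustration:

```python
# Variable Elimination on the chain A -> B -> C, computing P(C).
p_a = {0: 0.6, 1: 0.4}                                              # P(A)
p_b_given_a = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}  # keyed (a, b)
p_c_given_b = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}  # keyed (b, c)

# Eliminate A: sum_a P(a) * P(b | a) yields a factor over B, i.e. P(B).
p_b = {b: sum(p_a[a] * p_b_given_a[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: sum_b P(b) * P(c | b) yields P(C).
p_c = {c: sum(p_b[b] * p_c_given_b[(b, c)] for b in (0, 1)) for c in (0, 1)}
```

On a chain each intermediate factor stays small, which is why elimination order matters: a poor order can produce intermediate factors exponential in the induced width.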


- Chapter 7: Approximate Inference Methods
- The optimization perspective
- Belief propagation in general graphs
- Creating a cluster graph to run LBP
- Message passing in LBP

- Steps in the LBP algorithm
- Improving the convergence of LBP
- Applying LBP to segment an image
- Understanding energy-based models
- Visualizing unary and pairwise factors on a 3 x 3 grid
- Creating a model for image segmentation

- Applications of LBP

- Sampling-based methods
- Forward sampling
- The accept-reject sampling method
- The Markov Chain Monte Carlo sampling process
- The Markov property
- The Markov chain
- Reaching a steady state
- Sampling using a Markov chain

- Gibbs sampling
- Steps in the Gibbs sampling procedure
- An example of Gibbs sampling

- Summary
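The Gibbs sampling steps above can be sketched for a bivariate normal with correlation rho, where each full conditional is itself a one-dimensional normal. This is a standard textbook sketch, not the book's example; the value of rho and the chain length are arbitrary:

```python
import random

# Gibbs sampling from a standard bivariate normal with correlation rho.
# The full conditionals are x | y ~ N(rho * y, 1 - rho^2), and symmetrically.
rho = 0.8
sigma = (1 - rho ** 2) ** 0.5
random.seed(0)

x, y = 0.0, 0.0
samples = []
for i in range(20000):
    x = random.gauss(rho * y, sigma)  # resample x from its full conditional
    y = random.gauss(rho * x, sigma)  # resample y given the new x
    if i >= 1000:                     # discard burn-in while the chain mixes
        samples.append((x, y))

n = len(samples)
mean_x = sum(s[0] for s in samples) / n
corr = sum(s[0] * s[1] for s in samples) / n  # ~ rho for standardized variables
```

After burn-in, the empirical mean of x is near 0 and the empirical correlation is near rho, illustrating the chain reaching its steady state.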



- Create Bayesian networks and make inferences
- Learn the structure of causal Bayesian networks from data
- Gain insight into the algorithms that run inference
- Explore parameter estimation in Bayes nets with PyMC sampling
- Understand the complexity of running inference algorithms in Bayes networks
- Discover why graphical models can trump powerful classifiers in certain problems

With their increasing prominence in machine learning and data science applications, probabilistic graphical models are a tool that machine learning practitioners can use to discover and analyze structure in complex problems. The variety of tools and algorithms under the PGM framework extends to many domains, such as natural language processing, speech processing, image processing, and disease diagnosis.

You've probably heard of graphical models before, and you're keen to try out new landscapes in the machine learning area. This book gives you enough background information to get started on graphical models, while keeping the math to a minimum.

This is a short, practical guide that helps data scientists understand the concepts of graphical models and try them out using small Python code snippets, without getting too mathematically involved.

If you are a data scientist who knows about machine learning and wants to enhance your knowledge of graphical models, such as Bayes networks, in order to solve real-world problems using Python libraries, this book is for you. It is intended for those who have some Python and machine learning experience, or who are exploring the machine learning field.