Hands-On Markov Models with Python

You're reading from Hands-On Markov Models with Python

Product type: Book
Published in: Sep 2018
Publisher: Packt
ISBN-13: 9781788625449
Pages: 178
Edition: 1st
Authors (2): Ankur Ankan, Abinash Panda

2D HMM for Image Processing

In this chapter, we will introduce the application of HMMs to image segmentation. For image segmentation, we usually split the given image into multiple blocks of equal size and then perform an estimation for each of these blocks. However, such algorithms usually ignore the contextual information from neighboring blocks. To deal with this issue, 2D HMMs were introduced, which consider the feature vectors to be dependent through an underlying 2D Markovian mesh. We will discuss how these 2D HMMs work and derive parameter estimation algorithms for them, covering the following topics:

  • Pseudo 2D HMMs
  • Introduction to 2D HMMs
  • Parameter learning in 2D HMMs
  • Applications
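The key idea mentioned above, the 2D Markovian mesh, says that the hidden state of block (i, j) depends only on the states of its top and left neighbors. The following is a minimal sketch of sampling a state grid under this assumption; the 2-state transition tensor is randomly generated for illustration, and padding the border with state 0 is a simplifying assumption, not how the book or the paper handles boundaries:

```python
import numpy as np

M = 2                                  # number of hidden states (illustrative)
rng = np.random.default_rng(7)

# a[k, l, m] = P(s_{i,j} = m | s_{i-1,j} = k, s_{i,j-1} = l)
a = rng.random((M, M, M))
a /= a.sum(axis=2, keepdims=True)      # normalize over the next state

def sample_mesh(rows, cols):
    """Sample a state grid row by row under the 2D mesh dependency."""
    s = np.zeros((rows, cols), dtype=int)
    for i in range(rows):
        for j in range(cols):
            top = s[i - 1, j] if i > 0 else 0    # border padded with state 0
            left = s[i, j - 1] if j > 0 else 0
            # The state of block (i, j) depends only on its top and
            # left neighbors: this is the 2D Markovian mesh assumption.
            s[i, j] = rng.choice(M, p=a[top, left])
    return s

grid = sample_mesh(4, 4)
```

Because each block conditions only on two already-sampled neighbors, the grid can be generated in a single raster-order pass, which is what makes parameter estimation for such models tractable.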

Recap of 1D HMM

Let's recap how 1D HMMs work, which we discussed in the previous chapters of this book. We have seen that an HMM is just a process over a Markov chain. At any point in time, an HMM is in one of its possible states, and the next state that the model transitions to depends on the current state and the transition probabilities of the model.

Suppose that the HMM has M possible states, {1, 2, ..., M}, and that the probability of going from some state i to state j is given by ai,j. For such a model, if at time t-1 the model is in state i, then at time t it will be in state j with probability ai,j. This probability is known as the transition probability. We have also defined the observed variable in the model, which depends only on the current state of our hidden variable. We can define the observed variable at time t as ut, so let's say...
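The recap above can be sketched in code: a small 1D HMM with M = 2 states, a transition matrix a[i, j], and an observed variable u_t that depends only on the current hidden state. The specific transition values and Gaussian emission means are made-up numbers for demonstration, not parameters from the book:

```python
import numpy as np

rng = np.random.default_rng(42)

A = np.array([[0.9, 0.1],      # A[i, j] = P(state j at t | state i at t-1)
              [0.2, 0.8]])
means = np.array([0.0, 5.0])   # illustrative emission mean per hidden state

def sample_hmm(n_steps, start_state=0):
    """Sample a hidden state sequence and observations u_t from the HMM."""
    states, observations = [start_state], []
    for _ in range(n_steps):
        current = states[-1]
        # The observation depends only on the current hidden state.
        observations.append(rng.normal(means[current], 1.0))
        # The next state depends only on the current state (Markov property).
        states.append(rng.choice(2, p=A[current]))
    return states[:-1], observations

states, obs = sample_hmm(10)
```

Note that each row of A sums to 1, since from any state the model must transition somewhere.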

2D HMMs

A lot of work has been done on 2D HMMs, but the most recent and well-received work is by Jia Li, Amir Najmi, and Robert Gray in their paper, Image Classification by a Two Dimensional Hidden Markov Model. This section is based on their work. We will start by giving the general algorithm they introduced, and then, in further subsections, we will see how the algorithm works.

Algorithm

The algorithm for image classification is as follows:

  • Training:
    • Divide the training images into non-overlapping blocks with equal size and extract a feature vector for each block
    • Select the number of states for the 2D HMM
    • Estimate the model parameters based on the feature vectors and the training...
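The first training step above, dividing the image into non-overlapping, equal-size blocks and extracting a feature vector per block, can be sketched as follows. The 4x4 block size and the mean/standard-deviation features are illustrative choices, not the block size or features used in the paper:

```python
import numpy as np

def extract_block_features(image, block_size=4):
    """Split a 2D image into equal, non-overlapping blocks and return a
    feature vector (here: mean and std of the pixels) for each block."""
    h, w = image.shape
    rows, cols = h // block_size, w // block_size
    features = np.empty((rows, cols, 2))
    for i in range(rows):
        for j in range(cols):
            block = image[i * block_size:(i + 1) * block_size,
                          j * block_size:(j + 1) * block_size]
            # Feature vector for block (i, j): per-block mean and std.
            features[i, j] = [block.mean(), block.std()]
    return features

image = np.random.default_rng(0).random((16, 16))
feats = extract_block_features(image)   # one 2-vector per 4x4 block
```

The resulting grid of feature vectors is what the 2D HMM models: each block's feature vector is emitted by that block's hidden state, with the states coupled through the Markovian mesh.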

Summary

In this chapter, we started with a short recap of 1D HMMs, which we introduced in the previous chapters. We then introduced the concept of 2D HMMs and derived the various assumptions we make for 2D HMMs to simplify the computations so that they can be applied to image processing tasks. Finally, we introduced a generic EM algorithm for learning the parameters of 2D HMMs.

In the next chapter, we will look at another application of HMMs, in the field of reinforcement learning, and introduce Markov decision processes (MDPs).
