Building Statistical Models in Python

You're reading from Building Statistical Models in Python

Product type: Book
Published in: Aug 2023
Publisher: Packt
ISBN-13: 9781804614280
Pages: 420
Edition: 1st Edition
Authors (3): Huy Hoang Nguyen, Paul N Adams, Stuart J Miller

Table of Contents (22 chapters)

Preface
Part 1: Introduction to Statistics
Chapter 1: Sampling and Generalization
Chapter 2: Distributions of Data
Chapter 3: Hypothesis Testing
Chapter 4: Parametric Tests
Chapter 5: Non-Parametric Tests
Part 2: Regression Models
Chapter 6: Simple Linear Regression
Chapter 7: Multiple Linear Regression
Part 3: Classification Models
Chapter 8: Discrete Models
Chapter 9: Discriminant Analysis
Part 4: Time Series Models
Chapter 10: Introduction to Time Series
Chapter 11: ARIMA Models
Chapter 12: Multivariate Time Series
Part 5: Survival Analysis
Chapter 13: Time-to-Event Variables – An Introduction
Chapter 14: Survival Models
Index
Other Books You May Enjoy

Discrete Models

In the previous two chapters, we discussed models for predicting a continuous response variable. In this chapter, we will begin discussing models for predicting discrete response variables. We will start with the probit and logit models for predicting binary outcome variables (categorical variables with two levels). Then, we will extend this idea to predicting categorical variables with multiple levels. Finally, we will look at predicting count variables, which are like categorical variables but take only integer values and have an infinite number of levels.

In this chapter, we’re going to cover the following main topics:

  • Probit and logit models
  • Multinomial logit model
  • Poisson model
  • The negative binomial regression model

Probit and logit models

Previously, we discussed the types of problems that can be solved with regression models in which the dependent variable is continuous, such as house prices or salaries. A natural question is: if the dependent variable is not continuous – in other words, if it is categorical – how would we adapt our regression equation to predict a categorical response variable? For instance, a human resources department may want to conduct an attrition study to predict whether an employee will stay with the company, or a car dealership may want to know whether a car will sell based on its price, model, color, and so on.

First, we will study binary classification. Here, the outcome (dependent variable) is a binary response such as yes/no or do/do not. Let’s look back at the simple linear regression model:

y = β₀ + β₁x + ϵ

Here, the predicted outcome is a line crossing data...
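Since the rest of this section is not reproduced here, the following is a minimal sketch (not the book's own listing) of how probit and logit models can be fit with statsmodels; the synthetic data, coefficient values, and variable names are illustrative assumptions.

# A minimal sketch (not the book's own example): fitting probit and logit
# models to synthetic binary data with statsmodels
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
x = rng.normal(size=500)
# Assumed data-generating process: P(y = 1 | x) follows a logistic curve in x
p = 1 / (1 + np.exp(-(0.5 + 1.5 * x)))
y = rng.binomial(1, p)

X = sm.add_constant(x)                      # add the intercept column
logit_res = sm.Logit(y, X).fit(disp=False)
probit_res = sm.Probit(y, X).fit(disp=False)

print(logit_res.summary())
# Unlike raw linear regression, the predicted probabilities stay in [0, 1]
print(logit_res.predict(X)[:5])
print(probit_res.params)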

Multinomial logit model

In practice, there are many situations where the outcomes (dependent variables) are not binary but have more than two possible values. Multinomial logistic regression can be understood as a generalization of the logit model, which we studied in the previous section. In this section, we will consider a hands-on study of the Iris data using the MNLogit class from statsmodels: https://www.statsmodels.org/dev/generated/statsmodels.discrete.discrete_model.MNLogit.html.

The Iris data (https://archive.ics.uci.edu/ml/datasets/iris) is one of the best-known statistical and machine learning datasets for education. The independent variables are sepal length (in cm), sepal width (in cm), petal length (in cm), and petal width (in cm). The dependent variable is a categorical variable with three levels: Iris Setosa (0), Iris Versicolor (1), and Iris Virginica (2). The following Python code illustrates how to do this using sklearn and statsmodels:

# import packages
import...
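The chapter's full listing is truncated above, so the following is a hedged sketch of how such a model could be fit with MNLogit on the Iris data; the fitting method and options shown are assumptions rather than the authors' exact code.

# A minimal sketch (assumed, not the chapter's truncated listing): fitting a
# multinomial logit model to the Iris data with statsmodels' MNLogit
import statsmodels.api as sm
from sklearn.datasets import load_iris

iris = load_iris()
X = sm.add_constant(iris.data)   # four measurements plus an intercept
y = iris.target                  # 0 = Setosa, 1 = Versicolor, 2 = Virginica

# Setosa is perfectly separable from the other classes, so some coefficients
# can grow large and convergence warnings may appear
model = sm.MNLogit(y, X)
result = model.fit(method="bfgs", maxiter=200, disp=False)

# Coefficients are reported relative to the baseline class (Setosa)
print(result.summary())
print(result.predict(X)[:5])     # one probability column per class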

Poisson model

In the previous section, we discussed models where the response variable was categorical. In this section, we will look at a model for count data. Count data is like categorical data (the categories are integers), but there are an infinite number of levels (0, 1, 2, 3, and so on). We model count data with the Poisson distribution. In this section, we will start by examining the Poisson distribution and its properties. Then, we will model a count variable with explanatory variables using the Poisson model.

The Poisson distribution

The Poisson distribution is given by the following formula:

P(k) = (λ^k e^(-λ)) / k!

Here, λ is the average number of events and k is the number of events for which we would like the probability; P(k) is the probability that k events occur. This distribution is used to calculate the probability of k events occurring in a fixed time interval or a defined space.
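As a quick numeric illustration (not taken from the book), the values λ = 3 and k = 2 below are arbitrary; the snippet evaluates the formula directly and checks it against scipy.stats.poisson.

# Evaluate the Poisson formula directly and compare with scipy
import math
from scipy.stats import poisson

lam, k = 3.0, 2
p_manual = lam**k * math.exp(-lam) / math.factorial(k)
p_scipy = poisson.pmf(k, lam)

print(p_manual)   # ~0.2240
print(p_scipy)    # matches the manual calculation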

The shape...
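Because the remainder of this section is not shown here, the following is a minimal sketch, on synthetic data, of how a count variable with one explanatory variable could be modeled with Poisson regression in statsmodels; the data-generating values are illustrative assumptions.

# A minimal sketch (assumed, not the book's locked example): Poisson regression
# of a count response on one explanatory variable with statsmodels
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=300)
# Assumed model: log(E[y]) = 0.3 + 0.8 * x, so counts grow with x
y = rng.poisson(np.exp(0.3 + 0.8 * x))

X = sm.add_constant(x)
poisson_res = sm.GLM(y, X, family=sm.families.Poisson()).fit()

print(poisson_res.summary())
# Coefficients are on the log scale; exponentiate for multiplicative effects
print(np.exp(poisson_res.params))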

The negative binomial regression model

Another useful approach to discrete regression is the log-linear negative binomial regression model, which uses the negative binomial probability distribution. At a high level, negative binomial regression is useful for over-dispersed count data, where the conditional mean of the count is smaller than its conditional variance. Over-dispersion occurs when the variance of the target variable is greater than the mean assumed by the model; in a regression model, that mean is the regression line. We decide whether to use the negative binomial model by analyzing the target variable's counts (mean versus variance), and we supply a measure of over-dispersion to the negative binomial model to adjust for it, which we will discuss here.
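As a small illustration of this mean-versus-variance check (not taken from the book), the following snippet generates an over-dispersed count variable with arbitrary parameters and compares its sample mean and variance.

# Compare the sample mean and variance of a synthetic over-dispersed count
import numpy as np

rng = np.random.default_rng(1)
# Negative binomial draws are over-dispersed relative to a Poisson
counts = rng.negative_binomial(n=2, p=0.3, size=1000)

print("mean:    ", counts.mean())
print("variance:", counts.var(ddof=1))
# A variance substantially greater than the mean suggests over-dispersion,
# pointing toward a negative binomial rather than a Poisson model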

It is important to note that the negative binomial model is not simply for modeling discrete data in general, but specifically count data associated with a fixed number of random trials...
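Since the rest of the section is truncated, the following is a hedged sketch of fitting a log-linear negative binomial regression with statsmodels on synthetic over-dispersed counts; the simulated data and fitting options are assumptions, not the authors' code.

# A minimal sketch (assumed): log-linear negative binomial regression with
# statsmodels on synthetic over-dispersed count data
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=500)
mu = np.exp(0.5 + 0.7 * x)                 # assumed log-linear mean
# A gamma-mixed Poisson produces over-dispersed counts
y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=500))

X = sm.add_constant(x)
nb_res = sm.NegativeBinomial(y, X).fit(maxiter=200, disp=False)

# alpha is the estimated dispersion parameter; alpha > 0 indicates
# over-dispersion that a plain Poisson model would miss
print(nb_res.summary())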

Summary

In this chapter, we explained the issue of negative raw probabilities that arise when a binary classification probability model is built strictly on linear regression, even though probabilities are expected to fall in the range [0, 1]. We provided an overview of the log-odds ratio, and of probit and logit modeling using the cumulative distribution functions of the standard normal and logistic distributions, respectively. We also demonstrated methods for applying logistic regression to solve binary and multinomial classification problems. Lastly, we covered count-based regression using the log-linear Poisson and negative binomial models, which can also be logically extended to rate data without modification, and provided examples of their implementations.

In the following chapter, we will introduce conditional probability using Bayes’ theorem in addition to dimension reduction and classification modeling using linear discriminant analysis and...
