Building Statistical Models in Python

You're reading from Building Statistical Models in Python

Product type: Book
Published in: Aug 2023
Publisher: Packt
ISBN-13: 9781804614280
Pages: 420
Edition: 1st
Authors (3): Huy Hoang Nguyen, Paul N Adams, Stuart J Miller

Table of Contents (22)

Preface
Part 1: Introduction to Statistics
  Chapter 1: Sampling and Generalization
  Chapter 2: Distributions of Data
  Chapter 3: Hypothesis Testing
  Chapter 4: Parametric Tests
  Chapter 5: Non-Parametric Tests
Part 2: Regression Models
  Chapter 6: Simple Linear Regression
  Chapter 7: Multiple Linear Regression
Part 3: Classification Models
  Chapter 8: Discrete Models
  Chapter 9: Discriminant Analysis
Part 4: Time Series Models
  Chapter 10: Introduction to Time Series
  Chapter 11: ARIMA Models
  Chapter 12: Multivariate Time Series
Part 5: Survival Analysis
  Chapter 13: Time-to-Event Variables – An Introduction
  Chapter 14: Survival Models
Index
Other Books You May Enjoy

Simple Linear Regression

In previous chapters, we worked with distributions of single variables. Now, we will turn to the relationships between variables. In this chapter and the next, we will investigate the relationship between two or more variables using linear regression. In this chapter, we will discuss simple linear regression within the framework of Ordinary Least Squares (OLS) regression. Simple linear regression is a very useful tool for estimating continuous values from two linearly related variables. We will provide an overview of the intuition and calculations behind regression errors. Next, we will cover the pertinent assumptions of linear regression. After that, we will analyze the OLS output summary in statsmodels, and finally, we will address serial correlation and model validation. The main topics in this chapter are the following:

  • Simple linear regression using OLS
  • Coefficients of correlation...

Simple linear regression using OLS

We will study one of the simplest machine learning models – simple linear regression. We will provide an overview of it within the context of OLS, where the objective is to minimize the sum of squared errors. It is a straightforward concept relating a dependent variable (quantitative response) y to an independent variable x, where their relationship can be approximated by a straight line. Mathematically, a simple linear regression model can be written in the following form:

y = \beta_0 + \beta_1 x + \epsilon

Here, β₀ is the intercept term and β₁ is the slope of the linear model. The error term is denoted by ϵ in the preceding linear model. We can see that, in the ideal case where the error term is zero, β₀ represents the value of the dependent variable y at x = 0. Within the range of the independent variable x, β₁ represents the increase in the outcome y corresponding...
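
Although the code for this section is not shown in the excerpt, the following is a minimal sketch (with simulated data and illustrative variable names, not the book's own example) of how β₀ and β₁ can be estimated by minimizing the sum of squared errors, using the standard closed-form OLS formulas:

import numpy as np

rng = np.random.default_rng(42)

# Simulate data from a known line, y = 2 + 0.5x, plus random noise
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=100)

# Closed-form OLS estimates that minimize the sum of squared errors
beta_1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta_0_hat = y.mean() - beta_1_hat * x.mean()

# Residuals and the sum of squared errors for the fitted line
residuals = y - (beta_0_hat + beta_1_hat * x)
sse = np.sum(residuals ** 2)

print(f"Estimated intercept: {beta_0_hat:.3f}")
print(f"Estimated slope: {beta_1_hat:.3f}")
print(f"Sum of squared errors: {sse:.3f}")

Because the data is simulated from y = 2 + 0.5x plus noise, the estimated intercept and slope should come out close to 2 and 0.5.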

Coefficients of correlation and determination

In this section, we will discuss two related notions – coefficients of correlation and coefficients of determination.

Coefficients of correlation

A coefficient of correlation is a measure of the statistical linear relationship between two variables and can be computed using the following formula:

r = \frac{1}{n-1} \sum_{i=1}^{n} \left( \frac{x_i - \bar{x}}{s_x} \right) \left( \frac{y_i - \bar{y}}{s_y} \right)

The reader can go here – https://shiny.rit.albany.edu/stat/corrsim/ – to simulate the correlation relationship between two variables.
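
As a quick illustration of the formula (a minimal sketch with simulated, positively correlated data; the np.corrcoef call is used here as a convenience and is not code from the book):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 0.8 * x + rng.normal(scale=0.5, size=200)  # positively correlated by construction

# Direct computation of the sample correlation coefficient from the formula
n = len(x)
r_manual = np.sum(((x - x.mean()) / x.std(ddof=1)) * ((y - y.mean()) / y.std(ddof=1))) / (n - 1)

# The same value from NumPy's correlation matrix
r_numpy = np.corrcoef(x, y)[0, 1]

print(f"Manual r: {r_manual:.4f}")
print(f"NumPy r: {r_numpy:.4f}")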

Figure 6.3 – Simulated bivariate distribution

By observing the scatter plots, we can see the direction and strength of the linear relationship between the two variables, as well as any outliers. If the direction is positive (r > 0), then...

Required model assumptions

Like the parametric tests we discussed in Chapter 4, Parametric Tests, linear regression is a parametric method and requires certain assumptions to be met for the results to be valid. For linear regression, there are four assumptions:

  • A linear relationship between variables
  • The normality of the residuals
  • The homoscedasticity of the residuals
  • Independent samples

Let’s discuss each of these assumptions individually.

A linear relationship between the variables

When thinking about fitting a linear model to data, our first consideration should be whether the model is appropriate for the data. When working with two variables, the relationship between the variables should be assessed with a scatter plot. Let’s look at an example. Three scatter plots are shown in Figure 6.6. The data is plotted, and the actual function used to generate the data is drawn over the data points. The leftmost plot shows data exhibiting a...
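
Although Figure 6.6 itself is not reproduced in this excerpt, the following minimal sketch with simulated data illustrates the kind of scatter plots used to judge whether a straight-line model is appropriate:

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 100)

# One relationship that is linear and one that is clearly nonlinear
y_linear = 1.0 + 2.0 * x + rng.normal(scale=1.0, size=x.size)
y_quadratic = 1.0 + x ** 2 + rng.normal(scale=1.0, size=x.size)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].scatter(x, y_linear, alpha=0.6)
axes[0].set_title("Linear: a straight-line model is reasonable")
axes[1].scatter(x, y_quadratic, alpha=0.6)
axes[1].set_title("Nonlinear: a straight-line model is not appropriate")
plt.tight_layout()
plt.show()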

Testing for significance and validating models

Up to this point in the chapter, we have discussed the concepts of the OLS approach to linear regression modeling; the coefficients in a linear model; the coefficients of correlation and determination; and the assumptions required for modeling with linear regression. We will now begin our discussion on testing for significance and model validation.

Testing for significance

To test for significance, let us load the statsmodels macrodata dataset so that we can build a model that tests the relationship between real gross private domestic investment, realinv, and real private disposable income, realdpi:

import numpy as np
import pandas as pd
import seaborn as sns
import statsmodels.api as sm
import matplotlib.pyplot as plt
from statsmodels.nonparametric.smoothers_lowess import lowess

# Load the macrodata dataset shipped with statsmodels as a DataFrame
df = sm.datasets.macrodata.load().data

Least squares regression requires a constant column to be added in order to estimate the intercept. In the least squares equation...
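
Although the rest of this walkthrough is not shown in the excerpt, a minimal sketch of how such a model might be fit (the book's exact code may differ) is to add the constant column with sm.add_constant and regress realinv on realdpi:

# Add a constant column so that OLS estimates an intercept term
X = sm.add_constant(df["realdpi"])
y = df["realinv"]

# Fit the model and inspect coefficients, standard errors, t-statistics, and p-values
results = sm.OLS(y, X).fit()
print(results.summary())

The summary table reports the estimated coefficients together with their standard errors, t-statistics, and p-values, which are the quantities used to test for significance.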

Summary

In this chapter, we provided an overview of simple linear regression between one explanatory variable and one response variable. The topics we covered include the following:

  • The OLS method for simple linear regression
  • Coefficients of correlation and determination and their calculations and significance
  • The assumptions required for least squares regression
  • Methods of analysis for model and parameter significance
  • Model validation

We looked closely at the concept of squared error and how the sum of squared errors is used to build and validate linear regression models. Then, we walked through the four pertinent assumptions required for linear regression to be a stable solution. After that, we provided an overview of four diagnostic plots and their interpretation with respect to assessing issues related to heteroscedasticity, linearity, outliers, and serial correlation. We then walked through an example of using...
