You're reading from Applied Supervised Learning with Python

Product type: Book
Published: Apr 2019
Reading level: Intermediate
ISBN-13: 9781789954920
Edition: 1st
Authors (2):

Benjamin Johnston

Benjamin Johnston is a senior data scientist for one of the world's leading data-driven MedTech companies and is involved in the development of innovative digital solutions throughout the entire product development pathway, from problem definition to solution research and development, through to final deployment. He is currently completing his Ph.D. in machine learning, specializing in image processing and deep convolutional neural networks. He has more than 10 years of experience in medical device design and development, working in a variety of technical roles, and holds first-class honors bachelor's degrees in both engineering and medical science from the University of Sydney, Australia.
Read more about Benjamin Johnston

Ishita Mathur

Ishita Mathur has worked as a data scientist for 2.5 years with product-based start-ups working with business concerns in various domains and formulating them as technical problems that can be solved using data and machine learning. Her current work at GO-JEK involves the end-to-end development of machine learning projects, by working as part of a product team on defining, prototyping, and implementing data science models within the product. She completed her masters' degree in high-performance computing with data science at the University of Edinburgh, UK, and her bachelor's degree with honors in physics at St. Stephen's College, Delhi.
Read more about Ishita Mathur


Chapter 3. Regression Analysis

Note

Learning Objectives

By the end of this chapter, you will be able to:

  • Describe regression models and explain the difference between regression and classification problems

  • Explain the concept of gradient descent, how it is used in linear regression problems, and how it can be applied to other model architectures

  • Use linear regression to construct a linear model for data in an x-y plane

  • Evaluate the performance of linear models and use the evaluation to choose the best model

  • Use feature engineering to create dummy variables for constructing more complicated linear models

  • Construct time series regression models using autoregression

Note

This chapter covers regression problems and analysis, introducing us to linear regression as well as multiple linear regression, gradient descent, and autoregression.

Introduction


In the first two chapters, we were introduced to the concept of supervised machine learning in Python and the essential techniques required for loading, cleaning, exploring, and visualizing raw data sources. We discussed the criticality of the correlations between the specified inputs and desired output for the given problem, as well as how the initial data preparation process can sometimes take a lot of the time spent on the entire project.

In this chapter, we will delve into the model building process and will construct our first supervised machine learning solution using linear regression. So, let's get started.

Regression and Classification Problems


We discussed two distinct methods, supervised learning and unsupervised learning, in Chapter 1, Python Machine Learning Toolkit. Supervised learning problems aim to map input information to a known output value or label, but there are two further subcategories to consider: supervised learning problems can be divided into regression and classification problems. Regression problems, which are the subject of this chapter, aim to predict or model continuous values; for example, predicting tomorrow's temperature in degrees Celsius or determining the location of a face within an image. In contrast, classification problems, rather than returning a continuous value, predict membership of one of a specified number of classes or categories. The example supervised learning problem in Chapter 1, Python Machine Learning Toolkit, where we wanted to determine or predict whether a wig was from the 1960s or 1980s, is a good example of...
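To make the distinction concrete, here is a minimal sketch contrasting a regressor (continuous output) with a classifier (class label output) on hypothetical toy data; the data values and variable names are illustrative only and are not taken from the book.

```python
# Toy comparison of regression vs. classification (hypothetical data).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

X = np.array([[1], [2], [3], [4], [5]])                  # a single input feature
y_continuous = np.array([14.1, 15.0, 16.2, 16.9, 18.1])  # e.g. temperature in degrees Celsius
y_class = np.array([0, 0, 0, 1, 1])                      # e.g. 0 = "1960s", 1 = "1980s"

regressor = LinearRegression().fit(X, y_continuous)      # predicts a continuous value
classifier = LogisticRegression().fit(X, y_class)        # predicts a class label

print(regressor.predict([[6]]))   # some continuous value
print(classifier.predict([[6]]))  # a class label, 0 or 1
```

The same input feature feeds both models; only the nature of the target, and therefore the model type, differs.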

Linear Regression


We will start our investigation into regression problems by selecting a linear model. Linear models are a great first choice due to their intuitive nature, yet they also offer strong predictive power, provided the dataset contains some degree of linear or polynomial relationship between the input features and output values. Their intuitive nature often arises from the ability to plot data on a graph and observe a trend, say, the output (the y-axis value) trending positively or negatively with the input (x-axis value). While often not presented as such, the fundamental components of linear regression models are also learned during high school mathematics classes. You may recall that the equation of a straight line, or linear model, is defined as follows:

Figure 3.1: Equation of a straight line

Here, x is the input value and y is the corresponding output or predicted value. The parameters of...
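As a minimal sketch of this idea, the straight-line model y = m*x + c can be fitted to points in the x-y plane with an ordinary least squares fit; the data below is synthetic and is not the dataset used in the chapter.

```python
# Fit the straight-line model y = m*x + c to synthetic points in the x-y plane.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                 # points lying exactly on the line y = 2x + 1

m, c = np.polyfit(x, y, deg=1)    # least squares fit of a degree-1 polynomial
print(m, c)                       # recovers slope m = 2.0 and intercept c = 1.0
```

Because the points lie exactly on a line here, the fit recovers the slope and intercept precisely; with noisy real data, the fit returns the line minimising the squared errors.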

Multiple Linear Regression


We have already covered regular linear regression, as well as linear regression with polynomial terms, and considered training them with both the least squares method and gradient descent. This section of the chapter considers an additional type of linear regression: multiple linear regression, where more than one variable (or feature) is used to construct the model. To examine multiple linear regression, we will use a modified version of the Boston Housing Dataset, available from https://archive.ics.uci.edu/ml/index.php. The modified dataset can be found in the accompanying source code or on GitHub at https://github.com/TrainingByPackt/Supervised-Learning-with-Python and has been reformatted for simplified use. This dataset contains a list of different attributes of properties in the Boston area, including the per capita crime rate by town, the percentage of the population with a lower socio-economic status, as well as the average number of rooms per...
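The mechanics of multiple linear regression can be sketched as follows. The data here is synthetic, standing in for the modified Boston Housing Dataset, and the assumed relationship between the features and the price is made up purely for illustration.

```python
# Multiple linear regression sketch: more than one feature per sample.
# Synthetic stand-in data; the chapter uses a modified Boston Housing Dataset.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
crime_rate = rng.uniform(0, 10, size=100)   # hypothetical per capita crime rate
avg_rooms = rng.uniform(3, 9, size=100)     # hypothetical average rooms per dwelling

# Made-up relationship: price falls with crime rate and rises with room count.
price = 50 - 2.0 * crime_rate + 5.0 * avg_rooms

X = np.column_stack([crime_rate, avg_rooms])  # one column per feature
model = LinearRegression().fit(X, price)
print(model.coef_, model.intercept_)          # recovers [-2.0, 5.0] and 50.0
```

The only change from simple linear regression is that X now has one column per feature; the model learns one coefficient per column plus an intercept.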

Autoregression Models


Autoregression models are a more classical statistical modeling technique used on time series data (that is, any dataset that changes with time) and extend the linear regression techniques covered in this chapter. Autoregression models are commonly used in the economics and finance industries, as they are particularly powerful on time series datasets with a sizeable number of measurements. To reflect this, we will change our dataset to the S&P 500 daily closing prices from 1986 to 2018, which is available in the accompanying source code.

Figure 3.59: S&P 500 Daily Closing Price

The main principle behind autoregression models is that, given enough previous observations, a reasonable prediction for the future can be made; that is, we are essentially constructing a model using the dataset as a regression against itself, hence autoregression. This relationship can be modeled mathematically as a linear equation:

Figure 3.60: First-order autoregression...
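A first-order autoregression can be sketched by regressing the series on its own previous value, y_t = b0 + b1 * y_(t-1). The series below is synthetic, standing in for the S&P 500 closing prices, and the true coefficients (b0 = 5, b1 = 0.95) are chosen arbitrarily for illustration.

```python
# First-order autoregression sketch: regress a series on its own lagged value.
# Synthetic series standing in for the S&P 500 daily closing prices.
import numpy as np

rng = np.random.default_rng(1)
y = np.empty(500)
y[0] = 100.0
for t in range(1, 500):
    # Generate an AR(1) process with true b0 = 5.0 and b1 = 0.95, plus noise.
    y[t] = 5.0 + 0.95 * y[t - 1] + rng.normal(scale=1.0)

# Fit y_t against y_(t-1): a plain straight-line fit on the lagged pairs.
b1, b0 = np.polyfit(y[:-1], y[1:], deg=1)
print(b0, b1)                     # estimates should be near 5.0 and 0.95

next_pred = b0 + b1 * y[-1]       # one-step-ahead forecast from the last value
```

This makes the "regression against itself" idea literal: the lagged series plays the role of the input feature, and the fitted line is the autoregression model.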

Summary


In this chapter, we took our first big leap into constructing machine learning models and making predictions with labeled datasets. We began our analysis by looking at a variety of ways to construct linear models, starting with the exact least squares method, which performs very well when modeling small amounts of data that fit within the available computer memory. We improved the performance of our vanilla linear model using dummy variables, which we created from categorical variables, adding additional features and context to the model. We then used linear regression analysis with a parabolic model to further improve performance, fitting a more natural curve to the dataset. We also implemented the gradient descent algorithm which, while not as precise as the least squares method on our limited dataset, is most powerful when the dataset cannot be processed with the resources available on the system.
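As a recap of the gradient descent idea, here is a minimal sketch that fits the straight-line model y = m*x + c by repeatedly stepping down the gradient of the mean squared error; the data, learning rate, and iteration count are illustrative choices, not the chapter's exact implementation.

```python
# Gradient descent sketch for fitting y = m*x + c by minimising mean squared error.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                 # target line: m = 2.0, c = 1.0

m, c = 0.0, 0.0                   # start from an arbitrary initial guess
lr = 0.05                         # learning rate (step size)
for _ in range(5000):
    error = (m * x + c) - y       # prediction error for each sample
    m -= lr * 2 * np.mean(error * x)  # partial derivative of MSE w.r.t. m
    c -= lr * 2 * np.mean(error)      # partial derivative of MSE w.r.t. c

print(m, c)                       # converges toward m = 2.0, c = 1.0
```

Unlike least squares, which solves for the parameters in one shot, gradient descent only ever touches the data through these per-step gradients, which is why it scales to datasets too large to solve exactly in memory.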

Finally, we investigated the use of autoregression...

