You're reading from Deep Learning with MXNet Cookbook

Product type: Book
Published in: Dec 2023
Reading level: Beginner
Publisher: Packt
ISBN-13: 9781800569607
Edition: 1st Edition
Author: Andrés P. Torres

Andrés P. Torres is the Head of Perception at Oxa, a global leader in industrial autonomous vehicles, where he leads the design and development of state-of-the-art algorithms for autonomous driving. Before that, he had a stint as an advisor and Head of AI at Maekersuite, an early-stage content generation startup, where he developed several AI-based algorithms for mobile phones and the web. Prior to this, he was a Software Development Manager at Amazon Prime Air, developing software to optimize operations for autonomous drones.

Solving Classification Problems

In the previous chapters, we learned how to set up and run MXNet, how to work with Gluon and DataLoader, and how to visualize datasets for regression, classification, image, and text problems. We also discussed the different learning methodologies. In this chapter, we are going to focus on supervised learning with classification problems. We will learn why these problems are suitable for deep learning models with an overview of the equations that define these problems. We will learn how to create suitable models for them and how to train them, emphasizing the choice of hyperparameters. We will end each section by evaluating the models according to our data, as expected in supervised learning, and we will look at the different evaluation criteria for classification problems.

Specifically, we will cover the following recipes:

  • Understanding math for classification models
  • Defining loss functions and evaluation metrics for classification
  • ...

Technical requirements

Apart from the technical requirements specified in the Preface, the following technical requirements apply:

  • Ensure that you have completed the first recipe, Installing MXNet, Gluon, GluonCV and GluonNLP, from Chapter 1, Up and Running with MXNet.
  • Ensure that you have completed the second recipe, Toy dataset for classification – Loading, Managing, and Visualizing Iris Dataset, from Chapter 2, Working with MXNet and Visualizing Datasets: Gluon and DataLoader.
  • Most of the concepts for the model, the loss and evaluation functions, and the training were introduced in Chapter 3, Solving Regression Problems. Furthermore, as we will see in this chapter, classification can be seen as a special case of regression. Therefore, it is strongly recommended to complete Chapter 3 first.

The code for this chapter can be found at the following GitHub URL: https://github.com/PacktPublishing/Deep-Learning-with-MXNet-Cookbook/tree/main/ch04.

Furthermore...

Understanding math for classification models

As we saw in the previous chapter, classification problems are supervised learning problems whose output is a class from a set of classes (categorical assignments) – for example, the iris class of a flower.

As we will see throughout this recipe, classification models can be seen as special cases of regression models. We will start by exploring a binary classification model, that is, a model that outputs one of two classes. We will label these classes [0, 1] for simplicity.

The simplest model we can use for such a binary classification problem is a linear regression model. This model outputs an unbounded number; therefore, to adapt the output to our new classification criteria, we will replace the activation function with a more suitable one.
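The step from regression to binary classification can be sketched in a few lines. The following is a minimal NumPy illustration, not the book's Gluon code: a hypothetical linear model produces a raw regression score, and the sigmoid activation squashes it into (0, 1) so it can be read as the probability of class 1.

```python
import numpy as np

def sigmoid(z):
    """Squash a real-valued score into the (0, 1) range."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical linear model: raw_score = w . x + b, an unbounded regression output
w = np.array([0.5, -1.2])
b = 0.1

x = np.array([2.0, 0.5])            # one input feature vector
raw_score = np.dot(w, x) + b        # unbounded, as in regression
probability = sigmoid(raw_score)    # in (0, 1), read as P(class = 1)
predicted_class = int(probability > 0.5)
```

Thresholding the probability at 0.5 turns the continuous regression output into a hard class label, which is exactly the sense in which binary classification is a special case of regression.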

As in the previous recipes, we will use a neural network as our model, and we will solve the iris dataset prediction problem we introduced in the second recipe, Toy dataset for classification...

Defining loss functions and evaluation metrics for classification

In the previous recipe, we defined our input features, described our model, and initialized it. At that point, we passed the feature vector of a flower to predict its iris species, calculated the output, and compared it against the expected class.

We also showed how those preliminary results did not represent a proper evaluation. In this recipe, we will explore the topic of evaluating our classification models.

Furthermore, we will also understand which loss functions fit best for binary and multi-class classification problems.
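As a preview of the two losses discussed in this recipe, here is a plain-NumPy sketch of binary cross-entropy (for two classes) and softmax cross-entropy (for multi-class problems such as the iris dataset). This is illustrative code, not the recipe's implementation; in Gluon, these losses are provided by the `SigmoidBinaryCrossEntropyLoss` and `SoftmaxCrossEntropyLoss` classes in `mxnet.gluon.loss`.

```python
import numpy as np

def binary_cross_entropy(p, y):
    """Binary classification loss: p = predicted P(class = 1), y in {0, 1}."""
    eps = 1e-12                      # avoid log(0)
    p = np.clip(p, eps, 1.0 - eps)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def softmax_cross_entropy(logits, y):
    """Multi-class loss: logits holds one score per class, y is the true class index."""
    z = logits - logits.max()        # shift for numerical stability
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[y]

# A confident correct prediction costs little; a confident wrong one costs much more.
low_loss  = binary_cross_entropy(0.9, 1)   # true class 1 predicted with P = 0.9
high_loss = binary_cross_entropy(0.1, 1)   # true class 1 predicted with P = 0.1
```

Note the asymmetry: the penalty grows without bound as the predicted probability of the true class approaches zero, which is what pushes the model toward confident, correct predictions during training.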

Getting ready

Loss functions and evaluation functions need to satisfy the same properties described in Chapter 3, Solving Regression Problems, in the second recipe, Defining loss functions and evaluation metrics for regression; therefore, I recommend reading that chapter first for a more thorough understanding.

We will start developing our topics by analyzing the binary...

Training for classification models

In this recipe, we will visit the basic concepts of training a model to solve a classification problem. We will apply them to optimize the classification model we previously defined in this chapter, combined with the usage of the loss functions and evaluation metrics we discussed.

We will predict the iris class of flowers using the dataset seen in the second recipe, Toy dataset for classification – Loading, Managing, and Visualizing Iris Dataset, from Chapter 2, Working with MXNet and Visualizing Datasets: Gluon and DataLoader.

Getting ready

In this recipe, we will follow a similar pattern to Chapter 3, Solving Regression Problems, in the third recipe, Training for regression models, so it will be helpful to revisit the concepts of the loss function, optimizer, dataset split, epochs, and batch size.

How to do it...

In this recipe, we will create our own training loop and we will evaluate how each hyperparameter influences...
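To fix ideas, here is a self-contained NumPy sketch of such a training loop for binary classification, under hypothetical toy data. The recipe itself uses Gluon (with `autograd` and a `Trainer` computing gradients automatically); here, the gradient of the binary cross-entropy loss is written out by hand, and the full dataset is used as a single batch for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two well-separated 2D clusters, labeled 0 and 1
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Parameters of the linear model, plus the hyperparameters we would tune
w = np.zeros(2)
b = 0.0
learning_rate = 0.5
epochs = 100

for epoch in range(epochs):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # forward pass: sigmoid of linear output
    grad = p - y                             # gradient of binary cross-entropy w.r.t. scores
    w -= learning_rate * (X.T @ grad) / len(y)
    b -= learning_rate * grad.mean()

# Training accuracy with the final parameters
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
train_accuracy = ((p > 0.5) == y).mean()
```

Changing `learning_rate`, `epochs`, or the batch size changes how quickly (and whether) the loss decreases, which is exactly the hyperparameter analysis this recipe carries out.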

Evaluating classification models

In the previous recipe, we learned how to choose our training hyperparameters to optimize the training process. We also verified how those choices affected the training and validation losses. In this recipe, we are going to explore how those choices affect our actual evaluation in the real world. You will have noticed that we split the dataset into three different sets: training, validation, and test. However, during training, we only used the training and validation sets. In this recipe, we will emulate real-world behavior by evaluating on the data our model has not seen: the test set.

Getting ready

When evaluating a model, we can perform qualitative evaluation and quantitative evaluation.

Qualitative evaluation consists of selecting one or more random (or not so random, depending on what we are looking for) samples and analyzing the results, verifying whether they match our expectations.

In this recipe, we will compute the evaluation metrics...
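For the quantitative side, the two most common classification metrics can be sketched in plain NumPy as follows. The labels below are hypothetical test-set results for the three iris classes, not the recipe's actual outputs; the recipe computes its metrics with MXNet.

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of samples whose predicted class matches the true class."""
    return (y_true == y_pred).mean()

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows index the true class, columns the predicted class."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

# Hypothetical predictions on a tiny test set for the 3 iris classes
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2])

acc = accuracy(y_true, y_pred)            # 5 of 6 samples correct
cm  = confusion_matrix(y_true, y_pred, 3)
```

The confusion matrix is more informative than accuracy alone: here it shows that the single error is a class-1 flower predicted as class 2, telling us which pair of classes the model confuses.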

