Conformal Prediction for Classification

This chapter dives deeper into conformal prediction for classification problems. We will explore the concept of classifier calibration and demonstrate how conformal prediction compares to other calibration methods, before introducing Venn-ABERS predictors as a specialized technique within conformal prediction. Additionally, we will provide an overview of open source tools that can be used to implement conformal prediction for classifier calibration.

We will cover the following topics in this chapter:

  • Classifier calibration
  • Evaluating calibration performance
  • Various approaches to classifier calibration
  • Conformal prediction for classifier calibration
  • Open source tools for conformal prediction in classification problems

Classifier calibration

Most statistical, machine learning, and deep learning models output predicted class labels, and the models are typically evaluated in terms of their accuracy.

Accuracy is a prevalent measure for assessing the performance of a machine learning classification model. It is calculated as the fraction of correctly classified instances out of all instances in the dataset; in other words, it tells us how often the model’s predictions match the true labels.

Accuracy scores lie between 0 and 1 and quantify how closely the model’s predictions match the ground truth data. A score close to 1 signifies that the model is performing well overall, with most of its predictions being correct, whereas a score approaching 0...
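As a quick, self-contained illustration (not taken from the book’s notebooks; the labels are made up for demonstration), accuracy can be computed with scikit-learn’s accuracy_score:

```python
from sklearn.metrics import accuracy_score

# Hypothetical ground-truth labels and model predictions
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]

# Fraction of predictions that match the true labels: 4 correct out of 6
print(accuracy_score(y_true, y_pred))  # ~0.67
```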

Evaluating calibration performance

Evaluating the calibration performance of a classifier is crucial to assessing the reliability and accuracy of its probability estimates. Calibration evaluation allows us to determine how well the predicted probabilities align with the true probabilities or likelihoods of the predicted events. Here are some commonly used techniques for evaluating the calibration performance of classifiers:

  • Calibration plot: A calibration plot visually assesses how well a classifier’s predicted probabilities match the true class frequencies. The x axis shows the predicted probabilities for each class, while the y axis shows the empirically observed frequencies for those predictions.

    For a well-calibrated model, the calibration curve should closely match the diagonal, representing a 1:1 relationship between predicted and actual probabilities. Deviations from the diagonal indicate miscalibration, where the predictions are inconsistent with empirical evidence...
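As a minimal sketch of such a plot (not the book’s own code; the synthetic dataset and naive Bayes classifier are chosen purely for illustration), scikit-learn’s calibration_curve bins the predicted probabilities and computes the observed frequency of positives in each bin:

```python
import matplotlib.pyplot as plt
from sklearn.calibration import calibration_curve
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic binary classification data (illustrative only)
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Naive Bayes is a classic example of a poorly calibrated classifier
clf = GaussianNB().fit(X_train, y_train)
y_prob = clf.predict_proba(X_test)[:, 1]

# Bin the predicted probabilities and compute the observed positive frequency per bin
prob_true, prob_pred = calibration_curve(y_test, y_prob, n_bins=10)

plt.plot(prob_pred, prob_true, marker="o", label="GaussianNB")
plt.plot([0, 1], [0, 1], linestyle="--", label="Perfectly calibrated")
plt.xlabel("Mean predicted probability")
plt.ylabel("Observed frequency of positives")
plt.legend()
plt.show()
```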

Various approaches to classifier calibration

Before exploring how conformal prediction can provide calibrated probabilities, we will first discuss some common non-conformal calibration techniques and their strengths and weaknesses. These include histogram binning, Platt scaling, and isotonic regression.

It is important to note that the following methods are not part of the conformal prediction framework. We are covering them to build intuition about calibration and to highlight some of the challenges with conventional calibration approaches. This background will motivate the need for, and the benefits of, the conformal prediction perspective for obtaining reliable probability estimates.

The calibration techniques we will explore, including histogram binning, Platt scaling, and isotonic regression, represent widely used approaches for adjusting classifier confidence values. However, as we will discuss, they have certain limitations regarding model flexibility, computational expense...
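As a brief sketch of two of these techniques (again, not the book’s own code; the linear SVM base model is simply an example of a classifier with uncalibrated scores), both Platt scaling and isotonic regression are available in scikit-learn through CalibratedClassifierCV:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Synthetic binary classification data (illustrative only)
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

base_clf = LinearSVC()  # outputs uncalibrated decision scores

# Platt scaling: fit a sigmoid mapping from scores to probabilities
platt = CalibratedClassifierCV(base_clf, method="sigmoid", cv=5).fit(X_train, y_train)

# Isotonic regression: fit a monotone, piecewise-constant mapping
isotonic = CalibratedClassifierCV(base_clf, method="isotonic", cv=5).fit(X_train, y_train)

print(platt.predict_proba(X_test)[:5])
print(isotonic.predict_proba(X_test)[:5])
```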

Conformal prediction for classifier calibration

Conformal prediction is a powerful framework for probabilistic prediction that provides valid and well-calibrated prediction sets and prediction intervals. It offers a principled approach to quantify and control the uncertainty associated with the predictions.

We have already seen how conformal prediction approaches, such as inductive conformal prediction (ICP) and transductive conformal prediction (TCP), aim to generate sets that have accurate coverage probabilities. To recap, conformal prediction computes p-values and constructs prediction sets by comparing the p-values of each potential label with a selected significance level.
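To make this recap concrete, here is a minimal from-scratch sketch of an ICP for classification (not the book’s implementation; it uses 1 minus the predicted probability of a class as the nonconformity score, with a random forest and synthetic data chosen purely for illustration):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data split into proper training, calibration, and test sets
X, y = make_classification(n_samples=3000, n_features=20, n_classes=3,
                           n_informative=10, random_state=42)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=42)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=42)

clf = RandomForestClassifier(random_state=42).fit(X_train, y_train)

# Nonconformity score on the calibration set: 1 - predicted probability of the true class
cal_probs = clf.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

significance = 0.1  # target error rate, i.e., 90% coverage
test_probs = clf.predict_proba(X_test)

prediction_sets = []
for probs in test_probs:
    labels = []
    for label, p in enumerate(probs):
        score = 1.0 - p  # nonconformity score if this label were the true one
        # p-value: fraction of calibration scores at least as nonconforming as the test score
        p_value = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
        if p_value > significance:
            labels.append(label)
    prediction_sets.append(labels)

# Empirical coverage: how often the true label falls inside the prediction set
coverage = np.mean([y in s for y, s in zip(y_test, prediction_sets)])
print(f"Empirical coverage: {coverage:.3f}")
```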

Unlike Platt scaling, histogram binning, and isotonic regression, which focus on calibrating the predicted probabilities or scores, conformal prediction takes a more comprehensive approach by providing prediction sets that encompass the uncertainty associated with the predictions and enhances the reliability...

Open source tools for conformal prediction in classification problems

As we deep-dive into the intricacies of conformal prediction for classification, it becomes evident that the right tools can significantly enhance our implementation efficiency. Recognizing this, the open source community has made remarkable contributions by providing various tools tailored to this purpose. In this section, we will explore some of the prominent open source tools for conformal prediction that can seamlessly integrate into your projects and elevate your predictive capabilities.

Nonconformist

nonconformist (https://github.com/donlnz/nonconformist) is a classical conformal prediction package that can be applied to classification problems.

Let’s illustrate how to create an ICP using nonconformist. You can find the Jupyter notebook containing the relevant code at https://github.com/PacktPublishing/Practical-Guide-to-Applied-Conformal-Prediction/blob/main/Chapter_06...
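A minimal sketch along these lines is shown below. It follows the interface described in the nonconformist documentation rather than reproducing the book’s notebook, and the random forest underlying model and Iris dataset are assumptions made purely for illustration; exact class names may vary between library versions:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

from nonconformist.base import ClassifierAdapter
from nonconformist.cp import IcpClassifier
from nonconformist.nc import ClassifierNc, MarginErrFunc

# Split the data into proper training, calibration, and test sets
X, y = load_iris(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=42)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=42)

# Wrap an underlying classifier and a nonconformity function in an ICP
model = ClassifierAdapter(RandomForestClassifier(random_state=42))
nc = ClassifierNc(model, MarginErrFunc())
icp = IcpClassifier(nc)

icp.fit(X_train, y_train)       # train the underlying model on the proper training set
icp.calibrate(X_cal, y_cal)     # compute nonconformity scores on the calibration set

# Boolean matrix: prediction_sets[i, j] is True if class j is in the set for test point i
prediction_sets = icp.predict(X_test, significance=0.1)
print(prediction_sets[:5])
```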

Summary

In this chapter, we embarked on an enlightening exploration of conformal prediction tailored specifically to classification tasks. We began by underscoring the significance of calibration in classification, emphasizing its role in ensuring the reliability and trustworthiness of model predictions. Along the way, we were introduced to a range of calibration methods, as well as the different approaches to conformal prediction. We observed how conformal prediction uniquely addresses the challenges of calibration, providing both a theoretical and practical edge over traditional methods.

We also delved into the nuanced realms of Venn-ABERS predictors, shedding light on their roles and implications in the calibration process.

Lastly, we underscored the invaluable contribution of the open source community in this domain. We highlighted tools such as the nonconformist library, which serve as essential resources for practitioners who are keen on implementing conformal...
