
You're reading from Practical Guide to Applied Conformal Prediction in Python

Product type: Book
Published in: Dec 2023
Publisher: Packt
ISBN-13: 9781805122760
Edition: 1st Edition
Author: Valery Manokhin

Valeriy Manokhin is a leading expert in the field of machine learning and conformal prediction. He holds a Ph.D. in Machine Learning from Royal Holloway, University of London. His doctoral work was supervised by the creator of conformal prediction, Vladimir Vovk, and focused on developing new methods for quantifying uncertainty in machine learning models. Valeriy has published extensively in leading machine learning journals, and his Ph.D. dissertation, 'Machine Learning for Probabilistic Prediction', is read by thousands of people across the world. He is also the creator of "Awesome Conformal Prediction," the most popular resource and GitHub repository for all things conformal prediction.


Conformal Prediction for Computer Vision

In today’s fast-paced world, computer vision has grown beyond mere image recognition to become a cornerstone of numerous real-world applications. From self-driving cars navigating bustling streets to medical imaging systems that detect early signs of disease, the demand for reliable and accurate computer vision models has never been higher. However, as these systems and their applications grow in complexity, so does the need to quantify the uncertainty associated with their predictions.

Enter conformal prediction, a ground-breaking framework that offers a robust means to encapsulate the uncertainty inherent in machine learning models. While traditional computer vision models often produce a singular prediction, the true power of conformal prediction lies in its ability to provide a set of possible outcomes, each backed by a confidence level. This offers practitioners a more informed, nuanced...

Uncertainty quantification for computer vision

As a domain, computer vision has transformed many sectors by automating complex tasks that were once reserved for human eyes and cognition. Computer vision models have become an integral part of modern technology, whether it’s detecting pedestrians on the road, identifying potential tumours in medical scans, or even analyzing satellite images for environmental studies. However, as the reliance on these models grows, so does the need to understand and quantify the uncertainty associated with their predictions.

Why does uncertainty matter?

Before diving deep into the mechanics, it’s essential to understand why we need uncertainty quantification (UQ) in the first place. Let’s go through some of the reasons:

  • Safety and reliability: A wrong prediction can have severe consequences in critical applications, such as medical imaging or autonomous driving. Knowing the confidence level in a prediction...

Why does deep learning produce miscalibrated predictions?

The ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is an annual competition where research teams evaluate their algorithms on a given dataset, aiming to push the boundaries of computer vision. 2012 was a watershed moment for the field, marking a significant shift towards the dominance of deep learning in computer vision (https://www.image-net.org/challenges/LSVRC/2012/).

Before the advent of deep learning, computer vision primarily relied on hand-engineered features and traditional machine learning techniques. Algorithms such as Scale-Invariant Feature Transform (SIFT), Histogram of Oriented Gradients (HOG), and Speeded-Up Robust Features (SURF) were commonly used to extract features from images. These features would then be fed into machine learning classifiers such as Support Vector Machines (SVM) to make predictions. While these methods had their successes, they had significant limitations regarding scalability...

Various approaches to quantify uncertainty in computer vision problems

Uncertainty quantification in computer vision is crucial for ensuring vision-based systems’ reliability and safety, especially when deployed in critical applications. Over the years, various approaches have been developed to address and quantify this uncertainty. Here’s a look at some of the most prominent methods:

  • Bayesian Neural Networks (BNNs): These neural networks treat weights as probability distributions rather than fixed values. By doing so, they can provide a measure of uncertainty for their predictions. During inference, multiple forward passes are made with different weight samples, producing a distribution of outputs that capture the model’s uncertainty.
  • Monte Carlo dropout: Monte Carlo dropout involves performing dropout during inference. By running the network multiple times with dropout and averaging the results, a distribution over the outputs is obtained, which can...
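To make the Monte Carlo dropout idea concrete, here is a minimal NumPy sketch, not the book's code: a toy single-layer "network" (weights invented purely for illustration) keeps dropout active at inference, repeats the stochastic forward pass many times, and uses the spread of the outputs as an uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" single-layer network; weights are invented for illustration
W = rng.normal(size=(5, 3))

def forward_with_dropout(x, p=0.5):
    """One stochastic forward pass with dropout kept ON at inference."""
    mask = rng.random(x.shape) > p      # drop each input unit with probability p
    return (x * mask / (1 - p)) @ W     # inverted-dropout scaling

x = rng.normal(size=5)

# Monte Carlo: repeat the stochastic pass and aggregate the outputs
T = 1000
samples = np.stack([forward_with_dropout(x) for _ in range(T)])
mean_pred = samples.mean(axis=0)        # point prediction
uncertainty = samples.std(axis=0)       # per-output uncertainty estimate
```

In a real deep learning framework, the same effect is obtained by leaving the dropout layers in training mode during prediction and averaging repeated forward passes.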

The superiority of conformal prediction in uncertainty quantification

Quantifying uncertainty is fundamental to building robust and reliable machine learning models. Several methodologies have emerged over the years, each with its own merits. However, conformal prediction stands out as a particularly compelling framework. Let’s explain why:

  • Distribution-free framework: One of the most notable features of conformal prediction is that it doesn’t make any assumptions about the distribution of the data. Many uncertainty quantification methods are based on certain probabilistic assumptions or rely on specific data distributions to function effectively. In contrast, conformal prediction remains agnostic to these considerations, making it versatile and widely applicable across diverse datasets.
  • Theoretical guarantees: Conformal prediction offers robust theoretical guarantees for its predictions. Specifically, it provides a set of potential outcomes for a prediction...
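The validity guarantee can be checked empirically. The following self-contained sketch (synthetic regression data and a deliberately simple fixed predictor, both invented for illustration, not taken from the book) applies the standard split conformal recipe and verifies that the empirical coverage on held-out data lands close to the target 1 − alpha:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.1  # target coverage is 1 - alpha = 90%

# Synthetic data; any fixed point predictor works for split conformal
x = rng.uniform(-1, 1, size=2000)
y = 2 * x + rng.normal(scale=0.3, size=2000)
predict = lambda x: 2 * x

# Split into calibration and test halves
x_cal, y_cal = x[:1000], y[:1000]
x_test, y_test = x[1000:], y[1000:]

# Conformity scores on the calibration set: absolute residuals
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile with the finite-sample correction
n = len(scores)
qhat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Intervals predict(x) +/- qhat; check empirical coverage on the test half
covered = np.abs(y_test - predict(x_test)) <= qhat
print(covered.mean())  # close to 0.9
```

No distributional assumption was used anywhere: the guarantee follows from exchangeability of the calibration and test points alone.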

Conformal prediction for computer vision

In this section, we will dive deeper into the diverse applications of conformal prediction in computer vision. With its broad range of problems, from image classification to object detection, computer vision presents challenges that require precise and reliable machine learning models. As we navigate these applications, we will demonstrate how conformal prediction is a robust tool to quantify the uncertainty associated with these models.

By exploring these practical examples, we aim to underscore the importance of understanding the model’s confidence in its predictions. This understanding is crucial, especially when decisions based on these predictions could have significant consequences. Conformal prediction, with its ability to provide a measure of uncertainty, can greatly aid researchers and practitioners in making informed decisions based on the outputs of their models. This improves the system’s reliability and paves the way...

Building computer vision classifiers using conformal prediction

Let’s illustrate the application of conformal prediction to computer vision in practice. We will use a notebook from the book repository available at https://github.com/PacktPublishing/Practical-Guide-to-Applied-Conformal-Prediction/blob/main/Chapter_09.ipynb. This notebook extensively uses notebooks from Anastasios Angelopoulos’ Conformal Prediction repo at https://github.com/aangelopoulos/conformal-prediction.

After loading the data, set up the problem and define the desired coverage and the number of points in the calibration set:

n_cal = 1000  # number of calibration points
alpha = 0.1   # miscoverage rate; target coverage is 1 - alpha = 90%

Next, split the softmax scores into calibration and test sets, obtaining the corresponding calibration and test labels:

# Boolean mask with n_cal True entries for calibration, the rest for test
idx = np.array([1] * n_cal + [0] * (smx.shape[0] - n_cal)) > 0
np.random.seed(42)  # for reproducibility
np.random.shuffle(idx)

# Split softmax scores and labels into calibration and test sets
cal_smx, test_smx = smx[idx, :], smx[~idx, :]
cal_labels, test_labels = labels[idx], labels[~idx]

The test dataset contains 49...
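From here, the standard split conformal classification recipe (as in Angelopoulos’ repo) computes a conformal quantile from the calibration scores and builds prediction sets on the test data. The sketch below is self-contained: synthetic softmax scores and labels stand in for the notebook’s ImageNet data, and the exact continuation in the notebook may differ in detail:

```python
import numpy as np

# Synthetic stand-ins for the notebook's softmax scores and labels
rng = np.random.default_rng(42)
n, n_classes, n_cal, alpha = 5000, 10, 1000, 0.1
labels = rng.integers(0, n_classes, size=n)
smx = rng.dirichlet(np.ones(n_classes) * 0.5, size=n)
smx[np.arange(n), labels] += 1.0           # make the "model" informative
smx /= smx.sum(axis=1, keepdims=True)      # renormalize rows to sum to 1

# Calibration/test split as in the chapter
idx = np.array([1] * n_cal + [0] * (n - n_cal)) > 0
rng.shuffle(idx)
cal_smx, test_smx = smx[idx], smx[~idx]
cal_labels, test_labels = labels[idx], labels[~idx]

# Conformity scores: 1 minus the softmax of the true class
cal_scores = 1 - cal_smx[np.arange(n_cal), cal_labels]

# Conformal quantile with the finite-sample correction
qhat = np.quantile(cal_scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal,
                   method="higher")

# Prediction sets: every class whose softmax score exceeds 1 - qhat
prediction_sets = test_smx >= 1 - qhat

# Empirical coverage should land close to 1 - alpha = 0.9
coverage = prediction_sets[np.arange(len(test_labels)), test_labels].mean()
print(coverage)
```

Harder images naturally receive larger prediction sets, which is exactly the per-example uncertainty signal the chapter is after.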

Summary

In the rapidly evolving realm of technology, computer vision has transformed from mere image recognition into an integral component of countless real-world applications. As these applications span diverse fields such as autonomous vehicles and medical diagnostics, the pressure on computer vision models to deliver accurate and reliable predictions intensifies. With the growing sophistication of these models comes a pressing need to quantify prediction uncertainty.

This is where conformal prediction shines. Unlike traditional models that typically output a singular prediction, conformal prediction offers a range of potential outcomes, each coupled with a confidence measure. This novel approach grants users a detailed perspective on model predictions, which is invaluable for applications where precision is paramount.

This chapter delved into the symbiotic relationship between conformal prediction and computer vision. We started by emphasizing the importance of uncertainty...

