Enhancing Deep Learning with Bayesian Inference

You're reading from Enhancing Deep Learning with Bayesian Inference: Create more powerful, robust deep learning systems with Bayesian deep learning in Python

Product type Paperback
Published in Jun 2023
Publisher Packt
ISBN-13 9781803246888
Length 386 pages
Edition 1st Edition
Languages
Authors (3): Matt Benatan, Jochem Gietema, Marian Schneider
Table of Contents (11)

Preface
1. Chapter 1: Bayesian Inference in the Age of Deep Learning
2. Chapter 2: Fundamentals of Bayesian Inference
3. Chapter 3: Fundamentals of Deep Learning
4. Chapter 4: Introducing Bayesian Deep Learning
5. Chapter 5: Principled Approaches for Bayesian Deep Learning
6. Chapter 6: Using the Standard Toolbox for Bayesian Deep Learning
7. Chapter 7: Practical Considerations for Bayesian Deep Learning
8. Chapter 8: Applying Bayesian Deep Learning
9. Chapter 9: Next Steps in Bayesian Deep Learning
10. Why subscribe?

3.4 Understanding the problem with typical neural networks

The deep neural networks we discussed in previous sections are extremely powerful and, paired with appropriate training data, have enabled big strides in machine perception. In machine vision, convolutional neural networks enable us to classify images, locate objects within them, segment images into regions or instances, and even generate entirely novel images. In natural language processing, recurrent neural networks and transformers have allowed us to classify text, recognize speech, generate novel text, and, as reviewed previously, translate between two different languages.

However, these standard neural network models also have several limitations. In this section, we will explore some of them. We will look at the following:

  • How the prediction scores of such neural network models can be overconfident

  • How such models can produce very confident predictions on out-of-distribution (OOD) data

  • How tiny, imperceptible...
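A minimal sketch of the first point above (a NumPy illustration, not code from the book): the softmax squashes logits into probabilities, and a lead of only a few units for one logit is enough to produce a near-certain score — regardless of whether the input resembles the training data. The 10-class logit vector here is hypothetical.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over a 1-D array of logits."""
    z = logits - logits.max()  # shift for stability; result is unchanged
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits from a 10-class classifier: one class leads
# the rest by just 5 units, yet softmax reports ~94% confidence.
logits = np.zeros(10)
logits[0] = 5.0
probs = softmax(logits)
print(f"predicted class: {probs.argmax()}, confidence: {probs.max():.2f}")
```

Because the softmax output always sums to one, the model must assign its probability mass somewhere, so a confident-looking score says nothing about whether the input is one the model should be confident about.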
