Scikit-learn Cookbook

You're reading from Scikit-learn Cookbook: Over 80 recipes for machine learning in Python with scikit-learn

Product type: Paperback
Published: Dec 2025
Last updated: Sep 2025
Publisher: Packt
ISBN-13: 9781836644453
Length: 414 pages
Edition: 3rd Edition
Author: John Sukup
Table of Contents (14 chapters)

1. Scikit-learn Cookbook, Third Edition: Over 80 recipes for machine learning in Python with scikit-learn
2. Chapter 1: Common Conventions and API Elements of scikit-learn
3. Chapter 2: Pre-Model Workflow and Data Preprocessing
4. Chapter 3: Dimensionality Reduction Techniques
5. Chapter 4: Building Models with Distance Metrics and Nearest Neighbors
6. Chapter 5: Linear Models and Regularization
7. Chapter 6: Advanced Logistic Regression and Extensions
8. Chapter 7: Support Vector Machines and Kernel Methods
9. Chapter 8: Tree-Based Algorithms and Ensemble Methods
10. Chapter 9: Text Processing and Multiclass Classification
11. Chapter 10: Clustering Techniques
12. Chapter 11: Novelty and Outlier Detection
13. Chapter 12: Cross-Validation and Model Evaluation Techniques
14. Chapter 13: Deploying scikit-learn Models in Production

Random Forests and Bagging

While building a single decision tree model is intuitive, most real-world applications use trees only as part of an ensemble method because of their shortcomings, especially their tendency to overfit. As the saying goes, "two heads (or trees, in this case) are better than one!"

Cleverly named random forests are robust ensemble models that combine multiple decision trees to improve accuracy and reduce overfitting. They achieve this through a method known as bagging (bootstrap aggregating), in which multiple trees are trained on different subsets of the data sampled with replacement. Each tree contributes a prediction vote, and the final prediction is the majority vote (for classification) or the average of predictions (for regression) across all trees. Random forests handle large datasets and complex feature interactions better than a single decision tree alone. This recipe will introduce you to ensemble methods.
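The contrast described above can be sketched in a few lines of scikit-learn. This is a minimal illustration, not the recipe from the locked portion of the chapter; the iris dataset and the specific hyperparameters are assumptions chosen only for demonstration:

```python
# Minimal sketch: a single decision tree versus a bagged ensemble of trees.
# Dataset (iris) and hyperparameters are illustrative choices, not the book's.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single decision tree: intuitive, but prone to overfitting the training data.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A random forest: 100 trees, each fit on a bootstrap sample of the training
# data (sampled with replacement); class predictions are aggregated by vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

print(f"Single tree accuracy:   {tree.score(X_test, y_test):.3f}")
print(f"Random forest accuracy: {forest.score(X_test, y_test):.3f}")
```

On such a small, clean dataset the two scores may be close; the ensemble's advantage grows with noisier data and deeper feature interactions, where averaging over many bootstrap-trained trees smooths out individual trees' overfit decisions.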

Getting ready

We will utilize scikit-learn to demonstrate...
