Ensemble Methods for Machine Learning
Publisher Description
Ensemble machine learning combines the power of multiple machine learning approaches, working together to deliver models that are more accurate and more robust than any single model.
Inside Ensemble Methods for Machine Learning you will find:
• Methods for classification, regression, and recommendations
• Sophisticated off-the-shelf ensemble implementations
• Random forests, boosting, and gradient boosting
• Feature engineering and ensemble diversity
• Interpretability and explainability for ensemble methods
Ensemble machine learning trains a diverse group of machine learning models to work together, aggregating their output to deliver richer results than a single model. Now in Ensemble Methods for Machine Learning you’ll discover core ensemble methods that have proven records in both data science competitions and real-world applications. Hands-on case studies show you how each algorithm works in production. By the time you're done, you'll know the benefits, limitations, and practical methods of applying ensemble machine learning to real-world data, and be ready to build more explainable ML systems.
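To make the aggregation idea concrete, here is a minimal sketch, not taken from the book: the dataset, the three learners, their parameters, and the use of scikit-learn's VotingClassifier are all illustrative choices. Three different models are trained on the same data and their class predictions are combined by majority vote:

# Illustrative sketch (not from the book): combine three diverse learners
# by majority vote and compare the ensemble against each learner alone.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import VotingClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_learners = [
    ("logreg", LogisticRegression(max_iter=5000)),
    ("tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# Each base learner predicts on its own...
for name, model in base_learners:
    print(name, model.fit(X_train, y_train).score(X_test, y_test))

# ...while the ensemble aggregates their votes into a single prediction.
ensemble = VotingClassifier(estimators=base_learners, voting="hard")
print("ensemble", ensemble.fit(X_train, y_train).score(X_test, y_test))

The aggregated vote can outperform each learner on its own, which is the kind of heterogeneous combination of strong learners the book develops in depth.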
About the Technology
Automatically compare, contrast, and blend the output from multiple models to squeeze the best results from your data. Ensemble machine learning applies a “wisdom of crowds” method that dodges the inaccuracies and limitations of a single model. By basing responses on multiple perspectives, this innovative approach can deliver robust predictions even without massive datasets.
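A rough sketch of that "wisdom of crowds" effect, again illustrative rather than from the book: the toy dataset, the choice of 200 trees, and scikit-learn's BaggingClassifier are assumptions made here. Many decision trees are fit to bootstrap resamples of a modest, noisy dataset and their predictions averaged, then compared against a single tree:

# Illustrative sketch: one decision tree vs. a "crowd" of 200 bagged trees.
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

# A deliberately small, noisy dataset -- no massive corpus required.
X, y = make_moons(n_samples=400, noise=0.3, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
bagged_trees = BaggingClassifier(
    DecisionTreeClassifier(random_state=0),  # base learner, resampled 200 times
    n_estimators=200,
    random_state=0,
)

print("single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())

Averaging over bootstrap resamples reduces the variance of the individual trees, which is where the robustness on small, noisy data comes from.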
About the Book
Ensemble Methods for Machine Learning teaches you practical techniques for applying multiple ML approaches simultaneously. Each chapter contains a unique case study that demonstrates a fully functional ensemble method, with examples including medical diagnosis, sentiment analysis, handwriting classification, and more. There’s no complex math or theory—you’ll learn in a visuals-first manner, with ample code for easy experimentation!
What’s Inside
• Bagging, boosting, and gradient boosting
• Methods for classification, regression, and retrieval
• Interpretability and explainability for ensemble methods
• Feature engineering and ensemble diversity
About the Reader
For Python programmers with machine learning experience.
About the Author
Gautam Kunapuli has over 15 years of experience in academia and the machine learning industry.
Table of Contents
PART 1 - THE BASICS OF ENSEMBLES
1 Ensemble methods: Hype or hallelujah?
PART 2 - ESSENTIAL ENSEMBLE METHODS
2 Homogeneous parallel ensembles: Bagging and random forests
3 Heterogeneous parallel ensembles: Combining strong learners
4 Sequential ensembles: Adaptive boosting
5 Sequential ensembles: Gradient boosting
6 Sequential ensembles: Newton boosting
PART 3 - ENSEMBLES IN THE WILD: ADAPTING ENSEMBLE METHODS TO YOUR DATA
7 Learning with continuous and count labels
8 Learning with categorical features
9 Explaining your ensembles