You wanted more of Kirill Eremenko, and now you've got it! Kirill returns to the show today to detail Decision Trees, Random Forests and all three of the leading gradient-boosting algorithms: XGBoost, LightGBM and CatBoost 😸
If you don’t already know him, Kirill:
• Is Founder and CEO of SuperDataScience, an e-learning platform that is the namesake of this very podcast.
• Launched the SuperDataScience Podcast in 2016 and hosted the show until he passed me the reins four years ago.
• Has reached more than 2.7 million students through the courses he’s published on Udemy, making him Udemy’s most popular data science instructor.
Today’s episode is a highly technical one focused specifically on Gradient Boosting methods and the foundational theory required to understand them. I expect this episode will be of interest primarily to hands-on practitioners like data scientists, software developers and machine learning engineers.
In this episode, Kirill details:
• Decision Trees.
• How Decision Trees are ensembled into Random Forests via Bootstrap Aggregation.
• How the AdaBoost algorithm formed a bridge from Random Forests to Gradient Boosting.
• How Gradient Boosting works for both regression and classification tasks.
• All three of the most popular Gradient Boosting approaches — XGBoost, LightGBM and CatBoost — as well as when you should choose each of them (see the code sketch after this list).
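If you want to poke at these algorithms yourself after listening, here's a minimal sketch (mine, not Kirill's) that fits one model from each family covered above on a synthetic dataset. It assumes the xgboost, lightgbm and catboost packages are installed alongside scikit-learn; the dataset and hyperparameters are arbitrary placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

# Self-contained synthetic binary-classification problem.
X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# One model per family from the episode. Shared settings (200 trees,
# 0.1 learning rate where applicable) keep the comparison rough but even.
models = {
    "Random Forest (bagging)": RandomForestClassifier(n_estimators=200, random_state=42),
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1, random_state=42),
    "LightGBM": LGBMClassifier(n_estimators=200, learning_rate=0.1, random_state=42),
    "CatBoost": CatBoostClassifier(iterations=200, learning_rate=0.1, random_seed=42, verbose=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy = {accuracy:.3f}")
```

In practice, the choice among the three boosting libraries often hinges on dataset characteristics (for instance, CatBoost's native handling of categorical features), which is the kind of trade-off discussed in the episode.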
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.