In this video we dive into the nitty-gritty details of the math behind XGBoost trees. We derive the equations for the Output Values from the leaves as well as the Similarity Score. Then we show how these general equations are customized for Regression or Classification by their respective Loss Functions. If you make it to the end, you will be approximately 22% smarter than you are now! :)
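The two key formulas the video derives come straight from the original XGBoost manuscript: for a leaf with gradient sum G, hessian sum H, and regularization parameter lambda, the Output Value is -G/(H + lambda) and the Similarity Score is G²/(H + lambda). For regression with squared-error loss, -G reduces to the sum of the residuals in the leaf and H to the number of residuals. A minimal sketch of the regression case (the function names and example numbers are illustrative, not from the video):

```python
# Regression case of the formulas derived in the video:
#   Output Value     = sum(residuals) / (n + lambda)
#   Similarity Score = sum(residuals)^2 / (n + lambda)
# where n is the number of residuals in the leaf.

def leaf_output(residuals, lam):
    """Output Value of a regression leaf."""
    return sum(residuals) / (len(residuals) + lam)

def similarity_score(residuals, lam):
    """Similarity Score of a regression leaf."""
    return sum(residuals) ** 2 / (len(residuals) + lam)

# Hypothetical leaf with three residuals and no regularization (lambda = 0)
res = [-10.5, 6.5, 7.5]
print(leaf_output(res, lam=0))       # 3.5 / 3
print(similarity_score(res, lam=0))  # 12.25 / 3
```

Increasing lambda shrinks both quantities, which is how the regularization term prunes leaves and damps the output of each tree.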
NOTE: This StatQuest assumes that you are already familiar with...
XGBoost Part 1: XGBoost Trees for Regression: [ Link ]
XGBoost Part 2: XGBoost Trees for Classification: [ Link ]
Gradient Boost Part 1: Regression Main Ideas: [ Link ]
Gradient Boost Part 2: Regression Details: [ Link ]
Gradient Boost Part 3: Classification Main Ideas: [ Link ]
Gradient Boost Part 4: Classification Details: [ Link ]
...and Ridge Regression: [ Link ]
Also note, this StatQuest is based on the following sources:
The original XGBoost manuscript: [ Link ]
The original XGBoost presentation: [ Link ]
And the XGBoost Documentation: [ Link ]
Last but not least, I want to extend a special thanks to Giuseppe Fasanella and Samuel Judge for thoughtful discussions and helping me understand the math.
For a complete index of all the StatQuest videos, check out:
[ Link ]
If you'd like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - [ Link ]
Paperback - [ Link ]
Kindle eBook - [ Link ]
Patreon: [ Link ]
...or...
YouTube Membership: [ Link ]
...a cool StatQuest t-shirt or sweatshirt:
[ Link ]
...buying one or two of my songs (or going large and getting a whole album!)
[ Link ]
...or just donating to StatQuest!
[ Link ]
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
[ Link ]
Corrections:
1:16 The Lambda should be outside of the square brackets.
#statquest #xgboost