If you want to learn more about the bias-variance trade off (an important aspect for both bagging and boosting methods), make sure to check out this video: th-cam.com/video/5mbX6ITznHk/w-d-xo.html
Neat explanation in 4 mins. Keep making small and informative videos like these :)
Thank you so much for your feedback and I am happy you enjoyed it! I will definitely keep making such videos. :)
Watching this one day before exams!... Neatly explained.
Thanks! Best of luck on your exams! :)
Crisp visualization & explanation. Loved it!
Thank you! Glad to hear that you liked the explanation! :)
I’m learning ML and these videos are fantastic
Thanks! Glad you think so! :)
Just stumbled across your video, I am looking forward to watching all your videos 😄
I hope you enjoy my content on this channel. Please let me know if you have any feedback on my videos! :)
This was dope. The color-coding kinda threw me off, but the overall explanation was nice and concise.
Thanks! Glad you enjoyed it! :)
Awesome video! Keep going!
Many thanks!!
simple and easy to understand, nice
Thanks! Glad you found it simple and easy to understand! :)
Very good explanation :)
Thanks! Glad you liked it! :)
It was clear. Thanks
You are welcome! Glad you liked it! :)
Great video. My understanding is that you would almost always use bagging, evaluate the results, and, if they're good enough, stop there. However, you COULD go on to try various boosting methods to see if the model improves even more, but at what cost? If the best boosted model (AdaBoost, XGBoost, etc.) performed 1% better but took 3x longer to compute, then boosting the already-bagged models might not be worth it, right? Still trying to cement in my mind the process flow from a developer standpoint 😉
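A quick way to ground that cost/benefit question is to measure it directly. Below is a minimal sketch, assuming scikit-learn and a synthetic dataset (both my choices for illustration, not something from the video), that fits a bagged ensemble and a boosted ensemble of the same size and prints accuracy next to wall-clock fit time:

```python
# Minimal bagging-vs-boosting comparison sketch (scikit-learn assumed).
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data, purely for illustration.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

for name, model in [("bagging", BaggingClassifier(n_estimators=100)),
                    ("boosting", AdaBoostClassifier(n_estimators=100))]:
    start = time.perf_counter()
    model.fit(X_train, y_train)  # bagging can parallelize; boosting fits learners sequentially
    elapsed = time.perf_counter() - start
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy={acc:.3f}, fit time={elapsed:.2f}s")
```

Whether the extra compute is worth a small accuracy gain then becomes an empirical call for your specific dataset and latency budget rather than a fixed rule.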
Thanks for the easy explanation, brother!
Always welcome! Happy you enjoyed the explanation! :)
Great video, loved the pictures. A caveman like myself loves the pictures. Thank you lol
Sweet thanks! I am super happy you enjoyed the explanation! :)
awesome video
Thanks! Glad you liked it! :)
Let me "boost" this video by making a comment
Many thanks! I hope you also enjoyed the video! :)
No
Yes, he bagged a like and subscribe with this one!
Great explanation! Can you share the slides?
Thanks! Unfortunately, I don't have the slides anymore. I changed my laptop and forgot to transfer them. :(
Do you recommend any Python libraries for implementing a hybrid ML model that uses boosting for a meta-model? 👏🏼
Thanks for the question! Unfortunately, I don't know of any Python libraries that can be used specifically for this. :(
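For what it's worth, scikit-learn's StackingClassifier can express this pattern: its final_estimator parameter accepts any estimator, so the meta-model that combines the base learners' predictions can itself be a boosted model. A minimal sketch (the particular base learners and the synthetic data are arbitrary choices for illustration):

```python
# Stacking with a boosted meta-model (scikit-learn assumed).
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("svc", SVC(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    # The meta-model: a boosting method trained on the base learners' outputs.
    final_estimator=GradientBoostingClassifier(random_state=0),
    cv=5,  # out-of-fold predictions are used to train the meta-model
)
stack.fit(X, y)
print(stack.score(X, y))
```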
Good video.
Thanks! Glad you enjoyed it! :)
thank you very much
You're welcome! :)
Boosting uses the whole set of data?
Yes, why not? In bagging you sample with replacement, but in boosting you can use the whole training data. :)
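To make that distinction concrete, here is a tiny NumPy sketch (the indices and weight updates are made up for illustration, not the exact AdaBoost update rule): each bagged learner trains on a bootstrap sample drawn with replacement, while a boosting round keeps every example and only re-weights it.

```python
# Bagging samples with replacement; boosting re-weights the full dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 10  # pretend we have 10 training examples, indexed 0..9

# Bagging: each base learner sees a bootstrap sample, so some examples
# appear multiple times and others are left out entirely.
bootstrap_idx = rng.choice(n, size=n, replace=True)
print("bagging sample :", np.sort(bootstrap_idx))

# Boosting: every learner sees all n examples; the ones the previous
# learner got wrong simply receive larger weights in the next round.
weights = np.full(n, 1.0 / n)        # start uniform
misclassified = np.array([2, 7])     # hypothetical mistakes from round 1
weights[misclassified] *= 2.0        # up-weight the hard examples
weights /= weights.sum()             # renormalize to sum to 1
print("boosting weights:", np.round(weights, 3))
```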
Anyone got the answer to the WHYs?
Hi, are you Romanian?
Yep, how did you figure that out? :)
@datamlistic The accent :)) Big well done, and please more videos on machine learning, data science, and data analytics :)
Plsss job dyaa
You need to pronounce "bagging" better in future videos.
Roger that! 🫡 English is not my primary language, and my pronunciation can be awkward sometimes.
Unclear explanation.
Why was it unclear? Could you provide a bit of feedback?
My dad told me that my girlfriend, who is a data scientist, is a "double bagger". What does that mean? Does performing bagging twice before training mean that you're a pro?