Tutorial 42 - Ensemble: What is Bagging (Bootstrap Aggregation)?
- Published on Aug 21, 2019
- Bootstrap aggregating, also called bagging, is a machine learning ensemble meta-algorithm designed to improve the stability and accuracy of machine learning algorithms used in statistical classification and regression.
Support me on Patreon: / 2340909
You can buy my book on Finance with Machine Learning and Deep Learning from the below url
amazon url: www.amazon.in/Hands-Python-Fi...
Buy the Best book of Machine Learning, Deep Learning with python sklearn and tensorflow from below
amazon url:
www.amazon.in/Hands-Machine-L...
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
instagram: / krishnaik06
Subscribe to my unboxing channel
/ @krishnaikhindi
Below are the various playlists created on ML, Data Science and Deep Learning. Please subscribe and support the channel. Happy learning!
Deep Learning Playlist: • Tutorial 1- Introducti...
Data Science Projects playlist: • Generative Adversarial...
NLP playlist: • Natural Language Proce...
Statistics Playlist: • Population vs Sample i...
Feature Engineering playlist: • Feature Engineering in...
Computer Vision playlist: • OpenCV Installation | ...
Data Science Interview Question playlist: • Complete Life Cycle of...
🙏🙏🙏🙏🙏🙏🙏🙏
YOU JUST NEED TO DO
3 THINGS to support my channel
LIKE
SHARE
&
SUBSCRIBE
TO MY YouTube CHANNEL
The way you explained the concepts are very easy and understandable. Keep doing the same. Thanks a lot.
The video explains bagging extremely clearly. Thanks for the upload!
The best explanation of Bagging. I logged in just to write this comment down. Keep it up. Thanks a lot!
just take a moment and appreciate the brilliance of this guy! Once again saved me from reading countless pages...
please keep making videos. Your videos are easy to understand and clears up the concept!
I think if I watch all your playlists I will definitely be confident that I will learn a lot
You people have again proved that YouTube is the best learning platform. Thank u so much sir; proud to be one of your YouTube students ❤
0:06-Ensemble Techniques
0:30-Bagging is also called bootstrap aggregation
1:40-In bagging we divide the data into several samples (via row sampling with replacement), one for each base learner (ML model); this resampling step is called the bootstrap. Each model is trained on its own sample, and the output in the majority after running all the models is taken as the final prediction (a voting classifier); this step is called aggregation
2:53-Row Sampling with replacement
4:02-
4:50-BootStrap
5:14-Aggregation
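The bootstrap-and-vote flow in the timeline above can be sketched in plain stdlib Python. This is an illustrative toy, not code from the video: `train_majority_model` is a stand-in base learner (it just predicts the majority label of its own sample) where a real ensemble would train decision trees or other models.

```python
import random
from collections import Counter

def bootstrap_sample(rows, rng):
    # Bootstrap step: row sampling WITH replacement, so a row can repeat
    return [rng.choice(rows) for _ in rows]

def train_majority_model(sample):
    # Toy base learner: always predicts the majority label of its sample
    label = Counter(y for _, y in sample).most_common(1)[0][0]
    return lambda x: label

def bagging_predict(rows, x, n_models=5, seed=0):
    rng = random.Random(seed)
    models = [train_majority_model(bootstrap_sample(rows, rng))
              for _ in range(n_models)]
    votes = [m(x) for m in models]               # every base learner votes
    return Counter(votes).most_common(1)[0][0]   # aggregation: majority wins

rows = [(1, "yes"), (2, "yes"), (3, "no"), (4, "yes"), (5, "no"), (6, "yes")]
print(bagging_predict(rows, x=3))
```

Each base learner sees a different resample, so their individual errors differ; the majority vote averages those errors out, which is the stability gain the video describes.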
Krish, this is the kind of video I was looking for. The way you teach is very easy to understand. Thanks for your videos
best explanation ever. You are really good at explaining things. Keep it up. looking forward to more detailed machine learning model videos. Thank you
This is just like the "America's Got Talent"/"India's Got Talent" shows, where we have a participant (the data) performing, there are 4-5 judges (the models), and the participant's selection is based on the judges' combined review.
It's one way of perceiving it to learn.
You're so good at explaining !
sir, you are soo good at simplifying and explaining the complex topics
you explained everything in a very simple language......i always watch your videos for machine learning....thank you
You are the only one that explained this right, thank you very much!
I respect your simplicity and reserved a thumb up.
thank you so much for your hard work! this is by far the best explanation I could grab my head around! keep up your good work!!
You are champ!! what an explanation. Thank you so much sir
Perfect explanation!
you are just amazing, you have the simplest and clearest explanations
Thanks for explaining in simple words.
This is wonderful. Thank you for such a simple and easy understanding explanation sir.
Greatly and deeply-explained. God bless and thanks a lot
thank you very much for very good explanation Krish , wish the best for you
This is actually a really nice explanation. Keep it up.
So easy to understand !!!! Thanks.. Greetings from Brazil
Thank you sir for easy explanation.
Great Explanation .. Thanks Krish
One video and that's it, concept clear
Thanks a lot sir
Superb video Krish, once again. Thanks
2:35 m
Best lecture on bootstrapping
It was an amazing explanation! Thank you a lot.
Loved it!!! 💙💙💙
finally i got the concept.hats off sir..
Such a good explanation!
Nice explanation. Thank you!
Great explanation!
Awesome tutorial on BAGGING
Thank you so much sir for a wonderful explanation. The concept of Bootstrap Aggregation is very clear and nicely told. Your channel is awesome, great videos.
Gem of a tutorial!!!
You are a god for the One day exam preparation students 🙏
Very well and simply explained, Krish
Great explanation
Amazing explanation!
Amazing video!
yes, very nicely explained. you are very clear, thank you! :)
You have a great channel man, keep up the good work.
Patrice o'Neal fan?
You have made it super clear. It shows your time investment on the subject. As the saying goes "If you can't explain it to a six-year-old, then you don't understand it yourself” Albert Einstein
thanks a lot ur session was helpful .
Excellent!!!
I love your videos thank you
you are awesome!! Thanks a lot.
amazing content, thank you !
Subscribed.... For ur dedication
Well explained. Thank you. I was getting confused with the textual explanation.
Excellent Krish.
Thanks A Lot Sir!!
Very clear ❤
Awesome krish
really nice video thank u
thank you so much ❤
PS : m>n is how it is. same sample is taken multiple times
Because we are doing the selection with replacement. Example: suppose you have a bag with 10 red balls and you draw 5 balls at a time, but with replacement. After the first draw you put all 5 balls back in the bag, so for the next draw you again have 10 balls in total. That's why.
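That with-replacement behaviour is easy to see in code: because the bag is restored before every draw, you can even draw more times than there are balls, which forces repeats. A quick stdlib sketch (my own example, not from the video):

```python
import random

rng = random.Random(0)
balls = list(range(1, 11))      # a bag of 10 numbered balls

# With replacement, every draw sees the full bag of 10 again, so we can
# legally draw 30 times; by the pigeonhole principle, some ball must repeat.
sample = [rng.choice(balls) for _ in range(30)]

print(len(sample))              # 30 draws made
print(len(set(sample)) < 30)    # True: repeats are unavoidable
```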
Nice explanation
Sitting in my MSc AI class in London and watching him because he is just better!
Nice explanation
Really helpful
I am doing Master's in USA, thank you for this explanation.
Thank you ♥️
This video was very informative, very well explained. Please keep helping students who need help. Be good and take care.
Thanks Krish
Got a clear idea. One small suggestion. It would be nice if you explain the concepts based on some sample datasets consisting of a few rows and cols for explanation sake. That would give more detailed understanding.
This is good explanation
Thanks ❤
Very well explained. I came here because my University professor totally messed up the explanation of this simple technique!
Thank you
Best video
thank you
damn this is an excellent explanation
Very good
Great Explaining! I am in Data Science Masters Program in Data mining class.
Sir, you are simply too much!
Your explanation is awesome ❣️ and thank you so much for making these video for us
I request you to provide your full machine learning notes; it would make things so easy for us ❣️✨ and might help us score high in machine learning ❣️
Thank you once again ❣️✨.....
Well explained!! Sir, please make a video explaining the hyperparameter tuning of a bagging regressor, e.g. how to decide the value of max_features.
Hi my dude. Thanks a lot for these videos. Every one of your videos is so special and awesome. Can I ask you a question: can you recommend a book that explains things the way you do?
nice one
Wonderful
Thanks
Great title
Good explanation. Are these m1, m2, m3 models the same classifier or different classifiers?
Thnx sir
Love the enthusiasm, love the room presence.. I've seen 3 YT videos explaining Random Forest and bagging, and this is the clearest one yet.. I still need to find out why it is called bootstrapping; what's the visual analogy that led to the name?
Bootstrap sampling is a statistical term for drawing samples with replacement from an original sample (typically of the same size as the original); that is the name attached here in machine learning. If you are asking about the literal meaning of bootstrapping, a Google search gives the verb definition as "get (oneself or something) into or out of a situation using existing resources"; so here we are building our training sets using an existing resource (the original dataset D). More in depth I'm not sure, sorry. If you get to know, please do comment.
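For the statistical origin: the bootstrap repeatedly resamples the original data with replacement (each resample the same size as the original) to estimate how much a statistic, such as the mean, varies. A minimal stdlib sketch, with function names of my own choosing:

```python
import random
from statistics import mean

def bootstrap_means(data, n_resamples=1000, seed=1):
    # Each resample is len(data) draws WITH replacement from data
    rng = random.Random(seed)
    return [mean(rng.choice(data) for _ in data) for _ in range(n_resamples)]

data = [4, 8, 15, 16, 23, 42]
means = bootstrap_means(data)
# The spread of these resampled means estimates the uncertainty of the mean,
# all computed from the one existing resource: the original dataset
print(min(means) >= min(data), max(means) <= max(data))  # True True
```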
Sir are you sure in this example data m
For example, say we use decision trees with bagging. Which hyperparameters should we set for the trees being used (e.g. which minimum leaf size should I choose for the base learner)? Also, how many bags should I use?
Hi @krish .. How to determine the optimal number of trees to be generated in Random Forest algorithm?
What is the difference between bagging, boosting, voting and stacking? And which of these methods do Random Forest and XGB relate to?
When the base learners (i.e. the different models) are created, must the models all be ones suited to solving a binary problem, since you said to consider the model as binary classification?
I have a doubt; maybe I failed to understand it completely. Could you please tell me the purpose of aggregation? Since we are using different samples for different base classifiers, each binary output is specific to the sample that classifier was given, so why do we take the majority of the outcomes?
Can we compare bootstrapping with cross-validation? In both techniques we take some part of the original data and give it to the model.
Can I use GridSearch inside a BaggingClassifier with a decision-tree estimator in order to choose a proper hyperparameter for max_leaf_nodes?