Random Forests: Data Science Concepts

  • Published Jun 1, 2024
  • How do random forests work?
    Decision trees video: • Decision Trees
    Decision tree pruning video: • Decision Tree Pruning
    Overfitting video: • Overfitting

Comments • 90

  • @saitaro · 3 years ago · +44

    A 2-hour lecture in 15 minutes. ritvik rocks.

  • @azarel7 · 2 years ago · +4

    Great video.
    1) Spoke well and explained the concepts clearly
    2) Threw and caught the marker every time, with no interruption in speech while doing so.
    Bravissimo!

  • @mosama22 · 2 years ago · +25

    I'm studying Data Science at MIT, and you really can't imagine, man, how much "ritvikmath" (and a couple of other channels) is helping me. Before I start any topic I like to get a general idea of it first, and you can't imagine how much your videos have helped! Short, concise, and to the point! Thank you man 🙂
    Just one note: it might be a good idea to choose an easy-to-remember, clear channel name. Sometimes when I'm talking to someone, it is almost impossible to remember the name of your channel; just a clear name with spaces! Thank you again 🙂

  • @jpark7636 · 3 years ago · +24

    This is the best video for understanding random forests on YouTube so far, for me :))

  • @alsjeu · months ago · +1

    i reeeeally liked the pen flip at 3:27!! keep up the great work!

    • @ritvikmath · months ago · +1

      Thank you so much!

  • @ashmitas · 21 days ago

    thanks, well explained to a beginner like me. appreciate how a complex method was easily explained using a basic whiteboard and a relatable example.

  • @edmundoguerramendoza7465 · 3 years ago · +4

    Ritvik, once again you do an amazing job simplifying concepts in short periods of time, while still making them very understandable. Thanks!!

  • @smishra115 · 2 years ago · +1

    all hail the marker juggler and his short, crisp, easy to understand videos! Keep it up dude.

  • @juliocjacobo · 3 years ago · +4

    Concise and right to the point, as always. Thank you!

  • @MayankGoel447 · 3 years ago · +7

    Thanks for the video! This is definitely the best explanation of Random Forest I have seen yet. I'm really enjoying learning Data Science from you.

    • @ritvikmath · 3 years ago · +1

      Awesome, thank you!

  • @jarrelldunson · 3 years ago · +13

    Ritvik, hey, thank you... this was really, really helpful - a great explanation, Jarrell

    • @ritvikmath · 3 years ago · +2

      Glad it was helpful!

  • @jimlanzi6802 · 2 years ago · +1

    Very well organized and well put together. Simplified enough for the medium, but included just the right amount of detail to guide one in their further pursuits of the topic. Thank you.

  • @internetuser2399 · 2 years ago · +2

    this is some high quality content. you deserve more views! great teacher.

  • @preetikharb8283 · 2 years ago · +1

    This is THE BEST explanation of Random forest!! Thank you Ritvik :)

  • @BO2Letsplay · 6 months ago

    I'm trying to learn some ML content, largely as it relates to classification, and just want to say that this video on Random Forest is one of the only ones that actually made sense to me as a layman! Thank you

  • @qiguosun129 · 2 years ago · +2

    Great lecture; it helped me recall random forests while learning about causal forests.

  • @storyteller1900 · 1 year ago

    This is an amazing class. It contains all the important parts of random forests.

  • @sourishguntipally8257 · 6 months ago

    This was an amazing video and super well made. It's astonishing that this material is free to learn from!

  • @loveena419 · 3 years ago

    Wow, great explanation - I am hooked on these videos. You get the main points in a short timeframe - it would be nice to have a video on tuning RF and other ML algorithms. And the prerequisite videos are very useful for getting the right background to understand this one.
    Thank you!

  • @Chillos100 · 2 years ago · +1

    He’s simply the best!! Thanks for all your effort

  • @ForcesOfOdin · 1 year ago

    Loved the interpretability of the random forests idea! Very clever / useful. I'm guessing that you would want to reshuffle the d-th feature separately for each tree i, to avoid the shuffled data accidentally correlating with an important feature.
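
A minimal sketch of that permutation-importance procedure, reshuffling feature d afresh on every repeat as the comment suggests. It scores on a held-out split; the data and names are synthetic placeholders, not anything from the video:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=8, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    base_acc = accuracy_score(y_val, rf.predict(X_val))

    rng = np.random.default_rng(0)
    for d in range(X_val.shape[1]):
        drops = []
        for _ in range(10):                       # fresh reshuffle on every repeat
            X_perm = X_val.copy()
            X_perm[:, d] = rng.permutation(X_perm[:, d])
            drops.append(base_acc - accuracy_score(y_val, rf.predict(X_perm)))
        print(f"feature {d}: mean accuracy drop = {np.mean(drops):.4f}")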

  • @user-lh8wy5yb2x · 1 year ago

    Fabulously concise and accurate!!!

  • @yuqingliu8412 · 1 year ago

    My favorite and best teacher on YouTube!

  • @t_geek9211 · 3 years ago · +6

    Wow! You are really good at explaining stuff! That was amazing!

    • @ritvikmath · 3 years ago · +2

      Glad you think so!

  • @hameddadgour · 1 year ago

    Fantastic presentation!

  • @nguyenlam7227 · 1 year ago

    Big Thanks for a clear explanation!!!

  • @TheBalhamboy · 3 years ago

    Just found your channel. Really well explained. Thanks :)

  • @leoliao3389 · 8 months ago

    Thank you ritvik!! This video is so helpful!!

  • @joycwang · 2 years ago

    Great explanation, much easier to understand.

  • @millenaalves4169 · months ago

    what an awesome video! congrats, really helpful

  • @danspeed93 · 2 years ago

    First time I've seen this way of computing feature importance, thanks!

  • @squib3083 · 10 months ago

    Awesome explanation thank you

  • @extcresources531 · 2 years ago

    This is gold.. pure gold!!

  • @haninalkabbani7766 · 2 years ago · +1

    I can't describe how good your explanation is!!!

  • @vladimirkirichenko1972 · 1 year ago

    excellent vid! thank you.

  • @jamesbrown6591 · 1 year ago

    This is the best explanation I’ve found, thank you 🙏

    • @ritvikmath · 1 year ago

      Glad it was helpful!

  • @jasdeepsinghgrover2470 · 3 years ago · +2

    Amazing video... You could also cover topics like random projections... that's something that could make them much more interesting.

    • @ritvikmath · 3 years ago

      That's a great idea!

  • @roopanjalijasrotia3946 · 2 years ago

    This is great! How about a point or two about the pitfalls of using random forests for time series?

  • @kevinshao9148 · months ago

    Thanks for the great video! Do you have a video, or any recommendation, on the math derivation for random forest regression? Thank you!

  • @dinhnguyenvo3040 · 3 years ago · +1

    You are godly easy to follow, big thank you from my heart

    • @ritvikmath · 3 years ago

      Glad I could help!

  • @karunasrees7402 · 1 year ago

    Thanks Professor, your explanation is very good. I am really enjoying your videos and they are helping me focus on DS. Many videos I had seen before only mention Idea 1, bagging, and call that random forests; you covered Idea 2, random subspaces, as well. Just to confirm: do random forests use both ideas, i.e. bagging + random subspaces = random forest? If possible, can you explain how to code it? Thanks for your time on these videos! Many of your videos are good; even your bias-variance video is super.
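
On the coding question: a rough from-scratch sketch of bagging (Idea 1) plus random subspaces (Idea 2), assuming one random feature subset per tree for simplicity; sklearn's RandomForestClassifier instead re-samples features at every split via max_features. All names and sizes here are illustrative:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=400, n_features=10, random_state=1)
    rng = np.random.default_rng(1)
    n_trees, n_sub = 50, 3                                   # trees, features per tree

    trees, feats = [], []
    for _ in range(n_trees):
        rows = rng.integers(0, len(X), len(X))               # Idea 1: bootstrap sample (with replacement)
        cols = rng.choice(X.shape[1], n_sub, replace=False)  # Idea 2: random feature subspace
        trees.append(DecisionTreeClassifier().fit(X[rows][:, cols], y[rows]))
        feats.append(cols)

    def predict(Xq):
        # Majority vote over the ensemble (binary 0/1 labels)
        votes = np.stack([t.predict(Xq[:, c]) for t, c in zip(trees, feats)])
        return (votes.mean(axis=0) > 0.5).astype(int)

    print("train accuracy:", (predict(X) == y).mean())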

  • @Fat_Cat_Fly · 3 years ago · +2

    amazing video!! really helpful, thanks!!!

    • @ritvikmath · 3 years ago

      Glad it was helpful!

  • @NikBearBrown · 3 months ago

    The algorithm described is random sampling, not bagging. No bootstrap samples are being created as described.
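
For reference, the distinction in code: bagging draws a bootstrap sample of the same size with replacement, while plain random sampling draws a subset without replacement. A tiny numpy illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10
    idx = np.arange(n)

    bootstrap = rng.choice(idx, size=n, replace=True)        # bagging: duplicates expected,
                                                             # ~36.8% of rows left out (OOB)
    subsample = rng.choice(idx, size=n // 2, replace=False)  # plain random sampling: distinct rows

    print(bootstrap)
    print(subsample)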

  • @nelsonk1341 · 1 year ago

    Best DS YouTuber

  • @TheMarComplex · 2 years ago

    Thank you!

  • @annikamoller7673 · 3 years ago · +1

    what a great explanation, thanks man :)

  • @janpieterwagenaar1608 · 3 years ago

    Ritvikmath, I would like to compliment you on your clear, direct explanation videos. You make the material easily accessible and clear with practical examples. Please keep it up.
    Kind regards,
    Jan Pieter Wagenaar

    • @ritvikmath · 3 years ago

      You are most welcome!

  • @neuodev · 2 years ago

    You are awesome!

  • @keshavsharma267 · 3 years ago · +5

    Thanks for the video. Can you also explain interpretability via LIME and Shapley values?

    • @ritvikmath · 3 years ago

      Great suggestion!

  • @ahmetcihan8025 · 2 years ago

    This is it. Thank you so much.

  • @YingjieWu-dt8vm · 7 months ago

    great video

  • @samuelharris4509 · 1 year ago

    Why do we need the accuracy value on the 20% for each tree? Does that help with some weighted average?

  • @manueltiburtini6528 · 2 years ago

    I love it!

  • @niknoor4044 · 2 years ago

    Great!

  • @SethuIyer95 · 6 months ago

    By using association rule mining to extract all the rules from all the decision trees, we can interpret random forests.
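
One concrete starting point for that idea: sklearn can dump each tree's decision paths as if/then rules via export_text, which is the raw material a rule-mining pass would consume. A tiny synthetic sketch (the actual rule mining is left out):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.tree import export_text

    X, y = make_classification(n_samples=200, n_features=4, random_state=0)
    rf = RandomForestClassifier(n_estimators=3, max_depth=2, random_state=0).fit(X, y)

    # Print the if/then rules of every tree in the forest
    for i, tree in enumerate(rf.estimators_):
        print(f"--- tree {i} ---")
        print(export_text(tree, feature_names=[f"x{j}" for j in range(4)]))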

  • @Ostiosti · 3 years ago

    Great video. But why permute on the training data and not on the test data? This should also show the importance of the feature, right?

    • @beniborukhov9436 · 3 years ago

      I think it's because we're trying to focus specifically on the importance of each feature to the model. We're avoiding adding the extra variable of how well the model generalizes to the test data, so we can see each feature's contribution to the model's accuracy under ideal conditions.
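
The two views are easy to compare in practice: sklearn's permutation_importance can be run on either split, and held-out data folds generalization in, which is exactly the trade-off described in the reply above. A synthetic-data sketch:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=6, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

    # Same procedure, two different splits
    imp_train = permutation_importance(rf, X_tr, y_tr, n_repeats=10, random_state=0)
    imp_test = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
    print("train:", imp_train.importances_mean.round(3))
    print("test: ", imp_test.importances_mean.round(3))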

  • @ziaurrahmanutube · 3 years ago · +3

    Love your videos, very helpful and well explained

    • @ritvikmath · 3 years ago

      Happy to hear that!

  • @SuperHddf · 2 years ago

    THX!

  • @KennieinKorea · 10 months ago

    impressive, thanks

    • @ritvikmath · 10 months ago · +1

      Glad you liked it!

  • @TawhidShahrior · 2 years ago

    you are a legend

  • @beshosamir8978 · 1 year ago

    You are incredibly amazing, but I have 2 questions:
    1) What does it mean that when you use all the features, the trees become correlated with each other? I know what it means for 2 features to be correlated, but what does it mean for 2 trees to be correlated?
    2) When determining how important a specific feature is: I trained the model using 80% of the dataset. Do I now compute the accuracy on that 80%, shuffle the specific column, compute the accuracy on the 80% again, and subtract? Or do I use the 20% for both? In the video you said you compute the accuracy on the data that built each tree, so you're presumably talking about the 80%; using 20% of the dataset makes no sense to me.

  • @trin1721 · 2 years ago

    Can't we get the feature importance for free, without permuting, by looking at the accuracies of models trained with and without certain features (in the random subspace step)?
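
That is essentially "drop-column importance". It isn't quite free, since each feature needs a full retrain, and inside the random-subspace step the comparison is confounded by which other features each tree happened to receive. A retrain-based sketch (names illustrative):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=400, n_features=6, random_state=0)
    base = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()

    for d in range(X.shape[1]):
        X_drop = np.delete(X, d, axis=1)   # retrain without feature d
        acc = cross_val_score(RandomForestClassifier(random_state=0), X_drop, y, cv=5).mean()
        print(f"feature {d}: importance ~= {base - acc:.4f}")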

  • @geoffreyanderson4719 · 2 years ago

    Yo, I heard the RF is weak alone and needs help when:
    1) A strongly predictive linear feature exists in X. You can help the RF out either by feeding it the residuals from running the linear model on that feature first, so each model in the ensemble does what it does best (the linear part doing linear things and the nonlinear RF doing nonlinear things), or by preprocessing to create an additional feature that is just the output of the linear model and giving the whole augmented feature set to the RF.
    2) 2nd-order interactions are expected to be important. Despite its subsampling of the feature space, the RF is actually NOT good at automatically finding 2nd-order predictive associations in X, so we should help it out by feature-engineering the 2nd-order terms into X in advance. It might help still more to tell the RF to stop using the typical default subspace-sampling ratio of 0.5 and instead use exactly 2 columns at a time, no more, no less, forcing it to look much more closely at all the 2nd-order associations you expect it to find.
    These are hearsay and hypotheses. It would be cool to see how to do this in sklearn's pipeline on a dataset like "jewellery", which the pycaret library uses for demo code. Jewellery has a strongly predictive feature, "carat" or "weight", in its X. But they only look at trees alone in their model search, so I think it could be improved by helping out the fancy nonlinear tree models as described above.
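
A sketch of the first suggestion, under stated assumptions: a made-up dataset where column 0 is the strongly linear feature, a linear model fit on it first, and an RF fit on its residuals:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    # Column 0 carries a strong linear signal; column 1 a nonlinear one
    y = 3.0 * X[:, 0] + np.sin(3 * X[:, 1]) + rng.normal(scale=0.1, size=500)

    lin = LinearRegression().fit(X[:, [0]], y)           # linear part on the strong feature
    resid = y - lin.predict(X[:, [0]])
    rf = RandomForestRegressor(random_state=0).fit(X, resid)  # RF mops up the nonlinear part

    pred = lin.predict(X[:, [0]]) + rf.predict(X)        # combined prediction
    print("residual-stacked MSE:", np.mean((y - pred) ** 2))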

  • @warpathcucucu · 2 years ago

    you're the GOAT

  • @Fordalo · 1 year ago

    you are goated

  • @leroychiyangwa8320 · 1 year ago

    kkkk i like the entrance style

  • @TheProblembaer2 · 8 months ago

    I love you.

  • @Makako_Loko · 24 days ago

    Oh my deus thank you

  • @nilankar3873 · 1 year ago

    🍺

  • @beatss8286 · 3 years ago · +1

    Thank you!