A Critical Skill People Learn Too LATE: Learning Curves In Machine Learning.

  • Published on Jan 16, 2025

Comments • 73

  • @underfitted
    @underfitted  2 years ago +15

    Hey everyone! Hope you enjoyed the video and it's helpful for you! Make sure to SMASH that like button and Subscribe to help the channel grow. The next video will be sickkkkk! See you then.

  • @chuckcheddar461
    @chuckcheddar461 11 months ago +25

    My left ear really loved this video

    • @lucid5890
      @lucid5890 11 days ago

      fr

  • @spaicersoda7165
    @spaicersoda7165 1 year ago +15

    My left ear loved this video.

    • @victorhplus
      @victorhplus 5 months ago

      thanks bro, I thought my headphones were the problem

  • @kozaTG
    @kozaTG 8 months ago +5

    If you can't explain it to a 6-year-old, you don't understand it yourself. Brilliant video.

  • @kraeftigerkanacke
    @kraeftigerkanacke 1 year ago +3

    I have read a lot of articles and watched a few tutorials. But THIS is the perfect explanation for beginners in the ML field. Thank you very much.

  • @orbitinggeek4000
    @orbitinggeek4000 2 years ago +12

    What an amazing explanation! It shows how well you yourself understood it - so glad I found you on Twitter!!

    • @underfitted
      @underfitted  2 years ago +1

      Thanks! Really appreciate it!

  • @chidubem31
    @chidubem31 2 years ago +4

    Straight to the point. I honestly like how you talk more about theory and analysis rather than code.

  • @agenticmark
    @agenticmark 1 year ago

    That was easily the best explanation of learning curves. I have seen each of those, except the perfect curve, but I will keep trying!

  • @VelazquezJFP
    @VelazquezJFP 1 year ago

    I believe you have taught so many artificial brains that you know how to get information into the slowest human brains out there. You repeat the fundamentals with a different tone, put letters on screen, and give it time to absorb, ensuring there is no overfit or underfit in my learning today.
    Great job and you got a new subscriber.

  • @mohammadmassri2394
    @mohammadmassri2394 1 year ago

    Best explanation ever on YouTube! Keep it up man!

  • @jacemi
    @jacemi 2 years ago +3

    Excellent analogy 👏 Thank you very much Santiago!! Your videos are so cool and to the point!!

    • @underfitted
      @underfitted  2 years ago +1

      Glad you like them, Javier! Really appreciate your comment.

  • @JordiRosell
    @JordiRosell 2 years ago +2

    Thanks for this video. I would say validation set, cross-validation sets, or resamples instead of testing, but the main ideas are the same. I only use the holdout set once, for the last fit, and some people could misinterpret some concepts here.

    • @underfitted
      @underfitted  2 years ago +3

      Absolutely! This is exactly what a validation set looks like. I wanted to keep it simple and talk about only 2 sets for the sake of the video, but you are correct.
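
For readers who want to see the three-way split this thread describes, here is a minimal sketch assuming scikit-learn; the dataset, sizes, and ratios are placeholder assumptions, not the video's code.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder data; in practice X and y come from your own dataset.
rng = np.random.default_rng(0)
X = rng.random((1000, 5))
y = (X.sum(axis=1) > 2.5).astype(int)

# Hold out a test set that is touched only once, for the final evaluation.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Split the remainder into training and validation sets; the validation set
# is what drives the learning curves and any model tuning.
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)

print(len(X_train), len(X_val), len(X_test))  # 600, 200, 200
```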

  • @AB-cd5gd
    @AB-cd5gd 8 days ago

    Best explanation on all of YouTube

  • @paulallen1597
    @paulallen1597 2 years ago +2

    Superb explanation of what the problem is and how to approach solving it.

    • @underfitted
      @underfitted  2 years ago

      Glad it was helpful, Paul! Really appreciate your comment.

  • @tehsupervik
    @tehsupervik 2 years ago +3

    Great quality of information and really precise. So helpful for a beginner like me

    • @underfitted
      @underfitted  2 years ago

      Thanks, Victor! I'm glad this is helpful!

  • @Singasongwithme2004
    @Singasongwithme2004 5 months ago

    Teaching style is too unique and too good 👍

  • @msfasha
    @msfasha 1 year ago

    Great explanation with lots of illustrations, simply a very good job, keep going.

  • @FrocketGaming
    @FrocketGaming 1 year ago

    This was very helpful but how do we define 'high' and 'low' loss? It's relative I assume but is there some rule of thumb?

  • @Fantalic
    @Fantalic 7 months ago

    This is so good. So much important information in a short time. Big thanks. :)

  • @muzammilrizvi6424
    @muzammilrizvi6424 1 year ago

    Can't describe how helpful and beautiful this video is, simply Amazing.

  • @ScottSavageTechnoScavenger
    @ScottSavageTechnoScavenger 2 years ago

    YES!!!! You just solved a problem I ran into years ago!

  • @bellion166
    @bellion166 1 year ago

    Thank you! That helped a lot!!

  • @peacefulmusic3908
    @peacefulmusic3908 a month ago

    Why does my loss curve start very low in the beginning epochs (at epoch 02: loss 0.007), but when I look at the evaluation metrics (MAE, RMSE) they are very bad (0.94; 1.12)? I'm doing a regression problem, please help me!

  • @dilshanpieris9439
    @dilshanpieris9439 1 year ago

    Understood every bit of it, well done brother ❤❤

  • @rewiredbyadhd
    @rewiredbyadhd 2 years ago +1

    This is one of the best Machine Learning channels I've seen. Thanks Santiago 🙏🏻 You have a new subscriber. I came here from Twitter, your content there is super good❤️🙏🏻
    Please keep making explanatory videos with simple language, so anyone can understand.
    Thank you again🙏🏻

  • @NavyaVedachala
    @NavyaVedachala 9 months ago

    This was so helpful! Thank you

  • @starreachsocietybw
    @starreachsocietybw 11 months ago

    Best Explanation Ever!!!!!!!!!!!!!

  • @burhanrashidhussein6037
    @burhanrashidhussein6037 2 years ago +1

    Q: Can we use the concept of overfitting to understand how good our training data is? E.g., what if we cannot overfit our training data? Can we say our data is not good enough to train the model?

    • @underfitted
      @underfitted  2 years ago +2

      If you can't overfit your model, you might have a problem with the data, yes (it could also be a problem with the model itself, of course).

    • @hiankun
      @hiankun 2 years ago

      @underfitted AFAIK, if a model is too simple then we cannot overfit it on the dataset. Right?
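
A hedged sketch of the sanity check discussed in this thread: take a tiny slice of the training data and confirm that a reasonably flexible model can drive its training loss close to zero. The model, data, and sizes below are illustrative assumptions, not the video's code.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import log_loss

# Toy data standing in for your real training set.
rng = np.random.default_rng(0)
X = rng.random((500, 10))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# Try to memorize a handful of samples. If even this fails, suspect the data
# (e.g. noisy or inconsistent labels) or a model that is too simple for the task.
X_small, y_small = X[:32], y[:32]
model = MLPClassifier(hidden_layer_sizes=(64, 64), solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_small, y_small)

train_loss = log_loss(y_small, model.predict_proba(X_small))
print(f"training loss on 32 samples: {train_loss:.4f}")  # expect a value near 0
```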

  • @atakanbilgili4373
    @atakanbilgili4373 1 year ago +1

    Great explanations, especially useful when you work on your own dataset rather than the Kaggle ones.

    • @jamespaz4333
      @jamespaz4333 1 year ago

      One of the things that Kaggle taught me is how to avoid over-fitting.

  • @Ninja1Dark159
    @Ninja1Dark159 10 months ago

    Q: If my problem is a binary classification problem and I'm using a built-in model like a decision tree or random forest, how do I get the loss function? And if that is not possible or not logical, can drawing a learning curve with accuracy on the y-axis and the number of training samples on the x-axis still identify overfitting and underfitting?
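
This question has no reply in the thread, but as an editorial sketch: scikit-learn's learning_curve can score tree ensembles with a probabilistic loss (via predict_proba) or with plain accuracy, plotted against the number of training samples. The estimator and synthetic dataset below are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# scoring="neg_log_loss" uses predict_proba, so tree models still get a loss;
# swap in scoring="accuracy" to put accuracy on the y-axis instead.
train_sizes, train_scores, val_scores = learning_curve(
    RandomForestClassifier(n_estimators=100, random_state=0),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=5,
    scoring="neg_log_loss",
)

# A persistent gap between the two curves suggests overfitting; two high,
# close losses suggest underfitting.
for n, tr, va in zip(train_sizes, train_scores.mean(axis=1), val_scores.mean(axis=1)):
    print(f"{n:5d} samples  train loss {-tr:.3f}  validation loss {-va:.3f}")
```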

  • @mtomazza
    @mtomazza 2 years ago +1

    Absolutely brilliant!

  • @junaidmohammad1967
    @junaidmohammad1967 2 years ago +1

    What does it mean when test loss is lower than training loss?

    • @underfitted
      @underfitted  2 years ago +3

      It usually means the test set is too easy for the model. You might have too few test samples, or the samples might be both in the training and testing sets, or they might be too simple for the model.
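
One of the causes mentioned in this reply, the same samples ending up in both sets, is easy to check directly. A small hedged sketch, assuming X_train and X_test are NumPy arrays; the arrays below are fabricated only to demonstrate the check.

```python
import numpy as np

# Stand-in splits; replace with your own X_train / X_test.
rng = np.random.default_rng(0)
X_train = rng.random((800, 5))
X_test = np.vstack([rng.random((195, 5)), X_train[:5]])  # 5 leaked rows, for illustration

# Count test rows that are exact duplicates of training rows.
train_rows = {tuple(row) for row in X_train}
leaked = sum(tuple(row) in train_rows for row in X_test)
print(f"{leaked} of {len(X_test)} test rows also appear in the training set")
```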

  • @otakusil69
    @otakusil69 8 months ago

    Short-Sharp-Understandable

  • @Malikk-em6ix
    @Malikk-em6ix 2 years ago

    Great Insights. Very helpful, Thank you.

  • @artistoryartdesign
    @artistoryartdesign 1 year ago

    Like this so much 👍👍👍

  • @AB-cd5gd
    @AB-cd5gd 8 days ago

    Thanks

  • @Jobic-10
    @Jobic-10 1 year ago

    This is too good❤❤

  • @Param3021
    @Param3021 2 years ago +1

    That beat 💓

  • @allgasnobrakeseul
    @allgasnobrakeseul 2 months ago

    holy shit u just explained this so well

  • @cbr_n
    @cbr_n 2 years ago

    Hey! Solid video, had a slight recommendation: balance the audio out a bit, I think L is 10-20% louder than R.

    • @underfitted
      @underfitted  2 years ago

      Thanks, CB! Yeah, I’ve been trying to improve the audio for a few videos now. I think the last video came out better? Thanks for the feedback!

  • @manasmhatre8200
    @manasmhatre8200 11 months ago

    Please tell me what the max difference between validation loss and training loss should be so that the model is not overfitting. My model is showing training loss of 1.46 x 10^(-5) and validation loss of 0.018. So is the model overfitting? Anyone, please reply.

  • @meryemamaouche8158
    @meryemamaouche8158 1 year ago

    great explanation

  • @okotpascal
    @okotpascal 2 years ago +1

    oh no,... someone must have complained about the sound right?🤨... I'm going to miss the loud videos, I loved them louder. It always woke up my attention😒😒

    • @underfitted
      @underfitted  2 years ago +3

      They did! But now they aren't clipping, which was a problem.

  • @alaeldinabdulmajid6576
    @alaeldinabdulmajid6576 6 months ago

    Oriented - great job👍

  • @almerandomendezjr.4742
    @almerandomendezjr.4742 1 year ago

    Amazing explanation, thanks

  • @lokeshsharma4177
    @lokeshsharma4177 8 months ago

    YAA - You Are Awesome

  • @alirezanorouzi8924
    @alirezanorouzi8924 2 years ago

    Hey man, I don't understand ROC curves in logistic regression.

  • @johnpan4789
    @johnpan4789 2 years ago

    So, top suggestion for an ML book?

  • @Anonyms-rt5fb
    @Anonyms-rt5fb a month ago

    A-mazing Video

  • @TheDarkestForce
    @TheDarkestForce 6 months ago

    Anyone else like me who has a decent set of speakers is probably greatly annoyed and distracted by the bass thumps you put in the background. I have no idea why you would put that in the audio stream. The video explanation was well done but I couldn't continue watching for the reason I mentioned.

    • @underfitted
      @underfitted  6 months ago

      That’s the type of thing you do when you are learning.

  • @jasbarlegaspina1220
    @jasbarlegaspina1220 2 years ago

    The best 17 minutes I've seen in years.

  • @salmanqafarov9556
    @salmanqafarov9556 1 year ago

    Indian dominance is so high in IT now that white guys make educational videos with an Indian accent.

  • @ahmadbodayr7203
    @ahmadbodayr7203 1 year ago +2

    Read about Islam, man.