Ridge Regression for Beginners!

  • Published Jan 8, 2025

Comments •

  • @agila.p9807 · 2 years ago +20

    You have the skill to simplify a complex topic so it can be understood by everyone. Please continue your great work. This world needs more teachers like you.

  • @Kmysiak1 · 5 years ago +21

    I've watched dozens of videos on regularization and your explanation is perfect! thanks!

  • @jgianan · 1 year ago +4

    Wow! It took me several rewinds to understand that from my professor and I got it in 3 mins with the way you explained and visualized it! Thank you!

  • @mthandenimathabelacap5466 · several months ago +1

    Clear explanation of the Ridge() model. Very intuitive. SUBSCRIBED.

  • @shroukezz8878 · 1 year ago +1

    The best explanation of ridge regression I have ever listened to.

  • @HL-dw4dl · 3 years ago +3

    Great video for people like me who are beginners and don't want to go deep in the Statistics part of it but a simple explanation for data science. 🧡 from India.

  • @ksh2106 · 1 year ago +1

    Thank you for explaining bias and variance and not just moving forward without the explanation!!

  • @winniesebapu1364 · 2 years ago +1

    You explained it in a simple way and with a short video. Very effective.

  • @goodwavedata · 4 years ago +2

    I loved this video. I've heard about "reducing the coefficient values" in so many other places, but you explained the 'why' behind this better than any of the others that I saw.

  • @spider279 · 2 years ago +1

    Wow, your explanations are so good. It's my first time seeing your video and I'm really satisfied.

  • @MarcoBova · 6 months ago

    Really pristine work explaining the ideas behind the concept. I found it really useful for getting an overview before dealing with all the math behind it. Thanks.

  • @Unbiased27 · 4 years ago +1

    The best explanation I've heard on ridge regression. Straightforward and precise! Thank you very much!

  • @2904sparrow · 4 years ago +2

    Very well explained, finally I got it! Many thanks.

  • @gzuzchuy505 · 4 years ago +2

    Perfect explanation!!
    You explained it in a simple way and with a short video.
    Thanks, keep up the good work.

  • @ThePiratefan96 · 11 months ago +1

    Very helpful! Thank you Professor!

  • @khaledsherif7056 · 4 years ago

    I like how you explained that well in a 7 min video.

  • @sues4370 · 1 year ago +1

    Thank you! This is a very helpful explanation and visualization of ridge regression.

  • @wingyanwong3208 · 3 years ago +1

    Very good explanation. Thank you. It gives me the idea of ridge regression.

  • @Reglaized · 2 years ago +1

    Great explanation! Thank you!

  • @fergavilan132 · 2 years ago

    The first and only video that allowed me to understand this. Thanks!!

  • @jeanyeager4252 · 1 year ago +1

    Thank you for the quick and easy to understand tutorial

  • @zhannadruzhinina4235 · 1 year ago +1

    This is a great video, thank you!

  • @palaknath · 4 years ago +1

    Thank you, Sir! The great explanation made the concept seem so easy!

  • @ThuyTran-bw7dq · 2 years ago +1

    Thank you sir, it's so simple!

  • @cdhaifule · 3 years ago +1

    Wonderful explanation. Thank you.

  • @FaisalR-n4z · 1 year ago +1

    Amazing explanation, thanks Ryan

  • @geneticengineer7720 · 4 years ago +3

    You made it easy to understand. But where do you get the alpha and slope from? The testing data set? Then the testing data set becomes the training data set.

  • @ekleanthony7997 · 3 years ago +1

    Awesome Explanation. thanks!

  • @shivu.sonwane4429 · 3 years ago

    In ridge regression, alpha can never be 0. ☺️ Easy and clear explanation.

  • @SajidHussain-dt7ci · 2 years ago +1

    Really appreciate your effort, thanks for the help!

  • @yl3046 · 5 years ago

    Good intuition. The slides are contradictory about whether ridge regression increases or decreases bias and variance.

  • @Luckys1191 · 2 years ago +1

    Good Explanation....

  • @benuploads7964 · 2 years ago

    amazing explanation!

  • @kislaykrishna8918 · 3 years ago +1

    Great, crystal clear.

  • @cesar3550 · 4 years ago

    Great video and great English as well, you gained a new sub.

  • @faizanzahid490 · 4 years ago +12

    Really appreciate the tutorial, just one query: does regularisation always reduce the slope? I mean, I think it's possible for the test dataset to have a steeper slope than the training set.

    • @marcelocoip7275 · 2 years ago

      Black hole here... Looking for this answer...

    • @KrishnenduJ-hc5fg · 1 year ago

      Regularisation minimises the sum of squared errors while also minimising the sum of squared magnitudes of the coefficients. This pushes the ridge coefficients closer to zero. But yes, if the penalty term is too small, the slope may resemble that of OLS.

    • @KrishnenduJ-hc5fg · 1 year ago

      So it is highly unlikely for regularisation to produce a steeper slope than OLS.
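[Editor's note] The shrinkage described in this reply thread can be sketched numerically. Below is a minimal 1-D example (not from the video; the data values are made up) that uses the closed-form ridge slope on centered data and shows it shrinking toward zero as the penalty grows:

```python
import numpy as np

# Hypothetical toy data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

# Center both variables so the intercept drops out of the penalty.
xc, yc = x - x.mean(), y - y.mean()

def ridge_slope(lam):
    # Minimizes sum((yc - b*xc)**2) + lam * b**2; lam = 0 recovers OLS.
    return (xc @ yc) / (xc @ xc + lam)

slopes = [ridge_slope(lam) for lam in (0.0, 1.0, 10.0, 100.0)]
# The slope shrinks monotonically toward zero as lam grows, matching the
# reply above: a very small penalty leaves it close to the OLS slope.
```

With these numbers the OLS slope is 0.97, and each larger penalty pulls it further toward zero without ever flipping its sign.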

  • @tyman1449 · 5 years ago +3

    Thank you for your short video. But I did not understand why we should minimize the slope. It is just a possibility and depends on test data. You may increase the slope to get minimum residuals.

    • @SumitKumar-uq3dg · 4 years ago

      Minimizing or maximizing is decided after looking at the total error. If increasing the slope increases the error, then we go with minimizing it.

  • @talkingabout-h8d · 4 years ago

    Thank you for this *great explanation*

  • @malinyamato · 1 year ago +1

    Great intro!

  • @quant-prep2843 · 3 years ago +3

    What if the model needs high sensitivity to the dependent variable?

  • @Muziekmixen · 4 years ago

    Suggestion: You explained Ridge & Lasso Regression very well; make one for Elastic Net too!

  • @chintunannepaga579 · 3 years ago

    Excellent concept explanation. Thank you.

  • @Cherdanye · 4 years ago +2

    5:01 door opens

  • @leiyarabe7482 · 4 years ago

    👏👏👏👏👏👏 Well explained!

  • @ramspalla8036 · 2 years ago

    Hi Ryan, Can you please do a video on Elastic Net Regression?

  • @VBeniwal_IITKGP · 4 years ago

    Thank you sir🙏🙏

  • @yugoeugis6733 · 4 years ago +2

    Education is about pedagogy and who teaches. Here's a good one.

  • @adrianaayluardo8583 · 5 years ago

    Thank you!

  • @spandangude8973 · 3 years ago

    Thank you for the video. Do you speak Farsi?

  • @raedos1 · 3 years ago

    It just feels like a fancy way to include your testing set in your training set, essentially making 100% of your data a training set. What is the difference between the two?

  • @NoelGeorge-l4g · 1 year ago

    How does increasing the lambda term reduce the slope? We are multiplying lambda by the slope, right, which is a constant?

  • @Pankaj_Khanal_Joshi · 1 year ago

    Sir, how do we know during regularization whether we have to increase or decrease the slope?

  • @marcelocoip7275 · 2 years ago

    But how does ridge work if the variance decreases with a steeper slope?

  • @Frdy12345 · 1 year ago

    Isn’t alpha actually lambda?

  • @Nimkrox · 5 years ago +5

    The explanation is good, but I think your example could be better. Having 3 points in the training set and 5 points in the testing set is not good practice. Also, your 3 training points will give the same line every time, so again: not the best example.

    • @deathwiddle3826 · 11 months ago

      You're such a hater 😢

    • @a7med7x7 · 6 months ago

      The example is perfect; it is for illustration, and textbooks use similarly small numbers of training points. It's better to show more testing points to emphasize the overall pattern of the data; in reality, the data you train on will never be as large as everything the model is tested or used on.
      The 3 similar training data points are the very reason the problem occurs, and the ideal mechanism for solving it is to deviate your model from them.