Bayesian Linear Regression : Data Science Concepts

  • Published Mar 30, 2021
  • The crazy link between Bayes Theorem, Linear Regression, LASSO, and Ridge!
    LASSO Video : • Lasso Regression
    Ridge Video : • Ridge Regression
    Intro to Bayesian Stats Video : • What the Heck is Bayes...
    My Patreon : www.patreon.com/user?u=49277905

Comments • 168

  • @brycedavis5674
    @brycedavis5674 3 years ago +28

    As soon as you explained the results from the Bayesian approach, my jaw was wide open for like 3 minutes. This is so interesting

  • @kunalchakraborty3037
    @kunalchakraborty3037 3 years ago +42

    Read it in a book. Didn't understand jack shit back then. Your videos are awesome: rich, short, concise. Please make a video on Linear Discriminant Analysis and how it's related to Bayes' theorem. This video will be saved in my data science playlist.

  • @tobias2688
    @tobias2688 3 years ago +61

    This video is a true gem, informative and simple at once. Thank you so much!

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad it was helpful!

  • @fluidice1656
    @fluidice1656 1 year ago +3

    This is my favorite video out of a large set of fantastic videos that you have made. It just brings everything together in such a brilliant way. I keep getting back to it over and over again. Thank you so much!

  • @sambacon2141
    @sambacon2141 2 years ago

    Man! What a great explanation of Bayesian Stats. It's all starting to make sense now. Thank you!!!

  • @rajanalexander4949
    @rajanalexander4949 10 months ago +1

    This is incredible. Clear, well paced and explained. Thank you!

  • @tj9796
    @tj9796 2 years ago

    Your videos are great. Love the connections you make so that stats is intuitive as opposed to plug and play formulas.

  • @feelmiranda
    @feelmiranda 2 years ago +1

    Your videos are a true gem, and an inspiration even. I hope to be as instructive as you are if I ever become a teacher!

  • @MoumitaHanra
    @MoumitaHanra 2 years ago

    Best of all the videos on Bayesian regression; other videos are so boring and long, but this one has quality as well as ease of understanding. Thank you so much!

  • @SaiVivek15
    @SaiVivek15 2 years ago

    This video is super informative! It gave me the actual perspective on regularization.

  • @fktx3507
    @fktx3507 2 years ago

    Thanks, man. A really good and concise explanation of the approach (together with the video on Bayesian statistics).

  • @sudipanpaul805
    @sudipanpaul805 9 months ago +1

    Love you, bro. I got my joining letter from NASA as a Scientific Officer-1; believe me, your videos have always helped me in my research work.

  • @dylanwatts4463
    @dylanwatts4463 3 years ago +4

    Amazing video! Really clearly explained! Keep em coming!

    • @ritvikmath
      @ritvikmath  3 years ago +1

      Glad you liked it!

  • @jlpicard7
    @jlpicard7 5 months ago

    I've seen everything in this video many, many times, but no one had done as good a job as this in pulling these ideas together in such an intuitive and understandable way. Well done and thank you!

  • @chenqu773
    @chenqu773 1 year ago +5

    For me, the coolest thing about statistics is that every time I do a refresh on these topics, I get some new ideas or understandings. It's lucky that I came across this video after a year, which could also explain why we need to "normalize" the X (0-centered, with stdev = 1) before we feed it into an MLP model, if we use regularization terms in the layers.
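    The scaling point in this comment can be checked directly: the ridge penalty is not invariant to rescaling a column of X, which is why standardizing features before a regularized fit matters. A minimal numpy sketch (toy, made-up data; closed-form ridge used purely for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ridge(X, y, lam):
        """Closed-form ridge solution: (X'X + lam*I)^{-1} X'y."""
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

    # Toy data (made up for illustration)
    X = rng.normal(size=(50, 2))
    y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=50)

    lam = 10.0
    beta = ridge(X, y, lam)

    # Rescale the second feature by 1000; the "same" model would just need
    # that coefficient divided by 1000, but the L2 penalty now weights it
    # differently, so the fitted values change.
    X2 = X.copy()
    X2[:, 1] *= 1000.0
    beta2 = ridge(X2, y, lam)

    fitted1 = X @ beta
    fitted2 = X2 @ beta2
    print(np.allclose(fitted1, fitted2))  # False: ridge is not scale-invariant
    ```

    With plain OLS (lam = 0) the two fits would give identical fitted values, since the rescaling is absorbed into the coefficient; the penalty breaks that equivalence, which is exactly why X is standardized first.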

  • @mohammadkhalkhali9635
    @mohammadkhalkhali9635 3 years ago +2

    Man I'm going to copy-paste your video whenever I want to explain regularization to anyone! I knew the concept but I would never explain it the way you did. You nailed it!

  • @FRequena
    @FRequena 3 years ago +1

    Super informative and clear lesson! Thank you very much!

  • @davidelicalsi5915
    @davidelicalsi5915 1 year ago +1

    Brilliant and clear explanation, I was struggling to grasp the main idea for a Machine Learning exam but your video was a blessing. Thank you so much for the amazing work!

  • @umutaltun9049
    @umutaltun9049 1 year ago

    It just blew my mind too. I can feel you, brother. Thank you!

  • @rishabhbhatt7373
    @rishabhbhatt7373 1 year ago

    Really good explanation. I really like how you gave context and connected all the topics together, and it makes perfect sense, while maintaining the perfect balance between math and intuition. Great work. Thank you!

  • @swapnajoysaha6982
    @swapnajoysaha6982 2 months ago +1

    I used to be afraid of Bayesian Linear Regression until I saw this vid. Thank you sooo much

    • @ritvikmath
      @ritvikmath  2 months ago

      Awesome! You're welcome

  • @antaresd1
    @antaresd1 8 months ago

    Thank you for this amazing video, It clarified many things to me!

  • @mateoruizalvarez1733
    @mateoruizalvarez1733 3 months ago

    Crystal clear! Thank you so much, the explanation is very structured and detailed

  • @sebastianstrumbel4335
    @sebastianstrumbel4335 3 years ago +3

    Awesome explanation! Especially the details on the prior were so helpful!

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad it was helpful!

  • @icybrain8943
    @icybrain8943 3 years ago +22

    Regardless of how they were really initially devised, seeing the regularization formulas pop out of the bayesian linear regression model was eye-opening - thanks for sharing this insight

    • @dennisleet9394
      @dennisleet9394 1 year ago

      Yes. This really blew my mind. Boom.

  • @TejasEkawade
    @TejasEkawade 6 months ago

    This was an excellent introduction to Bayesian Regression. Thanks a lot!

  • @dodg3r123
    @dodg3r123 3 years ago +2

    Love this content! More examples like this are appreciated

  • @chiawen.
    @chiawen. 8 months ago

    This is sooo clear. Thank you so much!

  • @ezragarcia6910
    @ezragarcia6910 1 year ago

    My mind exploded with this video. Thank you.

  • @qiguosun129
    @qiguosun129 2 years ago

    Excellent tutorial! I have applied ridge as the loss function in different models.
    However, this is the first time I have understood the mathematical meaning of lambda. It is really cool!

  • @benjtheo414
    @benjtheo414 9 months ago

    This was awesome, thanks a lot for your time :)

  • @Structuralmechanic
    @Structuralmechanic 4 months ago

    Amazing, you kept it simple and showed how the regularization terms in linear regression originated from the Bayesian approach!! Thank you!

  • @julissaybarra4031
    @julissaybarra4031 6 months ago

    This was incredible, thank you so much.

  • @AntonioMac3301
    @AntonioMac3301 2 years ago

    This video is amazing!!! so helpful and clear explanation

  • @marcogelsomini7655
    @marcogelsomini7655 1 year ago

    very cool the link you explained between regularization and prior

  • @FB0102
    @FB0102 1 year ago

    truly excellent explanation; well done

  • @chuckleezy
    @chuckleezy 10 months ago

    you are so good at this, this video is amazing

    • @ritvikmath
      @ritvikmath  10 months ago

      Thank you so much!!

  • @javiergonzalezarmas8250
    @javiergonzalezarmas8250 1 year ago

    Incredible explanation!

  • @alim5791
    @alim5791 2 years ago

    Thanks, that was a good one. Keep up the good work!

  • @dirknowitzki9468
    @dirknowitzki9468 2 years ago

    Your videos are a Godsend!

  • @JohnJones-rp2wz
    @JohnJones-rp2wz 3 years ago +2

    Awesome explanation!

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 2 years ago +3

    This is truly cool. I had the same thing with the lambda. It’s good to know that it was not some engineering trick.

  • @juliocerono5193
    @juliocerono5193 2 months ago

    At last!! I could find an explanation for the lasso and ridge regression lambdas!!! Thank you!!!

    • @ritvikmath
      @ritvikmath  2 months ago

      Happy to help!

  • @joachimrosenberger2109
    @joachimrosenberger2109 1 year ago

    Thanks a lot! Great! I am reading Elements of Statistical Learning and did not understand what they were talking about. Now I got it.

  • @curiousobserver2006
    @curiousobserver2006 1 year ago

    This blew my mind.Thanks

  • @mohammadmousavi1
    @mohammadmousavi1 11 months ago

    Unbelievable, you explained linear reg, explained in simple terms Bayesian stat, and showed the connection under 20min .... Perfect

  • @brandonjones8928
    @brandonjones8928 2 months ago

    This is an awesome explanation

  • @millch2k8
    @millch2k8 1 year ago

    I'd never considered a Bayesian approach to linear regression let alone its relation to lasso/ridge regression. Really enlightening to see!

  • @nirmalpatil5370
    @nirmalpatil5370 2 years ago

    This is brilliant, man! Brilliant! Literally solved where the lambda comes from!

  • @caiocfp
    @caiocfp 3 years ago +3

    Thank you for sharing this fantastic content.

    • @ritvikmath
      @ritvikmath  3 years ago +1

      Glad you enjoy it!

  • @Maciek17PL
    @Maciek17PL 1 year ago

    You are a great teacher thank you for your videos!!

  • @juliocerono_stone5365
    @juliocerono_stone5365 2 months ago

    At last!!! Now I can see what lambda was doing in the lasso and ridge regression!! Great video!!

    • @ritvikmath
      @ritvikmath  2 months ago

      Glad you liked it!

  • @shantanuneema
    @shantanuneema 2 years ago

    You got a subscriber, awesome explanation. I spent hours learning it from other sources, but with no success. You are just great

  • @mahdijavadi2747
    @mahdijavadi2747 2 years ago

    Thanks a lottttt! I had so much difficulty understanding this.

  • @rmiliming
    @rmiliming 1 year ago

    Thanks a lot for this clear explanation!

  • @amirkhoutir2649
    @amirkhoutir2649 1 year ago

    thank you so much for the great explanation

  • @houyao2147
    @houyao2147 3 years ago +1

    What a wonderful explanation!!

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad you think so!

  • @Life_on_wheeel
    @Life_on_wheeel 3 years ago

    Thanks for the video. It's really helpful. I was trying to understand where the regularization terms come from. Now I get it. Thanks!

  • @dmc-au
    @dmc-au 1 year ago

    Wow, killer video. This was a topic where it was especially nice to see everything written on the board in one go. Was cool to see how a larger lambda implies a more pronounced prior belief that the parameters lie close to 0.

    • @ritvikmath
      @ritvikmath  1 year ago

      I also think it’s pretty cool 😎

  • @SamuelMMuli-sy6wk
    @SamuelMMuli-sy6wk 2 years ago

    wonderful stuff! thank you

  • @chenjus
    @chenjus 2 years ago

    This is the best explanation of L1 and L2 I've ever heard

  • @samirelamrany5323
    @samirelamrany5323 1 year ago

    perfect explanation thank you

  • @kennethnavarro3496
    @kennethnavarro3496 2 years ago

    Thank you very much. Pretty helpful video!

  • @undertaker7523
    @undertaker7523 1 year ago

    You are the go-to for me when I need to understand topics better. I understand Bayesian parameter estimation thanks to this video!
    Any chance you can do something on the difference between Maximum Likelihood and Bayesian parameter estimation? I think anyone that watches both of your videos will be able to pick up the details but seeing it explicitly might go a long way for some.

  • @matthewkumar7756
    @matthewkumar7756 2 years ago

    Mind blown on the connection between regularization and priors in linear regression

  • @vipinamar8323
    @vipinamar8323 2 years ago

    Great video with a very clear explanation. Could you also do a video on Bayesian logistic regression?

  • @narinpratap8790
    @narinpratap8790 3 years ago +7

    Awesome video. I didn't realize that the L1, L2 regularization had a connection with the Bayesian framework. Thanks for shedding some much needed light on the topic. Could you please also explain the role of MCMC Sampling within Bayesian Regression models? I recently implemented a Bayesian Linear Regression model using PyMC3, and there's definitely a lot of theory involved with regards to MCMC NUTS (No U-Turn) Samplers and the associated hyperparameters (Chains, Draws, Tune, etc.). I think it would be a valuable video for many of us.
    And of course, keep up the amazing work! :D

    • @ritvikmath
      @ritvikmath  3 years ago +2

      good suggestion!

  • @j29Productions
    @j29Productions 4 months ago

    You are THE LEGEND

  • @manishbhanu2568
    @manishbhanu2568 1 year ago

    you are a great teacher!!!🏆🏆🏆

  • @kaartiki1451
    @kaartiki1451 2 months ago

    Legendary video

  • @axadify
    @axadify 2 years ago

    Such a nice explanation. I mean, that's the first time I actually understood it.

  • @alexanderbrandmayr7408
    @alexanderbrandmayr7408 3 years ago

    Great video!!

  • @louisc2016
    @louisc2016 2 years ago

    Fantastic! You're my savior!

  • @haeunroh8945
    @haeunroh8945 2 years ago

    Your videos are awesome, so much better than my prof's

  • @hameddadgour
    @hameddadgour 1 year ago

    Holy shit! This is amazing. Mind blown :)

  • @souravdey1227
    @souravdey1227 1 year ago

    Can you please, please do a series on the categorical distribution, multinomial distribution, Dirichlet distribution, Dirichlet process, and finally non-parametric Bayesian tensor factorization, including clustering of streaming data? I will personally pay you for this. I mean it!!
    There are a few videos on these things on YouTube; some are good, some are way too high-level. But no one can explain the way you do.
    This simple video has such profound importance!!

  • @abdelkaderbousabaa7020
    @abdelkaderbousabaa7020 2 years ago

    Excellent thank you

  • @yulinliu850
    @yulinliu850 3 years ago +1

    Beautiful!

    • @ritvikmath
      @ritvikmath  3 years ago

      Thank you! Cheers!

  • @datle1339
    @datle1339 1 year ago

    very great, thank you

  • @rachelbarnes7469
    @rachelbarnes7469 3 years ago

    thank you so much for this

  • @Aviationlads
    @Aviationlads 7 months ago +1

    Great video, do you have some sources I can use for my university presentation? You helped me a lot 🙏 thank you!

  • @godse54
    @godse54 3 years ago +1

    Nice, I never thought of that 👍🏼👍🏼

  • @chenqu773
    @chenqu773 3 years ago

    Thank you very much

  • @jaivratsingh9966
    @jaivratsingh9966 1 year ago

    Excellent

  • @julianneuer8131
    @julianneuer8131 3 years ago +2

    Excellent!

    • @ritvikmath
      @ritvikmath  3 years ago

      Thank you! Cheers!

  • @karannchew2534
    @karannchew2534 1 year ago

    Notes for my future revision.
    *Prior β*
    10:30
    The value of the prior on β is normally distributed. A by-product of using a normal distribution is regularisation, because the prior keeps the values of β from straying too far from the mean.
    Regularisation keeps the values of β small.
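    The note above is the heart of the video's result, and it can be verified numerically: with a N(0, τ²I) prior on β and N(0, σ²I) noise, the MAP estimate coincides with ridge regression at λ = σ²/τ². A small numpy sketch with made-up toy data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data (made up): y = X @ beta + noise, noise sd = sigma
    n, p = 100, 3
    sigma, tau = 1.0, 0.5          # noise sd and prior sd on beta
    X = rng.normal(size=(n, p))
    beta_true = np.array([2.0, -1.0, 0.5])
    y = X @ beta_true + rng.normal(scale=sigma, size=n)

    # Ridge closed form with lambda = sigma^2 / tau^2
    lam = sigma**2 / tau**2
    beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

    # Same estimator written as the Gaussian posterior mean / MAP:
    # argmax of log N(y | X beta, sigma^2 I) + log N(beta | 0, tau^2 I)
    beta_map = np.linalg.solve(X.T @ X / sigma**2 + np.eye(p) / tau**2,
                               X.T @ y / sigma**2)

    print(np.allclose(beta_ridge, beta_map))  # True: identical estimators
    ```

    A tighter prior (smaller τ) means a larger λ and stronger shrinkage of β toward 0, which is exactly the regularisation described in the note.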

  • @TK-mv6sq
    @TK-mv6sq 1 year ago

    thank you!

  • @shipan5940
    @shipan5940 1 year ago +1

    Max ( P(this is the best vid explaining these regressions | YouTube) )

  • @petmackay
    @petmackay 3 years ago +4

    Most insightful! L1 as Laplacian toward the end was a bit skimpy, though. Maybe I should watch your LASSO clip. Could you do a video on elastic net? Insight on balancing the L1 and L2 norms would be appreciated.

    • @danielwiczew
      @danielwiczew 2 years ago +1

      Yea, Elasticnet and comparison to Ridge/Lasso would be very helpful

  • @imrul66
    @imrul66 10 months ago

    Great video. The relation between the prior and the LASSO penalty was a "wow" moment for me. It would be helpful to see an actual computation example in Python or R. A common problem I see in Bayesian lectures is too much focus on the math rather than showing how, and by how much, the resulting parameters actually differ; especially when to consider the Bayesian approach over OLS.

  • @yodarocco
    @yodarocco 1 year ago

    At the end I understood it too, finally. A hint for people who also struggle with BR like me: do a Bayesian linear regression in Python from any tutorial that you find online; you are going to understand, trust me. I think one of the initial problems for a person facing the Bayesian approach is the fact that you are actually obtaining a posterior *of weights*! Now it looks kind of obvious, but at the beginning I was really stuck; I could not understand what the posterior was actually doing.

  • @ThePiotrekpecet
    @ThePiotrekpecet 1 year ago +1

    There is an error at the beginning of the video: in frequentist approaches X is treated as non-random covariate data and y is the random part, so the high variance of OLS should be expressed as small changes to y => big changes to the OLS estimator.
    Changes to the covariate matrix becoming big changes to the OLS estimator is more like the non-robustness of OLS w.r.t. outlier contamination.
    Also, the lambda should be 1/(2τ²), not σ²/τ², since:
    ln P(β) = -p·ln(τ√(2π)) - ‖β‖₂²/(2τ²)
    Overall this was very helpful, cheers!
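    For readers weighing the two λ scalings discussed in this comment, a sketch of the standard MAP derivation (Gaussian likelihood and Gaussian prior, as in the video):

    ```latex
    % MAP estimate under y ~ N(X\beta, \sigma^2 I), \beta ~ N(0, \tau^2 I)
    \hat{\beta}_{\mathrm{MAP}}
      = \arg\max_{\beta}\; \log p(y \mid X, \beta) + \log p(\beta)
      = \arg\min_{\beta}\; \frac{1}{2\sigma^2}\lVert y - X\beta \rVert_2^2
                         + \frac{1}{2\tau^2}\lVert \beta \rVert_2^2
    % Multiplying the objective by 2\sigma^2 leaves the argmin unchanged:
      = \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2
                         + \frac{\sigma^2}{\tau^2}\lVert \beta \rVert_2^2
    ```

    Both parameterizations describe the same estimator: λ = σ²/τ² arises after the objective is multiplied through by 2σ² so the squared-error term is unscaled, while 1/(2τ²) is the prior's coefficient before that rescaling.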

  • @jairjuliocc
    @jairjuliocc 3 years ago +3

    Thank you, I saw this before but I didn't understand. Please, where can I find the complete derivation? And maybe you can do a complete series on this topic

  • @AnotherBrickinWall
    @AnotherBrickinWall 11 months ago

    Great, thanks! .. was feeling the same discomfort about the origin of these...

  • @allanvieiradecastroquadros1391
    @allanvieiradecastroquadros1391 1 year ago

    Respect! 👏

  • @andresilva9140
    @andresilva9140 1 year ago +1

    Awesome

  • @AYANShred123
    @AYANShred123 2 years ago

    Wonderfully explained! Mathematicians deserve more subscribers!

  • @JorgeGomez-kt3oq
    @JorgeGomez-kt3oq 8 months ago

    Great video, just a question: where can I find a worked example of the algebra?

  • @convex9345
    @convex9345 3 years ago

    mind boggling

  • @vinceb8041
    @vinceb8041 3 years ago +1

    Amazing! But where did Ridge and Lasso start from? Were they invented with Bayesian statistics as a starting point, or is that a duality that came later?