Maximum Likelihood : Data Science Concepts

  • Published Jan 2, 2025

Comments • 63

  • @llmstr
    @llmstr 25 days ago +1

    great stuff man. the way you combine the maths and the explanations in a not-too-dumbed-down style makes this channel truly a gem. thanks a lot

  • @Aruuuq
    @Aruuuq 10 months ago +5

    Undeniably you're creating some of the best videos concerning statistics out there. And this is another one. Thank you so much

  • @bannerlad01
    @bannerlad01 3 years ago +29

    You are a brilliant teacher - thanks so much for doing this

  • @abhishekchandrashukla3814
    @abhishekchandrashukla3814 5 months ago +1

    I swear to god, when I was searching for maximum likelihood estimation, I was hoping you would make a video on this, scrolled down a bit and bingo!! I see Rithvik Math. My happiness knows no bounds. Thank you for existing.

  • @xxshogunflames
    @xxshogunflames 3 years ago +6

    The collection of info *chef's kiss*

  • @DaquanMHall
    @DaquanMHall 2 years ago +1

    Man I watch your videos all the time. I can write the code and understand the outcome but you’re the only way I can understand the math. Thanks!

  • @corybeyer1
    @corybeyer1 3 years ago +12

    this is art

  • @KareemMnour
    @KareemMnour 3 years ago +1

    Thank you so much for preferring to actually help people understand concepts rather than throwing fancy multi-step jargon that gets people frustrated at math topics. I would do anything I can to help keep the channel alive and I will recommend your channel to all my friends and colleagues. Thanks again and keep up the excellent work.

  • @mehtipsz
    @mehtipsz 3 years ago +4

    As always, great videos! I mainly use them as supplement to masters level courses. What I love are the parts where you cover the intuitions about the formulas, it makes them so much more understandable. Keep up the good work!

  • @ИльяХренков-б8я
    @ИльяХренков-б8я 3 years ago +7

    Love your videos! Very nice to revise + learn new things, not missing out on intuition either. Hope your follower count will soar soon.

  • @moravskyvrabec
    @moravskyvrabec 1 year ago +2

    Great stuff. I'm taking an online MIT class. Complicated topic? I come to your channel to solidify my understanding!

    • @ritvikmath
      @ritvikmath 1 year ago

      Glad it was helpful!

  • @dataman1000
    @dataman1000 2 years ago +1

    Just Brilliant! thanks for demystifying logistic regression equations for me🤝

  • @omniscienceisdead8837
    @omniscienceisdead8837 2 years ago +2

    this was a very beautiful lecture

  • @rmiliming
    @rmiliming 2 years ago +1

    Excellently explained. Very clear and logical! Tks!

  • @TheScawer
    @TheScawer 3 years ago +1

    Thank you for the video! I wanted to say they are great for revision, but I usually learn a lot more than I did in school on the topic... so thank you!

  • @alessandro5847
    @alessandro5847 3 years ago +1

    Thanks for these lectures. You're great at explaining this stuff. Keep it up!

  • @MariaBarcoj
    @MariaBarcoj 1 year ago +1

    Thanks for making things seem quite a bit simpler ☺

  • @yashpundir2044
    @yashpundir2044 3 years ago +1

    Just 3K views on this? people are crazy. This deserves wayyyy more.

  • @dilinijayasinghe8134
    @dilinijayasinghe8134 7 months ago

    Thank you very much. Been struggling to get the intuition of MLE and you helped me to understand it. Would be awesome if you could do a video on GMM estimation. Thank you!!!

  • @yerzhant701
    @yerzhant701 2 years ago +2

    Shouldn't likelihood be the inverse of probability P(y|x,beta), i.e. L(beta|x,y)?

    • @kisholoymukherjee
      @kisholoymukherjee 11 months ago

      exactly my thoughts. From what I read from other sources, Likelihood is given by L(parameters or distribution|observed data). Perhaps @ritvikmath can explain better
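
The distinction raised in this thread can be made concrete: P(y | x, beta) and L(beta | x, y) are the same number read two ways — the probability treats the data as variable with beta fixed, while the likelihood treats beta as variable with the data fixed. A minimal sketch, using a hypothetical coin-flip example rather than the video's regression setup:

```python
# Hypothetical coin-flip example (not from the video): the same product of
# probabilities is "probability of the data given p" or "likelihood of p
# given the data", depending on which argument is held fixed.

def bernoulli_pmf(y, p):
    """P(y | p): probability of outcome y (1 or 0) for a coin with bias p."""
    return p if y == 1 else 1 - p

# Fixed, observed data: 7 heads and 3 tails.
data = [1] * 7 + [0] * 3

def likelihood(p, data):
    """L(p | data): the same product, now read as a function of p."""
    out = 1.0
    for y in data:
        out *= bernoulli_pmf(y, p)
    return out

# Scanning a grid of p values, the likelihood peaks at the sample mean 0.7.
best = max((k / 100 for k in range(1, 100)), key=lambda p: likelihood(p, data))
print(best)  # 0.7
```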

  • @shubhampandilwar8448
    @shubhampandilwar8448 3 years ago

    very well explained. I am gaining confidence from these fundamentals lectures.

  • @yodarocco
    @yodarocco 1 year ago +1

    Have you ever done a video on Maximum a Posteriori (MAP)?

  • @bobby2636
    @bobby2636 1 year ago

    Question: In 8:03, you're introducing the conceptions of the likelihood, which from my understanding is the probability of real observation emerging given the y; but in the formula, it looks like the posterior probability, not likelihood, is there something missing?

  • @yulinliu850
    @yulinliu850 3 years ago

    Thanks for the great lecture. I really liked the word "seeing" outcomes.

  • @fszhang9010
    @fszhang9010 3 years ago +1

    Great & helpful video! At 12:40 you say: "This is the probability of seeing all the real world outcomes that I actually see in my data". I think it's better to replace "real world" with "predicted" or another synonym, since the former may mislead viewers into thinking those "outcomes" are records of events that actually happened, which they are not; they stem from the model's prediction. It's the data (y: nx1) that records the real result. At 20:05 it's expressed the correct way.

  • @NickKravitz
    @NickKravitz 3 years ago +1

    In English most people use the terms Probability and Likelihood interchangeably - I can't help but correct this when I hear it. One nuance is that the Maximum Likelihood result is often very small, meaning the parameter value isn't very likely, it is just more likely than the alternatives. Ranked Choice Voting is designed to promote the Most Likable Choice past the 50% threshold. Great video as always; I hope you become a stats and data science professor!

  • @kaym2332
    @kaym2332 3 years ago

    Amazing style of teaching. Thank you!

  • @oneclickdiy
    @oneclickdiy 3 years ago

    thank you! these videos are a good refresher

  • @jakobforslin6301
    @jakobforslin6301 3 years ago

    You're an awesome teacher, thanks a lot!

  • @arshadkazi4559
    @arshadkazi4559 2 years ago +1

    excellent explanation, very good as an introduction. Can you make something which delves into the maths even more? An explanation of the last part is necessary and would be fun to understand. :)

  • @jansanda544
    @jansanda544 3 years ago +1

    Amazing video. But during the whole time, I was distracted by figuring out what number series is on the tattoo. :D

  • @amjedbelgacem8218
    @amjedbelgacem8218 2 years ago

    This guy makes Machine Learning easy bro, subscribed

  • @tianhelenaa
    @tianhelenaa 2 years ago

    This is truly amazing!!!

  • @aminmohammadigolafshani2015
    @aminmohammadigolafshani2015 2 years ago +1

    Amazing! Amazing! thanks a lot

  • @harshads885
    @harshads885 3 years ago

    In the logistic regression part on the left, it's probably better to call out that the probability p is not the same as the number of data points p.

  • @bryany7344
    @bryany7344 3 years ago +1

    Can I know what the difference is between log likelihood and negative log likelihood, graphically? How do I choose which of the loss functions to use?

  • @ChocolateMilkCultLeader
    @ChocolateMilkCultLeader 3 years ago

    keep putting out your bangers. I use them to learn how to communicate concepts. Shared this one with my network

  • @BehrouzShamsaei
    @BehrouzShamsaei 1 year ago

    Thanks for the video, will you be able to guide to a reference about why EM converges to a maximum, either local or global?

  • @jijie133
    @jijie133 2 years ago

    I love your videos.

  • @prateekcaire4193
    @prateekcaire4193 8 months ago

    What should the probability P(y_i | x_i, beta) be when the actual y_i is reject (= 0)? If P(y_i | x_i, beta) is close to 0 or equal to 0, the likelihood will not be maximized even though the beta parameters are fine-tuned.
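
The worry in this comment dissolves once the per-point factor is written out: each observation contributes p_i^y_i * (1 - p_i)^(1 - y_i), so a rejection (y_i = 0) contributes 1 - p_i, which is close to 1 when the model correctly assigns a low p_i. A minimal sketch with made-up one-feature data (not the video's example):

```python
import math

def sigmoid(z):
    """Logistic function mapping a score to a probability in (0, 1)."""
    return 1 / (1 + math.exp(-z))

def log_likelihood(beta, xs, ys):
    """Sum of log[ p_i^y_i * (1 - p_i)^(1 - y_i) ] over the data."""
    total = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(beta * x)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return total

# A rejected point (y = 0) with a confidently low p contributes
# log(1 - p) ~ 0, i.e. it raises the likelihood rather than ruining it.
xs, ys = [-2.0, -1.0, 1.0, 2.0], [0, 0, 1, 1]
print(log_likelihood(1.0, xs, ys) > log_likelihood(-1.0, xs, ys))  # True
```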

  • @TheRish123
    @TheRish123 3 years ago

    What a guy! Amazing stuff

  • @goelnikhils
    @goelnikhils 2 years ago

    Amazing Content. Thanks

  • @maneechotunpuang5299
    @maneechotunpuang5299 3 years ago +1

    Your videos are absolutely helpful!! You're soooo damn good a teacher and really good at turning complicated lessons into something easier to digest. I hope I can pass this semester with your videos, bc without them it would be even worse! 😂 THANKS A MILLION ❤️

  • @fatriantobong8169
    @fatriantobong8169 1 year ago

    Hmmm, how do you bind the Yi to the sigmoid function..

  • @montycardman2535
    @montycardman2535 1 year ago

    would the likelihood function be between 0 and 1?

  • @Whatever20237
    @Whatever20237 1 year ago

    WOW! Thank you!

  • @hameddadgour
    @hameddadgour 2 years ago +1

    The Gods of Statistics finally decided to send us a prophet :)

    • @Leila0S
      @Leila0S 8 months ago +1

      100%
      He is a magician
      I don’t understand how smoothly he makes the concept sweep into one’s brain

  • @robertpollock8617
    @robertpollock8617 1 year ago

    I am confused. You are saying the probability and likelihood are the same, according to what you have written in your equations. For likelihood, are you not trying to say: given acceptance into med school, the likelihood of having these values for GPA, MCAT score, etc.? For instance, if probability is P(y|x) then likelihood is L(x|y)? You have these two being equal.

  • @Mewgu_studio
    @Mewgu_studio 2 years ago

    Gold!

  • @fotiskamanis8592
    @fotiskamanis8592 1 year ago

    Thank you!!!

  • @asadkhanbb
    @asadkhanbb 3 years ago

    Wow that t-shirt ❣️❣️❣️ cool 😎

  • @ling5544
    @ling5544 2 years ago

    When the derivative is 0, it could also be a local minimum, right? How do we ensure that when the derivative is 0 the likelihood is maximized?

    • @ritvikmath
      @ritvikmath 2 years ago

      while it's true that derivative=0 could mean min or max, we can distinguish since a min has a decreasing gradient on the left and increasing gradient on the right. a max is the opposite. hope that helps!

    • @ling5544
      @ling5544 2 years ago

      @@ritvikmath thanks! I got it.
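
ritvikmath's reply can be checked numerically: a Bernoulli log-likelihood is concave, so the second derivative at the critical point is negative, which confirms a maximum rather than a minimum. A small sketch, using a hypothetical 7-heads-in-10-flips example (not the video's data):

```python
import math

def loglik(p):
    """Log-likelihood of 7 heads in 10 Bernoulli trials:
    l(p) = 7*log(p) + 3*log(1 - p)."""
    return 7 * math.log(p) + 3 * math.log(1 - p)

def second_derivative(f, x, h=1e-5):
    """Central finite-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

# The critical point of l(p) is at the sample mean p = 0.7; a negative
# second derivative there confirms it is a maximum, not a minimum.
print(second_derivative(loglik, 0.7) < 0)  # True
```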

  • @ianstats97
    @ianstats97 1 year ago

    Great video, I just did not understand where the sigma came from?

  • @ireoluwaTH
    @ireoluwaTH 3 years ago +1

    Neat 'Mathematese'...

  • @redherring0077
    @redherring0077 3 years ago

    Please marry me😍😂😂. I can listen to you forever. Such a passionate teacher!

  • @akashswain7939
    @akashswain7939 9 months ago

    L