naive bayes classifier | Introduction to Naive Bayes Theorem | Machine Learning Algorithm (2019)

  • Published Jan 25, 2025

Comments • 86

  • @GelsYT
    @GelsYT 5 years ago +50

    I've seen a lot of Naive Bayes explanations, but this one is the best, and it's even worked out on paper!
    THANK YOU SO MUCH. Thank you, Indian guy; Indians are one of the most intelligent peoples on this planet

    • @theu.cgamer2923
      @theu.cgamer2923 4 years ago

      His name is Adarsh; he is my neighbour

  • @pepelevamp2752
    @pepelevamp2752 5 years ago +4

    Thank you, dude. Your approach of putting the words into the P(thing|thing) formulas is golden. I've watched a lot of videos trying to teach this concept, and they just hit you over the head with the formula with little relevance to the variables, and it's awful. People are actually awful at teaching this. But you're actually doing it right. Thank you.

  • @parulbhaskar6138
    @parulbhaskar6138 3 years ago +3

    Very clear and detailed explanation. Thanks so much!

  • @shruthihn2320
    @shruthihn2320 5 years ago +2

    Excellent explanation. Now I understand why we use the Naive Bayes algorithm for text classification.

  • @alfinwijaya7688
    @alfinwijaya7688 4 years ago +2

    2:14 I thought it was P(Sport|A very close game) and P(Non sport|A very close game), because if you write P(A very close game|Sport) based on Bayes' Theorem it will be P(Sport|A) P(Sport|very) P(Sport|close) P(Sport|game). Not a big deal though, just to clarify
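
    Editor's note on this exchange: the decomposition Naive Bayes actually uses conditions each word on the class, i.e. P(class | sentence) ∝ P(class) · Π P(word | class); the per-word factors are "word given class", not the reverse. A minimal sketch, with hypothetical likelihood numbers chosen only to show the shape of the computation:

    ```python
    def posterior_score(class_prior, word_likelihoods):
        """Unnormalized P(class | sentence) under the naive independence assumption."""
        score = class_prior                # start from the prior P(class)
        for p in word_likelihoods:         # one P(word | class) factor per word
            score *= p
        return score

    # Hypothetical numbers, only to illustrate the computation:
    sports = posterior_score(0.6, [0.12, 0.08, 0.04, 0.12])
    non_sports = posterior_score(0.4, [0.09, 0.04, 0.09, 0.04])
    print("Sports" if sports > non_sports else "Non-sports")
    ```

    The argmax over these unnormalized scores is all the classifier needs; the shared denominator P(sentence) never has to be computed.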

  • @gayatritl371
    @gayatritl371 4 years ago +1

    I loved your voice... explanation too 👌👌👌❤️

  • @saikumaryeruboina697
    @saikumaryeruboina697 5 years ago +2

    Worth watching
    Really great explanation... 👏👏👏

  • @vipulvivek3874
    @vipulvivek3874 4 years ago +2

    Exactly what I was looking for.

  • @keerumbs4605
    @keerumbs4605 4 years ago +1

    Because of you we got a 10 GPA in our final-year project 🙏... Thanks anna..

  • @sudheerHum
    @sudheerHum 6 years ago +5

    That was really nice... Can we expect a whole Machine Learning series from your channel? 😀

    • @CodeWrestling
      @CodeWrestling  6 years ago

      Yeah, definitely.. we will be uploading the whole series soon.

  • @electricindro2236
    @electricindro2236 3 years ago

    Very nice explanation.

  • @rebeccageorgiana565
    @rebeccageorgiana565 5 years ago +1

    Such clarity in the explanation; waiting for more algos

  • @infopedia_life_facts
    @infopedia_life_facts 5 years ago

    Bro, you are great.... Thanks for explaining the NB algorithm in such a simple manner.

  • @chandinigowda8906
    @chandinigowda8906 4 years ago +1

    Bro, can you explain worked problems on this Naive Bayes classifier please?

  • @hemangdhanani9434
    @hemangdhanani9434 3 years ago

    Super explanation, hats off.. please make some more videos on ML concepts

  • @karolisaand
    @karolisaand 4 years ago +1

    What a great explanation!

  • @siddhardharao6631
    @siddhardharao6631 3 years ago

    Perfect explanation. Do you have any actual code to share?

  • @sohelahmadm4918
    @sohelahmadm4918 3 years ago

    Your handwriting is so neat, dude

  • @goyalnaman99
    @goyalnaman99 4 years ago

    Distinct words in training only, or also including test?

  • @namenone8387
    @namenone8387 4 years ago

    Best explanation, man, thank you! You just earned a subscriber.

    • @CodeWrestling
      @CodeWrestling  4 years ago

      Thanks for the sub! Stay tuned and also share with your friends

  • @honeyb9698
    @honeyb9698 3 years ago

    I love your voice 😍 oh yes... And your explanations too!
    Thank you ❤️

  • @SohamisRock
    @SohamisRock 4 years ago +1

    At the end, where you were deducing which category the sentence falls in, you did not actually use Bayes' formula. You only calculated the P(B|A) part, where B = "A very close game" and A = Sports/Non-sports. Am I right? If not, please clarify.
    Thanks.

  • @archanacr3171
    @archanacr3171 5 years ago

    Thank you so much..... very nice explanation
    Please make more videos on ML

  • @LetsMazze
    @LetsMazze 5 years ago

    That's what was needed... thank you so much, brother 👏

  • @RealMatinMahmoudi
    @RealMatinMahmoudi 5 years ago +7

    Dude, you didn't calculate P(Sport) and P(Non-sport); you should multiply them in at the end

    • @harunalperentoktas4983
      @harunalperentoktas4983 5 years ago +4

      Yes, you are right. He did not multiply by the prior; there was a mistake. P(Sport) = 3/5, P(NoSport) = 2/5. If you multiply those in, you get it right

    • @scxdb9848
      @scxdb9848 5 years ago

      Yes, the prior probability is not multiplied in

    • @ShimmerCloudz
      @ShimmerCloudz 3 years ago

      The posterior formula says P(A|B) = P(B|A) * P(A) / P(B).
      Then P(A very close game|Sport) should be
      P(A very close game|Sport) = P(Sport|A) P(Sport|very) P(Sport|close) P(Sport|game) * P(Sports) / P(A very close game).
      Please clarify this point
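
    Editor's note, to make this thread concrete: a sketch of the full calculation with the prior multiplied in, as the commenters request. The five training sentences are assumed (they match the common toy dataset this style of tutorial uses, and fit the counts quoted elsewhere in the thread: 3 Sports vs 2 Non-sports sentences, 11 words in the Sports class) and are not confirmed by the video itself.

    ```python
    from collections import Counter

    # Assumed training data -- inferred from the comments, not confirmed.
    train = [
        ("A great game", "Sports"),
        ("The election was over", "Non-sports"),
        ("Very clean match", "Sports"),
        ("A clean but forgettable game", "Sports"),
        ("It was a close election", "Non-sports"),
    ]

    def tokenize(text):
        return text.lower().split()

    word_counts = {}                 # per-class word frequencies
    class_counts = Counter()         # sentences per class
    for text, label in train:
        word_counts.setdefault(label, Counter()).update(tokenize(text))
        class_counts[label] += 1

    vocab = {w for text, _ in train for w in tokenize(text)}  # 14 distinct words

    def score(sentence, label, alpha=1):
        total = sum(word_counts[label].values())   # 11 words in the Sports class
        p = class_counts[label] / len(train)       # the prior P(class) -- the step under discussion
        for w in tokenize(sentence):
            # Laplace-smoothed likelihood P(word | class)
            p *= (word_counts[label][w] + alpha) / (total + alpha * len(vocab))
        return p

    for label in ("Sports", "Non-sports"):
        print(label, score("A very close game", label))
    ```

    With these assumed sentences, "A very close game" still comes out as Sports; the prior (3/5 vs 2/5) scales both scores but does not flip the winner here, which may be why its omission went unnoticed in the video.
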

  • @_curiosity...8731
    @_curiosity...8731 4 years ago

    P(a | sports) should be 2/3. Please correct me if I am wrong.

  • @leekhajindal1700
    @leekhajindal1700 5 years ago +1

    You are a genius, keep it up. Please help me implement the Naive Bayes example you taught us in Python. Please show how to implement it with the same example.

  • @akankshapandey2561
    @akankshapandey2561 5 years ago

    I wanted to ask: why have you used Laplace smoothing for the non-zero frequencies also? Please explain

  • @ccuuttww
    @ccuuttww 4 years ago

    Does your final answer miss the prior?

  • @supriyapejavara
    @supriyapejavara 6 years ago

    Thank you so much, sir... I was searching for an explanation of this algorithm for a long time.. Very well explained 👍 It helped me a lot 😊

    • @CodeWrestling
      @CodeWrestling  6 years ago +1

      Thank you, Supriya.... Stay tuned with us.. We will try to help as much as we can....

    • @vidyashirodkar9686
      @vidyashirodkar9686 6 years ago +1

      Kudos to you, sir.... a big 🙏..... Please provide explanations of other ML algorithms!!

  • @sandeeppanchal8615
    @sandeeppanchal8615 5 years ago

    Very nicely explained, @Code Wrestling. But I have a doubt about your explanation of Laplace smoothing. Alpha is not necessarily 1; it can be 1, 10, 100, 10000. And 'd' is not the number of distinct words but the number of distinct categories, i.e. 2 (sports and non-sports). Correct me if I am wrong.
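
    Editor's note: the commenter is right that alpha is a free hyperparameter, but in the standard multinomial formulation the video appears to follow, the smoothed distribution is over words, so d is the vocabulary size (distinct words across the training set), not the number of categories. A minimal sketch, using the counts quoted in this thread (11 words in the Sports class, 14 distinct words overall) as assumed inputs:

    ```python
    def smoothed_likelihood(word_count, words_in_class, vocab_size, alpha=1.0):
        """Laplace/Lidstone-smoothed estimate of P(word | class).

        alpha is a tunable hyperparameter (alpha=1 is classic add-one
        smoothing). The denominator adds alpha once per distinct word in
        the training vocabulary, so the smoothed probabilities over the
        whole vocabulary still sum to 1 for each class.
        """
        return (word_count + alpha) / (words_in_class + alpha * vocab_size)

    # A word unseen in the Sports sentences gets a small nonzero probability:
    print(smoothed_likelihood(0, 11, 14))  # 1/25 = 0.04
    # A word seen twice in Sports (e.g. the "2 + 1 over 11 + 14" case):
    print(smoothed_likelihood(2, 11, 14))  # 3/25 = 0.12
    ```

    Adding alpha to the non-zero counts as well (the question raised a few comments up) is what keeps the per-class distribution normalized.
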

  • @SACHINKUMARSINGH01
    @SACHINKUMARSINGH01 5 years ago

    Awesome. Hats off to you.

  • @swetharanik1822
    @swetharanik1822 6 years ago

    Good explanation..... please post videos about Bayesian learning

  • @unnatiagarwal2306
    @unnatiagarwal2306 5 years ago

    Thank you so much for this video.
    It helped a lot!! ♥️

  • @anggun3170
    @anggun3170 4 years ago

    Thank you for the explanation.

  • @karthik95abi
    @karthik95abi 4 years ago

    But the probability of Sports is not multiplied in, which should be done according to Naive Bayes: likelihood * prior. I don't know why the prior probability is missing

  • @robertjames7014
    @robertjames7014 5 years ago +2

    You are very smart, and your video was informative. A little constructive criticism: if you are going to teach something in English and you have a foreign accent, you have to slow down when you speak.

    • @CodeWrestling
      @CodeWrestling  5 years ago

      Thanks a lot!! I will surely work on your suggestion.

  • @thoughtwave5130
    @thoughtwave5130 6 years ago +1

    Conceptual Q: Does Naive Bayes assume independence? Because I guess you can only apply the formula if the events are independent. But in the real world, events are not really independent.

    • @CodeWrestling
      @CodeWrestling  6 years ago

      Yeah, it assumes the events to be independent. For dependent events we have other classifiers. #codewrestling

    • @thoughtwave5130
      @thoughtwave5130 5 years ago

      @@CodeWrestling Are they also probabilistic? Any examples?

  • @dhananjayhawal3102
    @dhananjayhawal3102 4 years ago +1

    Great explanation! But you should have multiplied by P(Sports) or P(Non-sports) in the respective formulas

  • @malthehansen7915
    @malthehansen7915 5 years ago +1

    You need to work at a uni.
    You make me proud to be Indian, bhai, seriously.
    Clear, step-by-step explanation while leaving out unnecessary info and focusing on fundamentals.
    You have gained a subscriber.

  • @prashantkher8580
    @prashantkher8580 4 years ago

    Why have you not multiplied the probability of Sports and the probability of Non-sports into the final result?

  • @shashikantchaudhary3022
    @shashikantchaudhary3022 5 years ago +1

    Please upload more videos

  • @jasonschmidt1984
    @jasonschmidt1984 2 years ago

    Why is it 11 (total words in the category) instead of 10 (unique words in the category)? It seems like the probability of a word being in the sentence is all about unique words.
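
    Editor's note: in the multinomial model the video follows, P(word | class) is estimated from word *occurrences* (tokens), so the denominator is the total word count of the class, not the unique-word count. A sketch using the Sports sentences assumed elsewhere in this thread (inferred from the comments, not confirmed by the video):

    ```python
    # Assumed Sports-class training sentences:
    sports = ["A great game", "Very clean match", "A clean but forgettable game"]

    tokens = [w.lower() for s in sports for w in s.split()]
    print(len(tokens))       # total occurrences -> the 11 in the denominator
    print(len(set(tokens)))  # distinct words in the class, a smaller number
    ```

    Counting tokens rather than types is what lets repeated words (like "game" or "clean" here) carry more weight in the likelihood.
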

  • @azharmahmood1323
    @azharmahmood1323 4 years ago

    Does anyone have a link to download the PPT of this video?

  • @meherunnisa9674
    @meherunnisa9674 5 years ago +1

    Good work. Please share a video on Hunt's Algorithm / Decision Trees as well

  • @udupi123456
    @udupi123456 5 years ago +1

    good video

  • @MrParveenkumar007
    @MrParveenkumar007 5 years ago

    really good one

  • @aparnasatya4099
    @aparnasatya4099 6 years ago

    Can you make a video on instance-based learning?

    • @CodeWrestling
      @CodeWrestling  6 years ago

      Yeah, definitely. Stay tuned with #codewrestling

  • @ตฤนจักคงธรรมกุล-ฦ9ฑ

    really great

    • @CodeWrestling
      @CodeWrestling  5 years ago

      Thanks for appreciating!! #CodeWrestling

  • @JonitoFischer
    @JonitoFischer 5 years ago +1

    The word "a" appears 3 times in your training sentences (2 of those as the first word, but that is irrelevant), so P("a") is 3/11

    • @66Horses66Lover66
      @66Horses66Lover66 5 years ago +1

      I think you only count the 'a's in the sports category, because you are looking for the probability of 'a' given the sports category.

    • @JonitoFischer
      @JonitoFischer 5 years ago

      @@66Horses66Lover66 Yes, you're right... There are 11 words in total in the sentences that belong to the sports category... My bad!

  • @ushadevialankar6595
    @ushadevialankar6595 3 years ago

    Thank you bro....

  • @meerasarna6862
    @meerasarna6862 4 years ago

    Make more videos please.....

  • @realeques
    @realeques 6 years ago +5

    pro bubbly tea

  • @pajjukadiyavar7595
    @pajjukadiyavar7595 5 years ago

    Nice

  • @ashwinbhaskar007
    @ashwinbhaskar007 5 years ago

    Thanks bro

  • @ichirag1362
    @ichirag1362 6 years ago

    BEST..

  • @sushilchauhan2586
    @sushilchauhan2586 5 years ago

    Can you please cover kNN like this Naive Bayes video, with a code explanation? I think you are a busy man... just explain it on Jupyter with a PDF; there is no need for a webcam, bhai.. please _||_
    please bhai
    please

  • @sandipankarmakar
    @sandipankarmakar 5 years ago

    It's Bayes' Theorem... Not Baye's Theorem...

  • @yeeteshpulstya9890
    @yeeteshpulstya9890 6 years ago +3

    VTU student? xD