Machine Intelligence - Lecture 7 (Clustering, k-means, SOM)

  • Published 12 Dec 2024

Comments • 32

  • @isabelmateus2547
    @isabelmateus2547 3 years ago +77

    Intro to Clustering 0:27
    K-Means: 11:17
    SOM: 42:00

  • @intoeleven
    @intoeleven 4 years ago +52

    42:00 starts to talk about SOM

    • @jm-px3mr
      @jm-px3mr 4 years ago +1

      Thank you man

    • @nmana9759
      @nmana9759 4 years ago

      Thank you!

  • @iliasp4275
    @iliasp4275 4 years ago +15

    Before coming here, I watched about 5 videos on SOM. No one pointed out that the algorithm is the same as k-means. You enlightened me! Thank you very much.
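
    A quick sketch of that similarity in plain NumPy (toy sizes, not from the lecture): both algorithms find the nearest prototype and then move prototypes toward the data; k-means does it in batch, an online SOM does it one sample at a time (a full SOM would also update the winner's map neighbours).

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 2))   # toy data: 200 points in 2-D
      W = rng.normal(size=(3, 2))     # 3 prototype vectors (centroids / units)

      # k-means style batch step: assign each point to its nearest prototype,
      # then recompute each prototype as the mean of its assigned points.
      dists = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)   # (200, 3)
      assign = dists.argmin(axis=1)
      for k in range(len(W)):
          if np.any(assign == k):
              W[k] = X[assign == k].mean(axis=0)

      # SOM-flavoured online step: find the winning unit for one sample and
      # nudge only that unit toward the sample.
      eta = 0.1                                            # learning rate
      for x in X:
          winner = np.linalg.norm(W - x, axis=1).argmin()  # competition
          W[winner] += eta * (x - W[winner])               # adaptation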

  • @Simzreid
    @Simzreid 4 years ago +12

    Absolutely brilliant. Clear, concise, flowing and enlightening. A great help in understanding the SOFM research I am thinking about doing. 👍

  • @ravivarma5703
    @ravivarma5703 5 years ago +48

    This is the best explanation I have ever found. Please, is there any way I can see more lectures from this professor on any other channels?

  • @MrQqqq2222
    @MrQqqq2222 4 years ago +6

    Wonderful professor. I can follow him even though I am far from the ML field. I'm starting to love Mr. Hamid and also AI methods and techniques. Thanks a lot, my favorite virtual teacher.

  • @subramaniamsrivatsa2719
    @subramaniamsrivatsa2719 4 years ago +5

    My humble regards to the Professor, for simplifying complex concepts so much, explaining the intuition behind the algorithms, and encouraging us to understand 🙏

  • @karannchew2534
    @karannchew2534 1 year ago

    SOM 40:39
    1:14:25
    Given input X, find the i-th unit with the closest weight vector by competition.
    W_i^T X will be maximum.
    Find the most similar unit:
    i(X) = arg min_k ||X - W_k||,
    k = 1, 2, 3, ..., m, where m = no. of units.
    The "max" above refers to the dot product: the winner is the unit whose weight vector is most "aligned" with the input vector (minimising the distance and maximising the dot product pick the same unit when the weight vectors are normalised). If the vectors are orthogonal, the dot product (think cos θ) is zero.

  • @rezamonadi4282
    @rezamonadi4282 4 years ago +5

    I am really proud of you! You explained SOM like an exciting journey...

  • @ab8jeh
    @ab8jeh 5 years ago +5

    I like the way he explains things very clearly. Within machine learning there is a tendency to cloud things to make oneself seem more intelligent - this lecturer shows how simple some of these algorithms (and ML in general) truly are without dumbing things down.

  • @joecrowley630
    @joecrowley630 5 years ago +8

    So good, loved the enthusiasm and A-class white board usage of the lecturer. Thank you so much for sharing.

  • @derollo3
    @derollo3 4 years ago +5

    Excellent lecture on clustering. Thank you very much for sharing your knowledge.

  • @judealaitani1036
    @judealaitani1036 4 years ago +7

    Amaaaaazing teaching skills!

  • @Mark-wb8ck
    @Mark-wb8ck 4 years ago +6

    SOM starts at 40:42

  • @isuruvindula2346
    @isuruvindula2346 4 years ago +2

    Sir, you are a lifesaver sir. Thank you very much.

  • @karannchew2534
    @karannchew2534 1 year ago

    Very neat handwriting for a professor.

  • @burakkara337
    @burakkara337 7 months ago

    The professor says, "Nobody screams when I make mistakes." I went crazy at my monitor, but nobody heard me :D 28:32

  • @Birdsneverfly
    @Birdsneverfly 4 years ago +3

    Let me take a moment to admire your handwriting :) Plus "You are becoming your data", this has to be a dialogue from an AI movie. Cheers :)

  • @mohammadghorbani188
    @mohammadghorbani188 5 years ago

    Very good lecture. Thanks for sharing.

  • @jm-px3mr
    @jm-px3mr 4 years ago

    Thanks for sharing the amazing lecture. I wish I could take his class in person someday.

  • @basics7930
    @basics7930 4 years ago

    great explanation

  • @sara-ja
    @sara-ja 4 years ago

    perfect ...thanks

  • @homataha5626
    @homataha5626 5 years ago +3

    Does anyone know of an implementation of SOM in Python 3?
    All the code I have seen uses targets, but we don't have any here. What should we do?

    • @pechhase
      @pechhase 4 years ago +1

      You could check Peter Wittek's somoclu library.
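
      If you'd rather not pull in a library, an unsupervised SOM fits in a few lines of NumPy. A bare-bones sketch (grid size, epochs and decay schedules below are just placeholders); note there are no targets anywhere, only the data matrix:

        import numpy as np

        def train_som(X, rows=10, cols=10, epochs=20, eta0=0.5, sigma0=3.0, seed=0):
            """Minimal rectangular SOM; X has shape (n_samples, n_features)."""
            rng = np.random.default_rng(seed)
            W = rng.normal(size=(rows, cols, X.shape[1]))       # weight grid
            grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                        indexing="ij"), axis=-1)
            n_steps, t = epochs * len(X), 0
            for _ in range(epochs):
                for x in X[rng.permutation(len(X))]:
                    eta = eta0 * np.exp(-t / n_steps)           # decaying learning rate
                    sigma = sigma0 * np.exp(-t / n_steps)       # shrinking neighbourhood
                    d = np.linalg.norm(W - x, axis=2)           # competition
                    bmu = np.unravel_index(d.argmin(), d.shape) # best-matching unit
                    g = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=2)
                               / (2 * sigma ** 2))              # cooperation
                    W += eta * g[..., None] * (x - W)           # adaptation
                    t += 1
            return W

        # usage: weights = train_som(data) where data is your (n_samples, n_features) array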

  • @venkateswaranvenkatraman9630
    @venkateswaranvenkatraman9630 4 years ago

    Why do people use the Mahalanobis distance, then the Euclidean distance?

  • @yangzuo6973
    @yangzuo6973 4 years ago +2

    Who is the other scientist who has got new ideas?

    • @Reggelord
      @Reggelord 4 years ago +2

      "Norbert Wiener and Albert Einstein"

  • @yeyez3149
    @yeyez3149 4 years ago

    lecture notes?