Machine learning - Regularization and regression

  • Published on Nov 17, 2024

Comments • 14

  • @misttborn
    @misttborn 5 years ago +12

    this professor is amazing

  • @JaysonSunshine
    @JaysonSunshine 7 years ago +4

    If you watch the next video in the playlist, it appears only a little content is missing from this video. The next video begins with a review, then quickly finishes the demonstration that using RBFs still results in a linear least-squares problem (pages 11 and 12 of the corresponding class notes).
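The point in the comment above can be sketched in a few lines: expanding the inputs with radial basis function (RBF) features leaves the model linear in the weights, so the fit is still an ordinary least-squares problem. This is a minimal sketch, not the lecture's code; the data, centers, and width below are illustrative choices.

```python
import numpy as np

# Toy 1-D regression data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(50)

# RBF feature expansion: phi_j(x) = exp(-(x - c_j)^2 / (2 * width^2)).
centers = np.linspace(-3.0, 3.0, 10)  # assumed center placement
width = 1.0
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))

# The model Phi @ w is nonlinear in x but linear in w, so we can solve
# min_w ||Phi w - y||^2 with standard linear least squares.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
mse = float(np.mean((pred - y) ** 2))
```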

  • @ranga1288
    @ranga1288 11 years ago +2

    Thanks for uploading it Professor de Freitas !

  • @ClaudiuB
    @ClaudiuB 9 years ago +5

    Hi Nando, where is the continuation of this video? Thank you!

  • @ahmed_mohammed_1
    @ahmed_mohammed_1 2 years ago

    I wish I had discovered your courses a bit earlier

  • @franklyndsouza8983
    @franklyndsouza8983 10 years ago +4

    The video cuts off unexpectedly at the end.

  • @DClover86
    @DClover86 8 years ago +1

    Great lecture!

  • @mrf145
    @mrf145 10 years ago

    Nando, if you have given a lecture on Support Vector Machines, can you kindly share it too?

  • @BruinChang
    @BruinChang 2 years ago +1

    Does anyone have an intuitive idea of why Shannon entropy takes the log of a probability, weighted by the probability itself? Is that related to softmax?

    • @kurienabraham8739
      @kurienabraham8739 2 years ago +1

      Entropy is the expected value of surprise, so less likely events have high surprise. That is why the inverse of an event's probability is taken as a measure of surprise, and for practical reasons we take the log of that inverse. We then compute the expected value of the surprises to get the entropy. An expected value is just a weighted average, where the weights are the probabilities themselves.

    • @BruinChang
      @BruinChang 2 years ago +1

      @@kurienabraham8739 Thank you.

    • @kurienabraham8739
      @kurienabraham8739 2 years ago

      @@BruinChang Never mind. I learnt it from here - th-cam.com/video/YtebGVx-Fxw/w-d-xo.html
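The explanation in the thread above translates directly to code: Shannon entropy is the expected value of the surprise log(1/p(x)), with the probabilities themselves as the weights. A minimal sketch (the distributions are illustrative, not from the lecture):

```python
import math

def entropy(probs):
    """H = sum_x p(x) * log2(1 / p(x)): expected surprise in bits.

    Each outcome's surprise log2(1/p) is weighted by its probability p;
    terms with p = 0 contribute nothing, by convention.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]       # maximal uncertainty: 2 bits
skewed = [0.97, 0.01, 0.01, 0.01]        # nearly certain: low entropy

print(entropy(uniform))  # 2.0
print(entropy(skewed))   # much smaller than 2.0
```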

  • @alexsteed3091
    @alexsteed3091 4 years ago

    The curve at 50:20 - is it always a single line? I feel like the boundary made by two (2D) spheres would be two curves, e.g. it would look like you drew an eye.

  • @sisyphus_619
    @sisyphus_619 6 years ago

    The lecture is missing some content at the end, after "how do we compute ...."