
Introduction to Machine Learning - 06 - Linear discriminant analysis

  • Published Aug 2, 2024
  • Lecture 6 in the Introduction to Machine Learning (aka Machine Learning I) course by Dmitry Kobak, Winter Term 2020/21 at the University of Tübingen.

Comments • 20

  • @saketdeshmukh6881 · 2 years ago · +11

    I wish I had found this before my masters. Intuitive, with the right amount of mathematical rigor.

  • @YuchengLin · 2 years ago · +3

    So wonderfully presented! Whenever I started to feel there was too much math, some cute drawings appeared to give me simple and visceral intuition.

  • @micahdelaurentis6551 · 3 years ago · +2

    These have been excellent videos so far

  • @TheCrmagic · 2 years ago · +3

    Sir, you are a great teacher.

  • @AD-ox4ng · 11 months ago · +1

    This is my guess for the number of parameters (in the covariance matrix alone) at 38:16:
    Full - p^2 (there are p*p distinct elements)
    Diagonal - p (there are only p distinct elements along the diagonal; all else is 0)
    Spherical - 1 (same as diagonal but with equal variance in all dimensions, so only one number to compute)
    If the model is separate, multiply the numbers above by 2; otherwise by 1.
    Add 2p to account for the mean vectors as well. (There are p distinct means to calculate for each of the two classes.)
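    A quick sketch of this counting (the helper below is my own, not from the lecture; note that a full covariance matrix is symmetric, so it actually has p(p+1)/2 free parameters rather than p^2):

    ```python
    # Hypothetical helper: count free parameters of Gaussian class-conditional
    # models (as discussed around 38:16) for two classes in p dimensions.
    def n_params(p, cov="full", shared=True):
        # Free parameters in one covariance matrix:
        #   full: symmetric, so p*(p+1)/2 distinct entries (not p^2)
        #   diagonal: p variances; spherical: a single variance
        per_cov = {"full": p * (p + 1) // 2, "diagonal": p, "spherical": 1}[cov]
        # Shared covariance (LDA-style) vs one per class (QDA-style)
        n_cov = per_cov if shared else 2 * per_cov
        # Plus a p-dimensional mean vector for each of the two classes
        return n_cov + 2 * p

    print(n_params(p=3, cov="full", shared=False))  # 2*6 + 6 = 18
    ```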

  • @IamMoreno · 2 years ago · +1

    Simply beautifully explained; sir, you have all my gratitude.

  • @woodworkingaspirations1720 · 1 year ago · +1

    This solved my problem. Thank you sir. Needed a summarized view of the math. Perfect.

  • @jiajieli5138 · 2 years ago

    Highly recommended Machine Learning Instruction!

  • @xiaochelsey880 · 1 year ago

    Great video. Thank you so much for showing all the math!

  • @vincentole · 3 years ago

    Great videos! Thank you for this.

  • @calcifer7776 · 2 years ago

    This is gold, thank you.

  • @CootiePruitt · 2 years ago

    👍 Great video - thank you!

  • @severian6879 · 1 year ago

    Excellent explanation! Thank you very much!

  • @nauraizsubhan01 · 3 years ago

    Sir, can you please tell me:
    does this program offer any courses related to robotics and autonomous systems?

  • @hfz.arslan · 3 years ago

    Sir, can you please share the slides or notes? Thanks.

  • @sunshinebabe6203 · 3 years ago

    Thank you! :)

  • @Jeremy-zs3nn · 3 years ago

    Thanks for posting - a very helpful video. I did get a bit confused by some of the notation. Looking at the slide titled "Estimating Gaussian parameters" (25:49): the covariance matrix we're estimating is indexed over C_k, which is the subset of the design matrix for which Y=k? Are X and mu_k both matrices, or is mu_k a vector?

    • 3 years ago · +2

      Thanks. Let me see... x_i is a vector (sample number i). mu_k is a vector (the average over all samples belonging to class k, i.e. with Y=k). Sigma_k is a matrix (the covariance matrix over all samples belonging to class k). I usually use lowercase bold for vectors and uppercase bold for matrices.

    • @Jeremy-zs3nn · 3 years ago

      @ great, thank you for the quick reply!
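    The notation in that reply can be illustrated with a small NumPy sketch (toy data and variable names are mine, not from the slides): x_i is a row of X, mu_k is a p-vector, and Sigma_k is a p x p matrix, both estimated from the samples with Y=k.

    ```python
    import numpy as np

    # Toy data: n samples, p features, binary labels (illustrative only)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))        # each row x_i is a p-vector
    y = rng.integers(0, 2, size=100)

    k = 1
    X_k = X[y == k]                      # samples belonging to class k (the set C_k)
    mu_k = X_k.mean(axis=0)              # class mean: a p-vector
    Sigma_k = np.cov(X_k, rowvar=False)  # class covariance: a symmetric p x p matrix

    print(mu_k.shape, Sigma_k.shape)     # (3,) (3, 3)
    ```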

  • @indigod3323 · 3 years ago · +1

    Very great teacher, I wish I could study in Tübingen.