Matrix Norms : Data Science Basics

  • Published 23 Dec 2024

Comments • 54

  • @florawang7603
    @florawang7603 4 years ago +13

    Thank you, this is the clearest video on matrix norms I've watched so far.

    • @ritvikmath
      @ritvikmath  4 years ago

      Wow, thank you!

    • @dansantner
      @dansantner 3 years ago

      Agree. I have been watching Strang's lectures, and he skips so many conceptual steps that it's sometimes hard to follow. This filled in the gaps.

  • @mosca-tse-tse
    @mosca-tse-tse 3 years ago +13

    We’re saying “OK, matrix, you’re allowed to have...” 🤣🤣🤣🤣 so stressful for her 🤣🤣

  • @hba3415
    @hba3415 3 years ago +1

    I was struggling so hard to find anything on the internet. Thank God I found your video.

  • @jtm8514
    @jtm8514 3 years ago +6

    You make studying fun! Thank you so much, I loved watching this. It wasn't a chore after a bit, I was in bliss from how cool the math was.

  • @angelmcorrea1704
    @angelmcorrea1704 4 years ago +13

    Thanks for all these lectures, very clear.

  • @teodorvijiianu41
    @teodorvijiianu41 1 year ago

    I swear to god, I watched the uni lecture 3 times and had no idea what they were talking about. In less than 10 minutes it now makes sense. Thank you!

  • @sofiyavyshnya6723
    @sofiyavyshnya6723 4 years ago +6

    Amazing video! Super brief and to the point and super clear! Thanks so much for all your help!

  • @jacob_dmn
    @jacob_dmn 3 years ago

    This channel changed my way of thinking. THANK YOU, MAN!

  • @juneshgautam8655
    @juneshgautam8655 1 year ago +1

    4:22 Could you please point me to a proof of that?

  • @haimteicherteicher4227
    @haimteicherteicher4227 3 years ago +1

    Very focused and clear explanation, much appreciated.

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad it was helpful!

  • @tyflehd
    @tyflehd 3 years ago +2

    Thank you for your clear and intuitive descriptions :)

    • @ritvikmath
      @ritvikmath  3 years ago

      You're most welcome!

  • @Posejdonkon
    @Posejdonkon 1 year ago

    Applicable and accent-free study material. Greatly appreciated!

  • @emmanuelamankwaaadjei3051
    @emmanuelamankwaaadjei3051 3 years ago

    Precise yet detailed explanation. Great work.

  • @youlongding
    @youlongding 3 years ago +2

    Could you give a more detailed discussion of the property you mentioned?

  • @kamelismail3730
    @kamelismail3730 3 years ago

    this is an absolute gem!

  • @ЕгорРудица
    @ЕгорРудица 4 years ago +2

    Thank you bro! Huge Respect from Ukraine!

  • @srs.shashank
    @srs.shashank 2 years ago

    This would be applicable only to square matrices, right? How do we calculate 2-norms for rectangular (non-square) matrices? Thanks!
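As a quick check on this question: the 2-norm is defined for rectangular matrices too, since it is still the largest singular value, i.e. the largest possible output length over unit input vectors. A minimal sketch in NumPy (the matrix is my own example, not from the video):

```python
import numpy as np

# The 2-norm extends to non-square matrices: it is the largest
# singular value, the biggest ||A x|| achievable with a unit vector x.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])  # 2x3, rectangular

print(np.linalg.norm(A, 2))                # largest singular value: 2.0
print(np.linalg.svd(A, compute_uv=False))  # all singular values: [2. 1.]
```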

  • @邱越-n6u
    @邱越-n6u 3 years ago

    Thank you! I am just wondering if a non-square matrix has a norm as well. Why do you use the matrix [ui ui ... ui] rather than the n×1 matrix [ui]?

  • @pulkitnijhawan653
    @pulkitnijhawan653 3 years ago +1

    Awesome intuitive explanation :)

    • @ritvikmath
      @ritvikmath  3 years ago

      Glad you liked it!

  • @manhhungnguyen4270
    @manhhungnguyen4270 3 years ago

    Thank you for the explanation. I have a question.
    Does the spectral norm (2-norm) show the "size" of a matrix just like the Frobenius norm does?
    I know that the 2-norm of a matrix is its maximum singular value,
    and that the Frobenius norm shows the "size" of a matrix,
    but I am confused when you use the 2-norm to compare matrices.
    When I want to compare 2 matrices, which one is better to use?
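The two norms contrasted in this question are easy to compare numerically; a minimal sketch (the matrix is my own example):

```python
import numpy as np

# Spectral (2-) norm vs. Frobenius norm for the same matrix.
A = np.array([[3.0, 0.0],
              [0.0, 4.0]])

spectral = np.linalg.norm(A, 2)       # largest singular value: 4.0
frobenius = np.linalg.norm(A, 'fro')  # sqrt(3^2 + 4^2) = 5.0

# Frobenius aggregates every entry, while the 2-norm only reports the
# single most-amplified direction, so the two generally differ.
print(spectral, frobenius)
```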

  • @noway4715
    @noway4715 4 years ago

    Could the definition of the matrix norm come with an example?

  • @KorayUlusan
    @KorayUlusan 2 years ago

    10/10 video!

  • @maciejgarbacz8785
    @maciejgarbacz8785 1 year ago +1

    I think that this video does not give a good, human-friendly explanation. I figured this out from other sources after some time, and I will leave my attempt at an explanation here.
    For vectors, you usually want to know how big they are, based on their length. Applying the Pythagorean theorem gives ||v||2 = sqrt(x^2 + y^2 + z^2 + ...), which is called the 2-norm (or Euclidean norm) and can be used to measure the length of a vector in an nD space. As it turns out, this is not the only way of measuring the length of a vector: there are other norms (and a generalized formula for the p-norm), like the 1-norm (Manhattan norm), which gives the total distance walked in straight lines: ||v||1 = |x| + |y| + |z| + ... . The last important norm is the infinity-norm, which, loosely speaking, tells you how many moves a chess king on a plane would need to reach the tip of the vector: ||v||inf = max(|x|, |y|, |z|, ...). All of these norms have a visualization, which is just the graph you obtain after setting the length of a vector to 1 and rearranging the given formulas.
    1 = sqrt(x^2 + y^2)
    1^2 = x^2 + y^2
    y^2 = 1 - x^2
    y = sqrt(1 - x^2) or y = -sqrt(1 - x^2)
    (Fun fact: that is the equation of a circle, and you can integrate it to get its area and obtain the value of pi, which is how Newton calculated a precise value of pi for the first time in history - watch "The Discovery That Transformed Pi" by Veritasium.)
    As it turns out, it is also useful to measure how big the output of a matrix can get! For example, if the maximum length of an output vector were zero, you would know that the matrix maps every vector to 0, so it is useless in many cases. That is literally a matrix norm. The matrix 2-norm means that the length of each vector is measured by the 2-norm method I explained above. So to get the matrix norm, just plug in every possible vector of length 1 and find the output vector of maximum length.
    I hope I have helped someone in despair. I was just really frustrated by how everyone on the internet just reads out the formulas and hopes the viewer will memorize them without any understanding. If there is something wrong in my comment, don't hesitate to reply!
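The formulas in the comment above can be checked numerically; a minimal sketch in NumPy (the vector and matrix are my own examples), including the "plug in every unit vector" recipe as a sweep around the unit circle:

```python
import numpy as np

# Vector norms, written out from the formulas above.
v = np.array([3.0, -4.0])
one_norm = np.sum(np.abs(v))        # |x| + |y| = 7
two_norm = np.sqrt(np.sum(v ** 2))  # sqrt(x^2 + y^2) = 5
inf_norm = np.max(np.abs(v))        # max(|x|, |y|) = 4
print(one_norm, two_norm, inf_norm)

# Matrix 2-norm as described: sweep (almost) every unit vector, take the
# largest output length, and compare with NumPy's exact spectral norm.
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
angles = np.linspace(0.0, 2.0 * np.pi, 100_000)
unit_vectors = np.stack([np.cos(angles), np.sin(angles)])  # shape (2, N)
estimate = np.max(np.linalg.norm(A @ unit_vectors, axis=0))
exact = np.linalg.norm(A, 2)
print(estimate, exact)  # the sweep estimate approaches the exact norm
```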

  • @bobbysokhi7296
    @bobbysokhi7296 3 months ago

    Great explanation.

    • @ritvikmath
      @ritvikmath  3 months ago

      Glad you think so!

  • @doxo9597
    @doxo9597 3 years ago

    This was great, thank you!

  • @SeidelMatheus
    @SeidelMatheus 3 years ago

    Great lesson!

  • @Pukimaxim
    @Pukimaxim 4 years ago

    What do you mean by decay when talking about negative 9 to the power of a large number?

    • @jako276
      @jako276 4 years ago +1

      I believe he said "point 9", meaning 0.9, and by decay he means that 0.9^2 would be 0.81, 0.9^3 would be 0.729, thus "decaying" to zero as the exponent grows large.
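The decay this reply describes carries over to matrices through the norm; a minimal sketch (the matrix A is my own example), using the standard fact that ||A^n|| <= ||A||^n for submultiplicative norms:

```python
import numpy as np

# Scalar decay from the reply: 0.9^n shrinks toward zero as n grows.
print([round(0.9 ** n, 4) for n in (1, 2, 3, 10)])  # 0.9, 0.81, 0.729, 0.3487

# Matrix version: if ||A|| < 1 in the 2-norm, then ||A^n|| <= ||A||^n,
# so repeated applications of A shrink every input vector toward zero.
A = np.array([[0.5, 0.2],
              [0.1, 0.6]])
print(np.linalg.norm(A, 2))                              # below 1
print(np.linalg.norm(np.linalg.matrix_power(A, 50), 2))  # essentially 0
```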

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago

    Awesome and clear

  • @EmilioGarcia_
    @EmilioGarcia_ 4 years ago

    Hi, I really enjoy your content! Quick question here: what do you mean by "bigger output"? Perhaps that with a matrix A you can span most of the y in R^m with a vector x in R^n? A bit confused here, thanks for your help.

  • @tanvirkaisar7245
    @tanvirkaisar7245 1 year ago

    Could you please give me a detailed proof of ||AB|| ≤ ||A|| ||B||?

  • @sandrasyperek1945
    @sandrasyperek1945 2 years ago

    Great video. I like it. Concrete examples would have been nice. Thank you for making the video. :)

  • @thomasheirbaut6612
    @thomasheirbaut6612 7 months ago

    insanneeeeeee! Legend

  • @sirengineer4780
    @sirengineer4780 2 years ago

    Great! Keep it up, bro.

  • @fatihsarac662
    @fatihsarac662 1 year ago

    Perfect

  • @potreschmotre1118
    @potreschmotre1118 3 years ago

    thank you!

  • @epsilonxyzt
    @epsilonxyzt 4 years ago +3

    Solving an example would be better than so much talk.

  • @박병현-g2e
    @박병현-g2e 3 years ago

    u r really cooool~~~~~

  • @cahitskttaramal3152
    @cahitskttaramal3152 2 years ago

    Couldn't understand most of it, but thanks anyway.

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 4 years ago +1

    The editing has jump cuts and looks very uncomfortable.
    The course itself is superb!

  • @bartlmyy
    @bartlmyy 3 years ago

    Thanks, kisses!

  • @aviator1472
    @aviator1472 5 months ago +1

    Didn't understand anything.

  • @seankeaneylonergan1859
    @seankeaneylonergan1859 3 months ago

    TL;DW: he doesn't show how to calculate the norm of a vector.

  • @broda680
    @broda680 3 years ago

    Did someone ever tell you that you look like kumar from the movie Harold and Kumar ? :D

    • @psychwolf7590
      @psychwolf7590 3 years ago +2

      I was thinking of the same actor!! They are soo similar omg

  • @maksymfigat
    @maksymfigat 2 years ago

    Thanks!