Oxford Linear Algebra: Spectral Theorem Proof

  • Published 17 Nov 2022
  • University of Oxford mathematician Dr Tom Crawford goes through a full proof of the Spectral Theorem. Check out ProPrep with a 30-day free trial to see how it can help you to improve your performance in STEM-based subjects: www.proprep.uk/info/TOM-Crawford
    Test your understanding of the content covered in the video with some practice exercises courtesy of ProPrep. You can download the workbooks and solutions for free here: www.proprep.uk/Academic/Downl...
    You can also find several video lectures from ProPrep explaining the Spectral Theorem here: www.proprep.uk/general-module...
    And further videos explaining the Gram-Schmidt process are here: www.proprep.uk/general-module...
    Finally, fully worked video solutions from ProPrep instructors are here: www.proprep.uk/general-module...
    Watch other videos from the Oxford Linear Algebra series at the links below.
    Solving Systems of Linear Equations using Elementary Row Operations (EROs): • Oxford Linear Algebra:...
    Calculating the inverse of 2x2, 3x3 and 4x4 matrices: • Oxford Linear Algebra:...
    What is the Determinant Function: • Oxford Linear Algebra:...
    The Easiest Method to Calculate Determinants: • Oxford Linear Algebra:...
    Eigenvalues and Eigenvectors Explained: • Oxford Linear Algebra:...
    The video goes through a full proof of the Spectral Theorem, which states that every real symmetric matrix has real eigenvalues and can be diagonalised using an orthonormal basis of its eigenvectors.
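    In symbols, one standard way to state this (my phrasing, not a quote from the video) is:

```latex
% Spectral Theorem (real symmetric case).
% Let A \in \mathbb{R}^{n \times n} with A = A^{T}. Then:
% (I)  every eigenvalue of A is real;
% (II) there is an orthogonal matrix R (i.e. R^{T}R = I), whose columns form an
%      orthonormal basis of eigenvectors of A, such that
R^{-1} A R \;=\; R^{T} A R \;=\; \mathrm{diag}(\lambda_1, \dots, \lambda_n).
```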
    The first part of the proof uses the eigenvalue equation to show that any eigenvalue is in fact equal to its complex conjugate, and thus is real.
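    A sketch of that calculation, written out for an eigenpair Av = λv with v ≠ 0:

```latex
% A is real and symmetric; \bar{v} denotes the component-wise complex conjugate of v.
\bar{\lambda}\,\bar{v}^{T} v
  = \overline{(A v)}^{\,T} v   % conjugate the eigenvalue equation: \overline{Av} = A\bar{v} = \bar{\lambda}\bar{v}
  = \bar{v}^{T} A^{T} v        % A is real, so \overline{A} = A
  = \bar{v}^{T} A v            % A is symmetric, so A^{T} = A
  = \lambda\,\bar{v}^{T} v.
% Since \bar{v}^{T} v = \sum_i |v_i|^2 > 0, it cancels, giving \bar{\lambda} = \lambda,
% so \lambda is real.
```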
    The second part of the proof shows that there exists an orthogonal matrix for which the corresponding similarity transformation produces a diagonal matrix. We first construct an orthonormal basis (whose first vector is an eigenvector) using the Gram-Schmidt process, and then use these vectors as the columns of our orthogonal matrix. Next, we show that the matrix obtained from the similarity transformation is also symmetric, which allows us to conclude that its first row and first column are zero apart from the diagonal entry, as required. The final step is to use induction on the size of the matrix: assuming the result is true for an (n-1) x (n-1) matrix, we use our earlier calculation to construct the final orthogonal matrix and show that, when it is used as a change of basis matrix, the result is diagonal, as we wanted.
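    As a quick numerical sanity check of part (II) — a minimal sketch, not from the video, assuming NumPy is installed:

```python
import numpy as np

# Numerical illustration (not a proof) of the Spectral Theorem: a real
# symmetric matrix has real eigenvalues and is diagonalised by an
# orthogonal change-of-basis matrix.
rng = np.random.default_rng(seed=0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                       # symmetrise, so A == A.T

eigenvalues, R = np.linalg.eigh(A)      # real eigenvalues; columns of R are
                                        # an orthonormal set of eigenvectors

assert np.allclose(R.T @ R, np.eye(4))                   # R is orthogonal
assert np.allclose(R.T @ A @ R, np.diag(eigenvalues))    # R^T A R is diagonal
print(np.round(R.T @ A @ R, 6))
```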
    Produced by Dr Tom Crawford at the University of Oxford. Tom is an Early-Career Teaching and Outreach Fellow at St Edmund Hall: www.seh.ox.ac.uk/people/tom-c...
    For more maths content check out Tom's website tomrocksmaths.com/
    You can also follow Tom on Facebook, Twitter and Instagram @tomrocksmaths.
    Get your Tom Rocks Maths merchandise here: beautifulequations.net/collec...

Comments • 38

  • @TomRocksMaths • 1 year ago • +4

    Check out ProPrep with a 30-day free trial to see how it can help you to improve your performance in STEM-based subjects: www.proprep.uk/info/TOM-Crawford

  • @wescraven2606 • 1 year ago • +30

    I just realized I had Linear Algebra 17 years ago. In 3 more years, it will be the median of my life. I'm starting to feel old.

  • @fordtimelord8673 • 1 year ago • +13

    I subscribed to your channel a couple months ago, but have not watched a single video. This video showed up on my home page.
    This is the best presentation and proof of the spectral theorem I have seen. Beautiful logic and clarity of thought. Thank you.

  • @jonwilson08 • 1 year ago • +8

    In the statement of the Spectral Theorem, note that we really want to say "there exists an _orthonormal_ basis of R^n which consists of eigenvectors of A" (otherwise the equivalent condition you write in terms of orthogonal matrices is quite misleading!)
    Thanks for the fun video though!

  • @ranpancake • 1 year ago • +5

    Love how clear your explanations are, ProPrep seems super worth getting too 😋

  • @nickybutt9733 • 1 year ago • +2

    Fella, I was always told I was thick as muck at Maths in school. Yelled at by my teachers and sent out of the class for not understanding algebra. Would have really loved someone like you to inspire me. Instead I've been terrified of Maths my whole life.

  • @jacksonwilloughby7625 • 1 year ago • +1

    I just went over this before Thanksgiving, thank you for the clarification.

  • @homejonny9326 • 1 year ago • +3

    That theorem blew my mind when I was in college...

  • @xAndr3Bx • 1 year ago • +8

    Thank you, that was really interesting. I've come back to my first-year linear algebra course at university :')

  • @khbye2411 • 1 year ago • +4

    Very clear explanation!
    Would it be possible to have a video explaining the proof for Cochran's theorem? Thank you!

  • @thriving_gamer • 1 year ago • +1

    Thanks 🙏🏻 Thanks a lot for this informative and useful video ❤️

  • @alejandrogarcia-wg2kp • 1 year ago • +3

    Literally just started doing this 5 mins ago. Thank you

  • @ThePiMan0903 • 1 year ago • +1

    Nice video sir Tom!

  • @iamtraditi4075 • 1 year ago • +2

    Gorgeous; thank you :)

  • @nestordavidparedeschoque9895 • 1 year ago • +4

    This is extraordinary

  • @user-pp6zj8ow8j • 1 year ago

    Just one question: how do we prove that A actually has an eigenvalue? Does it come directly from its symmetry?

  • @ChrisOffner • 10 months ago

    Do I understand correctly that _v'_ is the component-wise conjugate, i.e. _v = (a + bi, c + di) => v' = (a - bi, c - di)?_ If so, is the inner product of v with its conjugate v', i.e. _v^T * v',_ really equal to the inner product of v with itself, i.e. _v^T * v,_ as shown at ~10:55?

  • @alovyaachowdhury1687 • 9 months ago • +1

    In case anyone wants to know why the first statement of part (II) is equivalent to saying there is an orthogonal matrix R such that R^{-1}AR is diagonal, the intuition can be found from 3b1b's video about eigenvectors starting from here: th-cam.com/video/PFDu9oVAE-g/w-d-xo.html
    Thanks a lot for this really clear proof Tom - there's loads of examples online but thanks for actually walking us through it :)
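    Written out, the equivalence mentioned above is short (a sketch, not a quote from the video):

```latex
% If v_1, \dots, v_n are orthonormal eigenvectors of A with A v_i = \lambda_i v_i,
% take R = (v_1 | \cdots | v_n) with these vectors as its columns. Then
R^{T} R = I
\qquad\text{and}\qquad
A R = R\,\mathrm{diag}(\lambda_1, \dots, \lambda_n),
% so R^{-1} = R^{T} and
R^{-1} A R = R^{T} A R = \mathrm{diag}(\lambda_1, \dots, \lambda_n).
% Reading the same equations from right to left gives the converse direction.
```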

  • @student5544 • 2 months ago

    Lots of thanks from India, sir 😅

  • @reunguju7501 • 1 year ago • +1

    Thank you!

  • @oraz. • 1 year ago • +2

    He's got good chalk writing

  • @admink8662 • 1 year ago • +1

    Nice

  • @arekkrolak6320 • 1 year ago • +1

    nice, but what kind of symmetry does the matrix have? Symmetry of rotation? Center of symmetry? Axis of symmetry? Any of the above?

    • @MrAlRats • 1 year ago • +2

      A matrix is said to be symmetric if it's equal to its transpose.
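      For instance (a 2x2 illustration):

```latex
A = \begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix} = A^{T}
\quad\text{(symmetric)},
\qquad
B = \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix} \neq B^{T}
\quad\text{(not symmetric)}.
```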

  • @Shaan_Suri • months ago

    What exactly is "v bar"? Is it the 'conjugate' of vector v? I'm confused

  • @motherflerkentannhauser8152 • 1 year ago

    What does II of the thm. say if R was changed to C or some other field? Is the proof any different if it was done on linear maps between arbitrary inner-product spaces instead of Euclidean spaces? What does the thm. say if the dimension was infinite?

    • @amritlohia8240 • 1 year ago • +1

      The theorem also works over C, but you need to change from symmetric matrices to Hermitian matrices (i.e. matrices that equal their *conjugate* transpose). The proof works in the same way for arbitrary inner product spaces. If the dimension is infinite, one essentially gets into functional analysis and there are various spectral theorems - e.g. the same statement as the basic spectral theorem holds for compact self-adjoint operators on a (real or complex) Hilbert space. Generalising beyond that, in order for the statement to remain true, you also have to generalise your notion of eigenvectors, and this rapidly gets rather complicated.
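      A minimal NumPy sketch of the Hermitian case described above (an illustration only, assuming NumPy is installed):

```python
import numpy as np

# The complex analogue: a Hermitian matrix (equal to its conjugate transpose)
# has real eigenvalues and is diagonalised by a unitary matrix.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)        # A is Hermitian

eigenvalues, U = np.linalg.eigh(A)       # eigenvalues come back real; U is unitary

assert np.allclose(U.conj().T @ U, np.eye(2))                 # U* U = I
assert np.allclose(U.conj().T @ A @ U, np.diag(eigenvalues))  # U* A U is diagonal
print(eigenvalues)                       # real numbers
```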

  • @daydreamer05 • 1 year ago • +1

    You are an example of "don't judge a book by its cover."

  • @Watermelon1.0 • 1 year ago

    Hi Tommy 😁

  • @La_Maudite • 1 year ago

    Isn't the proof by induction a bit of overkill here? ;-)
    Just considering e_i instead of e_1 and deducing that A_{i,i} = 1, and A_{i,j} = 0 for i ≠ j does the trick, no?

    • @TomRocksMaths • 1 year ago • +4

      We only know that v1 is an eigenvector. So the other columns don’t necessarily reduce to be diagonal.

    • @La_Maudite • 1 year ago • +2

      @@TomRocksMaths Ha, forgot about that fact. Thanks!

  • @pieTone • 1 year ago • +1

    This is cool and all but never forget that this is the same guy who forgot his circle theorems :)
    Jk man you are awesome!

  • @lucasm.b.4390 • 1 year ago

    You haven’t shown there is at least one real eigenvalue for A.

    • @lucasm.b.4390 • 1 year ago

      It should follow easily from the fundamental theorem of algebra. Great video nonetheless.
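      Spelled out (a sketch of that point):

```latex
\chi_A(\lambda) \;=\; \det(A - \lambda I) \;=\; 0
% is a degree-n polynomial equation, so the fundamental theorem of algebra gives a
% root \lambda \in \mathbb{C}; part (I) of the proof then shows any such \lambda is
% real, so A has at least one real eigenvalue.
```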

  • @krishanuchattopadhyay7006 • 1 year ago

    Uh easy

  • @prajananandaraj5847 • 1 year ago

    He doesn't even look like a mathematician, cuz when I saw him the first time, I thought he was some kind of musician