Contravariant & Covariant Components of Vectors - An Introduction to the Metric Tensor

  • Published 12 Nov 2024

Comments • 36

  • @math2cool
    @math2cool 1 year ago +8

    What a gifted instructor. If you truly want to understand GR you must make this series a stop along your way. Thank you Eddie.

  • @LibertyAzad
    @LibertyAzad 1 year ago +6

    This is THE place to start studying GR. Then you can hit the books and other video lectures more profitably.

  • @christianfunintuscany1147
    @christianfunintuscany1147 9 months ago +2

    I finally got the geometric interpretation of covariant and contravariant components … thank you!!!!

  • @jesuscuadrado2291
    @jesuscuadrado2291 9 months ago +4

    The video is incredible, pure gold! The exposition is masterful! It is the best introduction EVER to contravariant and covariant components and the metric tensor (I have used many books and resources). I was reading the "Covariant Physics" (Moataz) book and started to get lost in chapter 1.3, an easy chapter where the contravariant and covariant components are introduced along with the metric tensor in Cartesian coordinates (the metric tensor is never mentioned by name; it is used, as the Kronecker delta, like a kind of magic term to convert between contravariant and covariant components). Only after watching the video did I truly understand the chapter.
    I am excited about these GR videos and have decided to study them carefully, one by one. I have already seen the first one. I am a physics graduate who, after a long time, went back to studying certain topics to fill the many gaps left during my degree. Thanks Professor Edward D Boyes for this precious resource.

  • @astronomy-channel
    @astronomy-channel 1 year ago +6

    A superb series of videos which progress in slow logical steps. Impressive!

  • @jonathanlister5644
    @jonathanlister5644 3 months ago

    Just beyond belief! Wonderfully expressed. I love how you subtly hammer home the points to guide people away from the restrictions of geometry learned from a blackboard. "Be very careful what you put into that mind, for you will never get it out!"

  • @perdehurcu
    @perdehurcu 1 month ago

    Magnificent explanation. Very valuable information. Thank you.

  • @manishankaryadav7307
    @manishankaryadav7307 10 months ago +1

    Hi Eddie,
    Thank you for the videos. Everything (the content, the flow, the math, the questions, the voice, the presentation etc.) is fabulous.
    At timestamp 57:21, the subscript of the third entry on the LHS needs to be 3. 🙂
    Thank you once again,
    Mani

  • @r2k314
    @r2k314 1 year ago +2

    Your series is wonderful. If I had studied it when I first started, I would be much, much further along. Thank you!
    I wish I could figure out how to put this at the top of the intro-to-GR recommendations. You have no peer!

  • @victoriarisko
    @victoriarisko 5 months ago

    Beautiful instruction! Most enjoyable to learn quite sophisticated topics

  • @jaeimp
    @jaeimp 2 months ago

    Excellent series! Thank you! The contravariant and covariant components and the lowering of indices made sense for the first time. Now I am all confused about the many places where there seems to be a need for a dual vector space with a set of dual basis vectors, etc.

  • @jimmyraconteur
    @jimmyraconteur 4 months ago

    fantastic professor

  • @jameskinnally4173
    @jameskinnally4173 1 year ago +1

    This is an excellent series. Thank you very much!

  • @Rauf_Akbaba
    @Rauf_Akbaba 1 year ago +1

    Excellent, thank you

  • @christosgeorgiadis7462
    @christosgeorgiadis7462 11 months ago +1

    This is a great exposition of the subject, thank you!
    Believe me, I have tried a lot of others ...

  • @miguelaphan58
    @miguelaphan58 7 months ago

    A masterful explanation!!!

  • @maaspira
    @maaspira 6 months ago +1

    Thank you very much!

  • @hasnounimohamed4710
    @hasnounimohamed4710 8 months ago

    You make it very easy to understand, thank you.

  • @ImranMoezKhan
    @ImranMoezKhan 7 months ago +2

    Small typographical error, I suspect: in the matrix equation at around 50:46, the right-hand side is the product of a 1x2 and a 2x2, which would produce a 1x2 row vector, but the left-hand side is a 2x1 column vector.
    I suppose the matrix of dot products should come first, followed by the contravariant components as a column vector.
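The shape fix the comment describes can be sketched in plain Python (the 2D basis and components below are illustrative choices, not the video's actual numbers): lowering an index is a 2x2 metric matrix times a 2x1 contravariant column, so both sides of the equation come out as column vectors.

```python
# Illustrative sketch: v_m = g_mn v^n as (2x2 matrix) @ (2x1 column) -> (2x1 column).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# An illustrative non-orthogonal 2D basis (assumed for this sketch).
e1 = (1.0, 0.0)
e2 = (1.0, 1.0)

# Metric tensor g_mn = e_m . e_n (the matrix of basis dot products).
g = [[dot(e1, e1), dot(e1, e2)],
     [dot(e2, e1), dot(e2, e2)]]

# Contravariant components of some vector v = v^1 e1 + v^2 e2.
v_contra = [2.0, 3.0]

# Lowering the index: matrix times column vector gives a column vector,
# matching the column vector of covariant components on the left.
v_co = [sum(g[m][n] * v_contra[n] for n in range(2)) for m in range(2)]

# Cross-check against the direct definition v_m = v . e_m.
v = tuple(v_contra[0] * a + v_contra[1] * b for a, b in zip(e1, e2))
assert v_co == [dot(v, e1), dot(v, e2)]
```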

  • @Mouse-qm8wn
    @Mouse-qm8wn 7 months ago

    Eddie, you made my day😊🎉! What a great video. I am looking forward to seeing the whole series.
    I have a question: do you have a reference to a good GR book that contains problems and solutions for practice?

  • @messapatingy
    @messapatingy 4 months ago

    Remember Co-Low-Pro
    Co: Covariant components
    Low: Use lower indices
    Pro: Represent projections onto coordinate axes
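A minimal sketch of the mnemonic, with an illustrative basis of my own choosing (not from the video): the covariant (lower-index) component v_m is the dot product v . e_m, i.e. the projection of v onto coordinate axis m scaled by the length of e_m.

```python
# Co-Low-Pro sketch: covariant components, lower indices, projections.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Illustrative orthogonal but non-unit basis (assumed for this example).
e1 = (2.0, 0.0)   # length 2 along x
e2 = (0.0, 1.0)   # unit vector along y
v  = (6.0, 4.0)

# Covariant components v_m = v . e_m: the projection of v onto axis m,
# scaled by the length of the basis vector e_m.
v_co = [dot(v, e1), dot(v, e2)]
```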

  • @NEWDAWNrealizingself
    @NEWDAWNrealizingself 28 days ago

    THANKS !

  • @eddieboyes
    @eddieboyes 10 months ago

    Thanks Mani - another one I hadn't spotted! I'll probably put a correction into the video description rather than upload a new corrected version (as that would restart the counts etc. from scratch). Putting a correction into the video description ought to place a correction notice at the relevant time point (according to YouTube) ...... but I can't seem to get that to work at the moment. Thanks again anyway. Eddie

  • @djordjekojicic
    @djordjekojicic 1 year ago +1

    Beautiful explanations and examples. I've seen a lot of GR videos, but this series is one of the best. I have one question, though. Why are covariant vector components, with lower indices, drawn on the same axes when they belong to the dual vector basis, each of whose vectors is perpendicular to the original basis vectors with a different index?

    • @jesuscuadrado2291
      @jesuscuadrado2291 9 months ago

      You are right; it is probably a simplification to avoid introducing dual basis vectors

    • @jesuscuadrado2291
      @jesuscuadrado2291 9 months ago

      In any case, you can also have the covariant components with respect to the contravariant basis: th-cam.com/video/nNMY02udkHw/w-d-xo.htmlsi=l0YONDtcKOnksqLt&t=468
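To make the thread concrete, here is a small sketch under an assumed 2D basis (my own choice, not the video's): the dual basis vectors e^m are defined by e^m . e_n = delta^m_n and can be built from the inverse metric as e^m = g^mn e_n.

```python
# Illustrative dual-basis construction in 2D (basis assumed for this sketch).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

e1 = (1.0, 0.0)
e2 = (1.0, 1.0)

# Metric g_mn = e_m . e_n, and its inverse (2x2, by hand).
g = [[dot(e1, e1), dot(e1, e2)],
     [dot(e2, e1), dot(e2, e2)]]
det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
g_inv = [[ g[1][1] / det, -g[0][1] / det],
         [-g[1][0] / det,  g[0][0] / det]]

# Dual basis vectors e^m = g^mn e_n.
d1 = tuple(g_inv[0][0] * a + g_inv[0][1] * b for a, b in zip(e1, e2))
d2 = tuple(g_inv[1][0] * a + g_inv[1][1] * b for a, b in zip(e1, e2))

# Duality check: e^m . e_n = delta^m_n, so each dual vector is
# perpendicular to the original basis vectors with a different index.
assert dot(d1, e1) == 1.0 and dot(d1, e2) == 0.0
assert dot(d2, e2) == 1.0 and dot(d2, e1) == 0.0
```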

  • @kevincleary627
    @kevincleary627 1 year ago

    Great videos. Is there a playlist so that I can watch them sequentially? Thanks!

  • @r2k314
    @r2k314 1 year ago

    Thank You!

  • @sagsolyukariasagi
    @sagsolyukariasagi 5 months ago

    Should the basis vectors always come from the covariant basis?

  • @SphereofTime
    @SphereofTime 6 months ago +1

    1:00

  • @RBRB-hb4mu
    @RBRB-hb4mu 6 months ago

    Black background with light letters please, it makes it easier to learn

    • @2pizen
      @2pizen 6 months ago

      second that!

  • @Yuri-w8k4j
    @Yuri-w8k4j 11 months ago

    1:06:52 Isn't it a bit early to call g a tensor? We know :) that not every object with indices is a tensor, right?

  • @jameshopkins3541
    @jameshopkins3541 1 year ago

    You have not explained it!!!!! So NO LIKE

  • @Altalex988
    @Altalex988 9 months ago

    Thanks for the series!
    One question: does the proof @32:25 hold only if the basis vectors are unit vectors? Otherwise, is the general formula V_m = g_mn V^n?
    Thanks, bye
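For what it's worth, the lowering formula does not require unit basis vectors; expanding the vector in the basis and dotting with e_m gives it for any basis:

```latex
v_m \;=\; \vec{v}\cdot\vec{e}_m
    \;=\; \bigl(v^{\,n}\,\vec{e}_n\bigr)\cdot\vec{e}_m
    \;=\; v^{\,n}\,(\vec{e}_n\cdot\vec{e}_m)
    \;=\; g_{mn}\,v^{\,n}
```

Only for an orthonormal basis does g_mn reduce to the Kronecker delta, in which case v_m = v^m.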

  • @JoseAntonio-ml8yg
    @JoseAntonio-ml8yg 5 months ago

    Thank you very much!