Contravariant & Covariant Components of Vectors - An Introduction to the Metric Tensor

  • Published 12 Jan 2023
  • In this video (GR - 04), we take the idea of one-dimensional Contravariant and Covariant vectors, and move to thinking about TWO dimensional space, and the vectors in that space having two types of components - again called ‘Contravariant’ and ‘Covariant’. This leads on to a simple introduction to the ‘Metric Tensor’. On the way to this, the Einstein Summation Convention is introduced, which will be used from now on to reduce long equations to a much simpler-looking form.
    This video is part of a series of videos on General Relativity (GR-01 to GR-20), which has been created to help someone who knows a little bit about “Newtonian Gravity” and “Special Relativity” to appreciate both the need for “General Relativity”, and for the way in which the ‘modelling’ of General Relativity helps to satisfy that need - in the physics sense.
    The production of these videos has been very much a ‘one man band’ from start to finish (‘blank paper’ to ‘final videos’), and so there are bound to be a number of errors which have slipped through. It has not been possible, for example, to have them “proof-watched” by a second person. In that sense, I would be glad of any comments with corrections ... though it may be some time before I get around to making any changes.
    By ‘corrections and changes’ I clearly do not mean changes of approach. The approach is fixed - though some mistakes in formulae may have been missed in my reviewing of the final videos, or indeed some ‘approximate explanations’ may have been made which were not given sufficient ‘qualification’. Such changes (in formulae, equations and ‘qualifying statements’) could be made at some later date if they were felt to be necessary.
    “Correction:” 56:51 The column vector on the left-hand side should read (downwards) V₁, V₂, V₃
    This video (and channel) is NOT monetised
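
    As a notational sketch of the conventions the description mentions (my own summary for reference, not a transcript of the video): with basis vectors e₁, e₂, the Einstein summation convention drops the explicit summation sign over a repeated upper/lower index, and the metric tensor links the contravariant and covariant component types:

    ```latex
    % Einstein summation convention: a repeated upper/lower index is summed over.
    \vec{V} = \sum_{n} V^{n}\,\vec{e}_{n} \;\equiv\; V^{n}\vec{e}_{n}

    % Metric tensor components from the dot products of the basis vectors:
    g_{mn} = \vec{e}_{m}\cdot\vec{e}_{n}

    % Lowering an index: covariant components from contravariant components:
    V_{m} = g_{mn}\,V^{n}
    ```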

Comments • 28

  • @victoriarisko
    @victoriarisko 16 hours ago

    Beautiful instruction! Most enjoyable to learn quite sophisticated topics

  • @math2cool
    @math2cool 6 months ago +6

    What a gifted instructor. If you truly want to understand GR you must make this series a stop along your way. Thank you Eddie.

  • @LibertyAzad
    @LibertyAzad 8 months ago +2

    This is THE place to start studying GR. Then you can hit the books and other video lectures more profitably.

  • @astronomy-channel
    @astronomy-channel 7 months ago +6

    A superb series of videos which progress in slow logical steps. Impressive!

  • @maaspira
    @maaspira 16 days ago +1

    Thank you very much!

  • @miguelaphan58
    @miguelaphan58 a month ago

    A Master explanation!!!

  • @christianfunintuscany1147
    @christianfunintuscany1147 3 months ago +1

    I finally got the geometric interpretation of covariant and contravariant components … thank you!!!!

  • @jesuscuadrado2291
    @jesuscuadrado2291 3 months ago +2

    The video is incredible, pure gold! The exposition is masterful! It is the best EVER introduction to contravariant and covariant components and the metric tensor (I have used many books and resources). I was reading the "Covariant Physics" book (Moataz) and I started to get lost in chapter 1.3. It is an easy chapter where the contravariant and covariant components are introduced along with the metric tensor in Cartesian coordinates (the metric tensor is never mentioned by name; it is used, as the Kronecker delta, like a kind of magic term to convert between contravariant and covariant components). Only after watching the video did I truly understand the chapter.
    I am excited about these GR videos, and I have decided to study them carefully one by one. I have already seen the first one. I am a graduate in Physics who, after a long time, went back to studying certain topics to fill many gaps that were left during my studies. Thanks Professor Edward D Boyes for this precious resource.

  • @r2k314
    @r2k314 1 year ago +2

    Your series is wonderful. If I had studied it when I first started, I would be much, much further along. Thank you.
    I wish I could figure out how to put this at the top of the intro-to-GR recommendations. You have no peer!

  • @hasnounimohamed4710
    @hasnounimohamed4710 2 months ago

    You make it very easy to understand, thank you

  • @manishankaryadav7307
    @manishankaryadav7307 4 months ago +1

    Hi Eddie,
    Thank you for the videos. Everything (the content, the flow, the math, the questions, the voice, the presentation etc.) is fabulous.
    At time stamp 57:21, the third entry's subscript on the LHS needs to be 3. 🙂
    Thank you once again,
    Mani

  • @Rauf_Akbaba
    @Rauf_Akbaba 6 months ago +1

    Excellent, thank you

  • @jameskinnally4173
    @jameskinnally4173 1 year ago +1

    This is an excellent series. Thank you very much!

  • @eddieboyes
    @eddieboyes 4 months ago

    Thanks Mani - another one I hadn't spotted! I'll probably put a "correction" into the Video Description rather than upload a new corrected version (as that would re-start the counts etc. from scratch). Putting a "correction" into the Video Description ought to put a correction notice at the relevant time point (according to YouTube) ... but I can't seem to get that to work at the moment. Thanks again anyway. Eddie

  • @Mouse-qm8wn
    @Mouse-qm8wn a month ago

    Eddie, you made my day 😊🎉! What a great video. I am looking forward to seeing the whole series.
    I have a question: do you have a reference to a good GR book which contains problems and solutions for practice?

  • @djordjekojicic
    @djordjekojicic 8 months ago +1

    Beautiful explanations and examples. I've seen a lot of GR videos, but this series is one of the best. I have one question though: why are the covariant vector components (lower indices) presented on the same axes, when they belong to the dual vector basis, each of whose vectors is perpendicular to the original basis vectors with different indices?

    • @jesuscuadrado2291
      @jesuscuadrado2291 3 months ago

      You are right, it is probably a simplification to not introduce dual basis vectors

    • @jesuscuadrado2291
      @jesuscuadrado2291 3 months ago

      In any case, you can also have the covariant components with respect to the contravariant basis: th-cam.com/video/nNMY02udkHw/w-d-xo.htmlsi=l0YONDtcKOnksqLt&t=468

  • @r2k314
    @r2k314 1 year ago

    Thank You!

  • @forheuristiclifeksh7836
    @forheuristiclifeksh7836 6 days ago +1

    1:00

  • @christosgeorgiadis7462
    @christosgeorgiadis7462 5 months ago

    This is a great exposition of the subject, thank you!
    Believe me, I have tried a lot of others ...

  • @ImranMoezKhan
    @ImranMoezKhan a month ago

    Small typographical error, I suspect: for the matrix equation at around 50:46, the right-hand side is the product of a 1×2 row vector and a 2×2 matrix, which would produce a 1×2 row vector, but the left-hand side is a 2×1 column vector.
    I suppose the matrix of dot products should come first, followed by the contravariant components as a column vector.
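
    For reference, the ordering this comment proposes (a sketch in two dimensions, writing the matrix of dot products as gₘₙ = eₘ · eₙ) would read:

    ```latex
    % 2x2 matrix times 2x1 column gives a 2x1 column, matching the LHS:
    \begin{pmatrix} V_{1} \\ V_{2} \end{pmatrix}
    =
    \begin{pmatrix} g_{11} & g_{12} \\ g_{21} & g_{22} \end{pmatrix}
    \begin{pmatrix} V^{1} \\ V^{2} \end{pmatrix}
    ```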

  • @kevincleary627
    @kevincleary627 10 months ago

    Great videos. Is there a playlist so that I can watch them sequentially? Thanks!

  • @RBRB-hb4mu
    @RBRB-hb4mu 24 days ago

    A black background with light letters, please - it makes it easier to learn

    • @2pizen
      @2pizen 13 days ago

      Second that!

  • @user-gx8xs4ib4u
    @user-gx8xs4ib4u 5 months ago

    1:06:52 Isn't it a bit early to name g a tensor? We know :) that not every object with indices is a tensor, right?

  • @jameshopkins3541
    @jameshopkins3541 8 months ago

    You have not explained it!!!!! So NO LIKE

  • @Altalex988
    @Altalex988 3 months ago

    Thanks for the series!
    One question: does the proof @32:25 hold only if the basis vectors are unit vectors, with the general formula otherwise being V_m = g_mn V^n?
    Thanks, bye