The Covariant Derivative (and Christoffel Terms)

  • Published 24 Dec 2024

Comments • 24

  • @adeelakhtar3540
    @adeelakhtar3540 10 days ago

    The “English” accent makes this video a bit more enjoyable. Thank you for your time and explanation!

  • @arezaajouneghani3082
    @arezaajouneghani3082 11 months ago +4

    The most comprehensive and lucid lecture on Christoffel that I've encountered on the internet; extremely didactic and profoundly stimulating for delving deeper into this subject. Undoubtedly, a superb teacher!

  • @peterhall6656
    @peterhall6656 several months ago

    Eddie this is really nice work. A timeless pedagogical gem of which you should be very proud. When Einstein was developing his theory of GR he had to teach himself tensor calculus and in the early 20th century there were textbooks by Ricci and Levi-Civita. About 30 years ago I retraced his steps using Levi-Civita's "The Absolute Differential Calculus" and the original GR papers and it was hard work I can tell you. I still remain in awe of how he conceptualized the physics and then translated it into the abstract maths.

  • @jimgolab536
    @jimgolab536 1 year ago +7

    I very much like your approach of starting in 1D and discussing all the pieces, and only then adding one more dimension.

  • @skbhatta1
    @skbhatta1 1 year ago +5

    The best exposition for the covariant derivative that I have seen so far. Looking forward to seeing the rest.. Thanks a lot.

  • @eustacenjeru7225
    @eustacenjeru7225 1 year ago +2

    The lecture has improved my understanding of the covariant derivative.

  • @christianfunintuscany1147
    @christianfunintuscany1147 10 months ago +2

    Thanks again for this precious lecture

  • @christianfunintuscany1147
    @christianfunintuscany1147 10 months ago

    The interpretation of the metric tensor entries was impressive! Thank you!

  • @BLEKADO
    @BLEKADO 1 year ago +1

    MARAVILLOSO, MARVELOUS, MERVEILLEUSE, MIRINDA.

  • @eddieboyes
    @eddieboyes 1 year ago +3

    The original video GR-07 was ‘published’ in January 2023 and by July 2023 had received 969 views and 26 likes. It had also received the comments below. It was then replaced by the current GR-07 (July 2023). Hopefully, some of the minor issues mentioned in the comments below have been addressed in this newer version.
    @lowersaxon
    July 2023
    Brilliant. Didactically the best exposition possible for all beginners, imho. Isn't that Christoffel symbol a gamma and not a lambda?
    @darkangel105100
    June 2023
    Can it also be expressed in matrix form?
    @aashraykannan5027
    June 2023
    Eddie - really appreciated this video; helped me work through some GR roadblocks.
    @aashraykannan5027
    May 2023
    Eddie, I think there is an error in the Einstein summation of the covariant derivative of covariant vectors, the one with the minus symbol, at about 43:48. The upper index of the Christoffel symbol should not be the "r" but the "n"; the "r" should appear in the lower indices. Nevertheless, I am very grateful for your videos; they help a lot!
    @r2k314
    April 2023
    Thank you very much. This really helps a novice get oriented. Also, this is the first time I've seen the idea behind the metric formula for the Christoffel symbols. Again, thank you for your time and efforts.
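For reference, the minus-sign formula debated in @aashraykannan5027's quoted comment takes the following form in the common textbook convention (written with the usual Γ for the Christoffel symbol; the video reportedly uses a different letter):

```latex
% Covariant derivative of a covariant vector (the minus-sign formula):
\nabla_m V_n \;=\; \partial_m V_n \;-\; \Gamma^{r}{}_{nm}\,V_r
% Here r is the dummy (summed) index: it appears up on Gamma and down on V,
% leaving the free indices n and m to match the left-hand side.
```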

  • @duycuongnguyen227
    @duycuongnguyen227 8 months ago +1

    Excellent explanations!!!

  • @channaparanavitane4569
    @channaparanavitane4569 9 days ago

    Thank you for the great series. One question - intuitively, what is the difference between the 'measured derivative' and the 'actual derivative'? All physical quantities require a measuring stick, don't they?

  • @l-erusal
    @l-erusal 1 year ago +1

    Thank you for the brilliant lecture. Just one small correction (probably): I noticed that you called g11 a "scalar", but a scalar is the same in all coordinate systems, and g11 will be different.

  • @thevegg3275
    @thevegg3275 9 months ago

    A fundamental question about the V^r in the red circle (5:57) at minute 6:33.
    Aside from the fact that r is a dummy variable, why not also call it V^n, thus making it super clear that you have V^n on both sides of the equation?
    And sure, it could then stand for both instances, the measured component and the dummy variable. But why would that be more complex, for the simple sake of having fewer variables?

  • @yancymuu4977
    @yancymuu4977 1 year ago +2

    Thanks for the really good videos. I am still struggling with covariant and contravariant vectors. Some questions: if we generate the covariant components of a vector, does that make it a covariant vector? Can the gradient vector be expressed by contravariant components? If I draw out the covariant components of some vector, it would seem like increasing the size of the basis vectors would decrease the size of the vector components, just as with a contravariant vector. How have I got this wrong? Thanks again for the great content.

    • @thevegg3275
      @thevegg3275 1 year ago

      "If I draw out the covariant components of some vector, it would seem like increasing the size of the basis vectors would decrease the size of the vector components just as with a contravariant vector."
      I had the same question.
      Contravariant: if the basis vectors go up, the components go down.
      Covariant: if the basis vectors go up, the components go up.
      It is either because the basis vectors on covariant axes are 1, which would lead to contravariance, or both!
      Someone please correct me if I'm wrong.

    • @gso.astrowe
      @gso.astrowe 1 year ago

      @@thevegg3275 As I understand it, the covariant component is essentially telling you how to modify the length of the actual vector; e.g. given a unit vector, 5 × 1 would essentially be a vector 5 times the length of the unit vector. Thus, this grows or shrinks in the same way that you modify the vector.
      The contravariant component is telling you how the underlying space is modified; think of it as "number of lines pierced". This grows or shrinks opposite to the vector. So, again, given a unit vector modified by a contravariant factor of 5, you are essentially saying that the single vector passes through 5 unit measurements, e.g. lines.
      In both cases, the vector is still the same size; we have just changed how we define the coordinate system. If we change the units of measuring the vector, it is covariant; if we change the units of the field, it is contravariant.
      Someone feel free to correct me if I am wrong.
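The scaling behaviour debated in this thread can be made concrete. A minimal sketch in standard notation, assuming a basis e_i and metric g_ij (not notation from the video itself):

```latex
% A vector V is coordinate-independent; its components are not:
%   V = V^i e_i,          V_i = g_{ij} V^j = V . e_i
% Rescale the basis, e_i -> 2 e_i. For V itself to stay fixed:
%   V^i -> (1/2) V^i   (contravariant: varies opposite to the basis)
%   V_i -> 2 V_i       (covariant: varies along with the basis,
%                       since g_{ij} = e_i . e_j picks up a factor of 4)
V = V^i e_i, \qquad V_i = g_{ij}\,V^j, \qquad
e_i \to 2e_i \;\Rightarrow\; V^i \to \tfrac{1}{2}V^i, \quad V_i \to 2V_i .
```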

  • @palfers1
    @palfers1 11 months ago

    Really good. I was however disappointed that you chickened out of deriving the full expression for the Christoffels.

  • @thevegg3275
    @thevegg3275 1 year ago

    Love this! At minute 6:22, in the second term on the RHS, why isn't V^r named V^n so it matches ∂V^n in the first term? They are the same coordinates; just the first term is the partial derivative and the second term is not. Thanks!

    • @hershyfishman2929
      @hershyfishman2929 10 months ago

      The r's in each factor of that term (upper index for V, lower index for Γ) are dummy indices. They disappear after being summed over, and what is left is n upper, m lower, just as on the LHS.
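The term in question, written out with its implied sum (standard Γ notation; the video may use a different symbol for the Christoffel term):

```latex
% Covariant derivative of a contravariant vector, with the sum made explicit:
\nabla_m V^n \;=\; \frac{\partial V^n}{\partial x^m} \;+\; \Gamma^{n}{}_{rm}\,V^r,
\qquad \Gamma^{n}{}_{rm}\,V^r \;\equiv\; \sum_{r} \Gamma^{n}{}_{rm}\,V^r .
% r is summed away; only the free indices n (upper) and m (lower) remain,
% matching the left-hand side.
```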

    • @thevegg3275
      @thevegg3275 10 months ago

      @@hershyfishman2929 Thank you! Is there any connection between the covariant vector (formed by parallel projection on a graph, versus the contravariant vector formed by perpendicular projection) and a tensor's covariant indices? I know the math of finding the covariant vector but have no clue how it relates to tensors.

    • @hershyfishman2929
      @hershyfishman2929 10 months ago +1

      @@thevegg3275 Yes, a rank 1 tensor is a vector, and the position of its indices (higher vs. lower) indicate whether it is co- or contravariant. The same idea extends to higher rank tensors.
      To be clear, a vector is an object with a magnitude and direction, independent of coordinates, but every vector can be represented by either covariant or contravariant components (different sets of numbers), each with the appropriate type of basis vectors. I believe, though I could be wrong, that this aspect does not extend to tensors of higher rank, which are more complex objects and are not invariant. [Update: in Sean Carroll's textbook on GR, p. 26, he says that "tensors generally have a 'natural' definition independent of the metric".]

    • @thevegg3275
      @thevegg3275 10 months ago

      Ah. So, maybe there is no connection between a rank one vector and a rank one tensor?

    • @hershyfishman2929
      @hershyfishman2929 10 months ago

      @@thevegg3275 no, a vector is a rank 1 tensor