What is a Tensor 6: Tensor Product Spaces

  • Published 16 Jan 2025

Comments • 102

  • @TheNutellamaster · 4 years ago +7

    Even though it has been a while since you uploaded this video, I want to express my sincere gratitude to you for having taken the time to prepare this series. I've only watched two now, not even starting from the beginning, but this has already cleared up so many questions I have carried around since the very beginning of my studies. Thank you so much!

  • @johnbest7135 · 7 years ago +12

    Terrific series so far. For the first time I am getting some clarity on this subject. Most grateful to you.

  • @alexkatko431 · 8 months ago

    8 years later and still an amazing set of videos

  • @PunmasterSTP · 2 years ago +1

    TPS? Maybe that stands for "The Perfect Showcase"...of information! Thanks so much for making this series.

  • @jackozeehakkjuz · 8 years ago +4

    Tensors look so natural now. Thank you, again. I'll finish this series for sure. :)

  • @ozzyfromspace · 4 years ago +2

    I’ve been trying to understand these lectures for quite some time now. I always come back to them. I’m happy to say I finally get it 😊🙌🏽🎊🥳! I needed a few supplementary videos that introduced geometry with examples, so now I get it. Thank you, your videos are a godsend!

    • @XylyXylyX · 4 years ago +3

      I am glad to hear that you persisted. To find a path to understanding requires persistence.

  • @rktiwa · 4 years ago +1

    Even though you have already maddened me with your manifolds, Christoffel symbols, curvature and all that, given your teaching prowess I wish you would make a series on QFT, because you are the one who presumes the bare minimum on the part of your viewers while explaining such complex theory. I want to be maddened further.

  • @billguastalla1392 · 4 years ago +1

    Thanks for these videos Xyly - I love your operator overloading comment (it could be a template parameter!) and having moved towards algebra after spending time working with C++/Haskell I think this attitude is underrated. Thinking about the underlying "type system" of symbols brings new ways to engage with maths!

  • @georgeorourke7156 · 8 years ago +4

    Great lecture - I have one suggestion on the expansion of the tensor product: at 20:00 to 21:30 you quickly go over how a tensor product acts on the underlying Cartesian product spaces. It was done quickly and the T (subscript mu nu) were not included in the demonstration. Although I know that it is straightforward, if you could go over the examples in greater detail we would have the certitude that we're tracking 100%.

    • @XylyXylyX · 8 years ago +1

      Ok, I'm glad you found the answer. Let me know if there is a gap, I'll try to fill it. (BTW I accidentally removed your last comment, sorry)
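      For readers tracking the same point, a sketch of the expansion discussed around 20:00–21:30, assuming the (1,1) example from the lecture, with T = T^\mu{}_\nu \, e_\mu \otimes e^\nu, an input covector \beta = \beta_\alpha e^\alpha, and an input vector v = v^\gamma e_\gamma:

          T(\beta, v) = T^\mu{}_\nu \, [e_\mu \otimes e^\nu](\beta, v) = T^\mu{}_\nu \, e_\mu(\beta) \, e^\nu(v) = T^\mu{}_\nu \, \beta_\mu \, v^\nu

      Each factor e_\mu(\beta) = \beta_\mu and e^\nu(v) = v^\nu is a real number, so in four dimensions the last expression is an Einstein sum of 16 real terms.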

  • @enricolucarelli816 · 7 years ago +9

    If I understood things right, at 15:00 there is a mistake: the tensor R should map the Cartesian product of covectors to R, and the tensor T should map the Cartesian product of vectors to R, and not the other way around as stated there. Am I right?
    Thank you so much for these excellent lectures.

    • @XylyXylyX · 7 years ago +5

      Enrico Lucarelli Yes, you are correct. You are probably viewing this on a mobile device? My corrections/annotations do not appear on mobile devices. This error has been noticed before. Look in the comments for more information. But regardless, you obviously understand.

  • @jason_liam · 4 years ago +1

    @21:09 Shouldn't this be the opposite? For example it should be instead of . I am talking about the first term in the expansion. Though they are commutative, shouldn't it still be the opposite?

  • @aky68956 · 4 years ago

    I didn't understand the order of vectors and maps within the brackets at 21:25. Shouldn't the maps be e0 and e1?

  • @Mariusz-rj8ye · 8 years ago +1

    Thanks for the great lecture. I want to ask if at 14:58 the subscripts and superscripts in T and R shouldn't be the opposite (should it be: T with superscript alpha... e with subscript alpha...)?

    • @XylyXylyX · 8 years ago +1

      Mariusz2718 Yes! You are correct! The tensor that maps V x V x V ... to R must have components with *subscripts* and covectors as the basis elements! Nice catch! I'll make an annotation later today!
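      For anyone else caught at the same spot, a sketch of the corrected index placement around 14:58, assuming the lecture's conventions:

          T = T_{\alpha\beta} \, e^\alpha \otimes e^\beta : V \times V \to \mathbb{R}
          R = R^{\alpha\beta} \, e_\alpha \otimes e_\beta : V^* \times V^* \to \mathbb{R}

      A map that eats vectors needs covector basis elements e^\alpha and subscripted components; a map that eats covectors needs vector basis elements e_\alpha and superscripted components.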

  • @joelhaggis5054 · 7 years ago

    At 8:48 you can hear he's like "Why did I use so many tensor products? This was a mistake."

  • @apm77 · 7 years ago +1

    I don't understand why, at 17:40, you invert the order of the vector spaces when showing the correspondence between the tensor products and the Cartesian products. That is, I don't understand why you say that V*⨂V corresponds to V⨯V* instead of, as I would expect, to V*⨯V (will YouTube let me use Unicode symbols in comments? I do hope so). I do get that it probably doesn't matter a great deal, since in this context the Cartesian products are only a formalism that allows you to define a domain, but why complicate things by inverting the order? You do the same thing at 23:00 so it's not an anomaly.
    I don't have any kind of background in this stuff. I did a couple of years of undergraduate mathematics last millennium but was never any good at it. I started watching this series essentially because YouTube recommended it, and I was curious.

    • @XylyXylyX · 7 years ago

      Adrian Morgan The ordering with the tensor product symbol *is the map* and the ordering with the Cartesian product (the plain "x" without a circle around it) is *what is being mapped*. The map named "V* @ V" takes an ordered pair from V x V* to the real numbers.

    • @captainunicode · 7 years ago

      Thanks, I was confused there for a moment, too.
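      A sketch of the ordering convention from the reply above: the tensor product names the map, the Cartesian product names its domain. For \beta \in V^* and v \in V,

          (\beta \otimes v)(w, \gamma) = \beta(w) \, \gamma(v), \qquad (w, \gamma) \in V \times V^*

      so the map named V* ⊗ V acts on ordered pairs from V x V*: each factor eats the kind of argument it is dual to, which is why the order looks inverted.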

  • @viktorzelezny1518 · 4 years ago +2

    Thank you (again) for your amazing lectures! I am currently writing my master's thesis about Hilbert space factorization, and your videos about tensors are a huge help :)
    I just have one question: how do you rigorously formulate the addition operation for tensor products? In a vector space we can always write a+b=c, with a, b, c vectors. But for tensors this doesn't work; e.g. the Bell state |1>⊗|0> + |0>⊗|1> cannot be expressed by a single tensor |a>⊗|b>. Can this be resolved by simply renaming the tensors |1>⊗|0> == A, |0>⊗|1> == B, |1>⊗|0> + |0>⊗|1> == C and so on, or is there a more fundamental solution that I am missing?
    Best regards and again thank you
    Viktor

    • @XylyXylyX · 4 years ago +1

      You are missing something. The Bell state that you wrote is the sum of two *basis* vectors. If you cast any vector space (including any tensor product space) in a basis, you are stuck with those basis vectors. The basis in your case is |0>|0> and |0>|1> and |1>|0> and |1>|1>, so EVERY member of the tensor product space must be cast in this basis. Your “a + b = c” is written in *coordinate-free* form, so it appears to you that they combine, but in a basis it would be a = a_1 e_1 + a_2 e_2 and b = b_1 e_1 + b_2 e_2, so “c” would be (a_1 + b_1) e_1 + (a_2 + b_2) e_2. So you are comparing two different formalisms!

    • @viktorzelezny1518 · 4 years ago +1

      @@XylyXylyX Yes! Thank you for your answer :) Not distinguishing those two cases was the reason for many of my misconceptions about tensor spaces.

    • @PunmasterSTP · 2 years ago

      Hey I know it's been a few years but I just came across your comment and was curious. How did your thesis turn out?
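      For readers following the Bell-state exchange above, a minimal NumPy sketch of the simple-tensor point (the reshape-and-rank test is one standard way to check whether a state is a simple tensor; the numbers are illustrative):

          import numpy as np

          # Basis kets |0> and |1> as component vectors.
          ket0 = np.array([1.0, 0.0])
          ket1 = np.array([0.0, 1.0])

          # A simple (pure) tensor |0>|1>, and the Bell-type sum |1>|0> + |0>|1>.
          pure = np.kron(ket0, ket1)
          bell = np.kron(ket1, ket0) + np.kron(ket0, ket1)

          # Reshape a vector in V (x) V into a 2x2 matrix of components:
          # the state is a simple tensor a (x) b exactly when that matrix has rank 1.
          print(np.linalg.matrix_rank(pure.reshape(2, 2)))  # 1 -> simple tensor
          print(np.linalg.matrix_rank(bell.reshape(2, 2)))  # 2 -> a genuine sum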

  • @drlangattx3dotnet · 7 years ago +1

    Do the basis vectors of the vector space have superscripts or subscripts? Do the covectors have super or sub?

    • @drlangattx3dotnet · 7 years ago

      I went back to lecture 2 and checked and answered my own question.

  • @jasonbroadway8027 · 2 years ago

    At 20:32, I would have used subscripts and superscripts other than mu and nu when expanding the input covector beta and the input vector v. You could then use the Kronecker delta to reel things back in.

  • @frankbennett2429 · 7 years ago

    No, I don't get your "oops" comment at 15:00. Your original indices, upper and lower, seem consistent with what you have used earlier in this lecture. For example, when you have a Cartesian product of dual spaces, don't you need basis vectors with upper indices to represent covectors? I am enjoying these lectures.

    • @frankbennett2429 · 7 years ago

      I think I get it now. If you have a Cartesian product of dual spaces, the corresponding map's indices have to come from the underlying vector space, using the fact that a vector in the underlying space is also a map.

  • @jasonbroadway8027 · 4 years ago

    At 25:14, you mentioned the fact that basis elements are not carried with the tensor symbols. I opine that this omission hinders a student of relativity, and I feel that shorthand is the bane of physics. I do not mind the Einstein summation notation, but I feel that Einstein did physics and mathematics a disservice when he omitted the basis elements from the tensor components. Nevertheless, I have enjoyed your excellent videos on the subject of tensors.

    • @XylyXylyX · 4 years ago

      Once you understand them, it is best to drop them. The problem is that until you have kept them around for a while, it is hard to understand what is going on.

    • @jasonbroadway8027 · 2 years ago

      I get this now.

  • @apm77 · 6 months ago

    Suppose an expression like [T⊗U](V,W) was written in a notation that does not distinguish between A⊗B and (A,B), that is, in which the tensor product of A and B is indistinguishable from the ordered pair of vectors A and B, so for example [T⊗U](V,W) looks identical to [T⊗U][V⊗W]. Would any information actually be lost? In one sense, clearly yes, but that's because the null operator is overloaded: depending on the arguments we interpret the absence of an operator to mean either "acting upon" or "multiplied by", and we need to know which it is. But if that ambiguity were removed, would information still be lost? Because it seems like the distinction between an ordered pair of vectors and a rank 2 tensor is the same as the distinction between a stick and a lever, it's entirely a question of how it's being applied.

  • @juanmanuelpedrosa53 · 4 years ago

    At 23:43, I'm having difficulties understanding what kind of "shape" the tensor component would have.
    I know that T represents scalars, but how do you write (or code) a real-case scenario: another vector? An n-dimensional matrix? Multiple vectors corresponding to each basis?
    Could you point me in the right direction to find an example with real values for T?

    • @XylyXylyX · 4 years ago

      You can't really do what you want. You just have to calculate each component of a tensor individually and organize the components into a data structure that can be addressed for calculation.

    • @juanmanuelpedrosa53 · 4 years ago

      @@XylyXylyX That's what I'm asking: what kind of structure would that be? Would you recommend any book/website with real examples? Don't get me wrong, I really appreciate your lectures, but it gets to a point where I need to see real calculation happening. I get the algebra but not the real process.

    • @XylyXylyX · 4 years ago

      @@juanmanuelpedrosa53 Hmm...try the last few videos of "What is General Relativity?" where I solve the Einstein Equation for a black hole. In that lecture we actually calculate the components of the curvature tensor and related tensors.

    • @juanmanuelpedrosa53 · 4 years ago

      @@XylyXylyX ok, I'll complete this series and then start with general relativity. Thank you! :-)
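      For anyone with the same question about the "shape" of the components, a minimal NumPy sketch (the 4x4 array and its entries are hypothetical; np.einsum just spells out the Einstein sum over the stored components):

          import numpy as np

          # A (1,1)-tensor on a 4-dimensional space: its 16 components T^mu_nu
          # are simply stored in a 4x4 array, addressed by the index pair (mu, nu).
          T = np.zeros((4, 4))
          T[0, 1] = 2.5  # e.g. a single non-zero component T^0_1

          beta = np.array([1.0, 0.0, 0.0, 0.0])  # covector components beta_mu
          v = np.array([0.0, 3.0, 0.0, 0.0])     # vector components v^nu

          # T(beta, v) = T^mu_nu beta_mu v^nu, summed over mu and nu.
          print(np.einsum('mn,m,n->', T, beta, v))  # 2.5 * 1.0 * 3.0 = 7.5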

  • @deleogun1717 · 8 years ago

    Are the input vectors/covectors for the tensor from the same vector space, or are they part of a vector field?

    • @XylyXylyX · 8 years ago

      dele bodede They are from the same vector space and associated covector space.

  • @robertbrandywine · 2 years ago

    In the previous video you said "this" is a real number, pointing at the board. Did you mean T_xy as a whole was a real number or did you mean x and y were real numbers?

    • @XylyXylyX · 2 years ago +1

      I'm not sure exactly where in the previous video you mean, but from the text of the question I almost certainly was referring to "T_xy" as a whole. That is the component of something, and the components are scalars of the underlying vector space which was probably a real vector space. "x" and "y" should just refer to some orthogonal axes....

    • @robertbrandywine · 2 years ago +1

      @@XylyXylyX Sorry, it was at 21:49. You were moving your pointer around under the indices as you said "these are just still a bunch of real numbers". Trying to follow, that left me wondering what T itself resolved to.

    • @XylyXylyX · 2 years ago +1

      @@robertbrandywine Ok….yes T_mu_nu is a real component and mu and nu are just the integer index values.

    • @robertbrandywine · 2 years ago

      @@XylyXylyX 👍

  • @mohitvarshney1101 · 7 years ago

    Hi XylyXylyX, thanks for the great explanation.
    I just had one query: at around 21:45 you said that "this (real number) whole thing has to be expanded against that (coefficient of the tensor, T^u_v) index". However, am I correct to say that the real number doesn't need to be expanded against T^u_v, as there are not 16 coefficients but just 2, since u and v are known to be 0 and 1 (as e_0 tensor product e^1 was used as the map in the example)? (The Einstein sum should only take place twice, not 16 times.)

    • @XylyXylyX · 7 years ago +1

      If T^\mu_\nu had only one non-zero component T^0_1, then the Einstein sum would only have one term, as you say.

  • @zhangjin7179 · 6 years ago

    Can you define a Cartesian product between a 3-dimensional vector space and a 4-dimensional vector space?

    • @XylyXylyX · 6 years ago

      zhang jim I do not understand your question.

    • @zhangjin7179 · 6 years ago

      XylyXylyX Can you define the Cartesian product of a 3D vector space and a 4D vector space? If yes,
      can you define tensors that map such a Cartesian product into a real number?

    • @XylyXylyX · 6 years ago

      zhang jim Yes. A Cartesian product will work for ANY two sets.
      Yes, it is easy to define multilinear maps involving different vector spaces. However we don’t call those maps “tensors” but they definitely are possible.

    • @zhangjin7179 · 6 years ago

      XylyXylyX So a tensor is only for vectors/covectors of the same dimension? Is that part of the definition of a tensor, in addition to the rules you pointed out in the video... such that V*xVxV is *not* a tensor but VxVxV*xV* is...

    • @XylyXylyX · 6 years ago

      zhang jim Yes, that is correct. “Tensors” all involve V and V* only. Also, a “tensor” must have a rank given by (p,q), which means that all of the vectors come before all of the covectors. However, this is a rather technical matter which is not particularly enforced. That is, we often call V*@V a “tensor” anyway.
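      A sketch of the rank-(p,q) convention described in the reply above, in the lecture's notation:

          T = T^{\mu_1 \cdots \mu_p}{}_{\nu_1 \cdots \nu_q} \; e_{\mu_1} \otimes \cdots \otimes e_{\mu_p} \otimes e^{\nu_1} \otimes \cdots \otimes e^{\nu_q}

      so all p vector factors come before all q covector factors, and T maps p covectors and q vectors (an element of V* x ... x V* x V x ... x V) to \mathbb{R}.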

  • @drlangattx3dotnet · 7 years ago

    I am still confused when I read your inserted correction. Can you help me with an answer? Thanks.

    • @XylyXylyX · 7 years ago

      It is a mess....the left side identifies the mappings. The right side identifies the thing that is the actual map. To take vectors to real numbers requires covectors which have superscripts. I wrote subscripts " e_\alpha" for example. To take covectors to real numbers you need vectors which have subscripts. I wrote superscripts "e^\beta" for example.
      The components, of course, have the *opposite* problem!

  • @garytzehaylau9432 · 5 years ago

    In these several videos you always say there is no correspondence between the V* and V spaces. What do you actually mean? I don't actually get it. A covector is a mapping V* : V --> R, so isn't that a correspondence?
    Thank you

  • @h2oplusplus · 8 years ago

    Thank you so much for the great lecture on this topic. Actually I want to learn more about tensors to understand general relativity. In your lecture you construct the tensor space from two identical dual spaces V*; would it be possible to take any pair of vector spaces for the tensor product, like V ⊗ W?

    • @XylyXylyX · 8 years ago +1

      sun fai That is allowed, yes. An object like V*@ W, say, would take a vector from V and return a vector from W .... or.... take a vector from V and a covector from W* and return a real number. No problem!

    • @h2oplusplus · 8 years ago

      XylyXylyX, in that case we could freely construct a tensor product space from any arbitrary vector spaces; then it is very likely there is no physical meaning attached to it, only a pure mathematical object? Right?

    • @XylyXylyX · 8 years ago

      sun fai All mathematical objects, at best, only serve to *model* physics. It turns out that the physics of space and time are best modeled by tensors. So even though we can make many many different types of tensors using many different vector spaces, we are always faced with the problem of figuring out which ones are best to model nature. For example, any (0,2) tensor can be used as a metric for a manifold, but nature only uses symmetric metrics which happen to solve the Einstein equation, as far as we know. There are limitless numbers of differential equations which we can imagine, but only one of them models spacetime. Do we say that "differential equations have no physical meaning?" Well....many of them *don't* and a few of them do. An even harder question is whether or not *all* physics can be modeled well using mathematical objects. I have personally wondered if there is some limit to this, but I don't think about it much :).

    • @h2oplusplus · 8 years ago +1

      XylyXylyX thanks for your reply, I agree with your comment about modelling physics. Personally, sometimes it is easier to understand an abstract mathematical idea if it has some associated physical meaning. Perhaps, like divergent series, no obvious physical meaning attached, but still very attractive! Or I should say, this is the intrinsic beauty of maths. Thanks anyway, can't wait to finish your remaining lectures! :)

  • @zhangjin7179 · 6 years ago

    A Cartesian product of vectors doesn't form a vector space, right?

    • @XylyXylyX · 6 years ago

      zhang jim A Cartesian product of two vector spaces is NOT a vector space because the Cartesian product does not automatically have a vector sum and scalar product defined. The Direct Sum is very similar to the Cartesian product, but it includes a vector addition rule and a scalar multiplication rule.
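      A sketch of the extra structure the reply refers to: the direct sum V ⊕ W is built on the same set of ordered pairs as V x W, but with the componentwise rules

          (u_1, w_1) + (u_2, w_2) = (u_1 + u_2, \; w_1 + w_2), \qquad c \, (u, w) = (c\,u, \; c\,w)

      and it is exactly these two rules that the bare Cartesian product lacks.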

  • @toxicore1190 · 8 years ago

    Are tensors matrices then? (Just wondered.)

    • @XylyXylyX · 8 years ago +1

      Toxi Core There is a formalism regarding tensors that treats all vectors as column matrices and all dual vectors as row matrices and rank (1,1) tensors as a matrix. However this approach is very limited. So, no, tensors are not matrices. Tensors are mappings that live in a vector space called the "tensor product space" as demonstrated in these lessons.

    • @toxicore1190 · 8 years ago

      Ok, that's what I thought. Thank you for answering!
      Nice videos :)

    • @XylyXylyX · 8 years ago

      Toxi Core I should point out that much of the formalism of tensors can be developed using matrices, but you must be willing to consider a "matrix" to sometimes mean, for example, a column vector where every entry is itself a row vector. That is how a (0,2) rank tensor might be represented. It is a bit of a mess, but it can be done.

    • @paras3681 · 4 years ago

      @@XylyXylyX I started studying general relativity from the book 'A Most Incomprehensible Thing'; the way it starts teaching tensors is using this formalism of tensors considered as matrices. Am I learning the right way?
      P.S. Thanks for this amazing series. You are a hero!

    • @XylyXylyX · 4 years ago +1

      Paras I don’t have the book, but I certainly don’t like tensors as matrices. However, it certainly can be done and my preferences should not drive you away. GR is not “incomprehensible” in my opinion so I don’t even like the title. There is NO harm in learning from any book. You learn by *comparing* different approaches. At least, that worked for me. Good luck!
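      A minimal NumPy sketch of the limited matrix formalism discussed in this thread (the components and vectors are made-up numbers): a (0,2) tensor g stored as a plain matrix of components, fed two vectors either as matrix algebra or as an explicit Einstein sum:

          import numpy as np

          # Components g_{mu nu} of a (0,2)-tensor on R^2, stored as a 2x2 matrix.
          g = np.array([[1.0, 0.0],
                        [0.0, -1.0]])

          v = np.array([2.0, 1.0])  # components v^mu
          w = np.array([1.0, 3.0])  # components w^nu

          # g(v, w) = g_{mu nu} v^mu w^nu: in matrix language this is v^T g w.
          print(v @ g @ w)                       # -1.0
          print(np.einsum('mn,m,n->', g, v, w))  # same number, index notation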

  • @surajdevadasshankar5270 · 4 years ago

    I've got a question: why is it that only two covectors make up a tensor with the tensor product? Why can't a single covector from the dual space work as a tensor?

    • @XylyXylyX · 4 years ago

      A single covector is a rank (0,1) tensor. A single vector is a rank (1,0) tensor. Two covectors can build a rank (0,2) tensor. Two vectors can build a rank (2,0) tensor. Does that help? What timestamp in the lecture is confusing to you?

    • @surajdevadasshankar5270 · 4 years ago

      @@XylyXylyX Sorry if I'm getting this wrong; so if I were mapping a vector of rank 1 to a real number, I'd use a tensor of rank 1? I.e., only a single covector from the dual space would serve as the tensor. Am I wrong?

    • @XylyXylyX · 4 years ago

      suraj devadas shankar That is correct. The only quibble I have is with the word “rank”. Often rank is used to mean the total number of vectors and covectors combined to make a tensor. But in this class we define the rank as an ordered pair, so V* @ V* is a (0,2) tensor, V @ V is a (2,0) tensor, and V @ V* is a (1,1) tensor. Some would call the first two “rank 2 tensors” and they would not have a rank defined for mixed tensors. So, to your question, V* is a (0,1) tensor that maps an element of V to R.

    • @surajdevadasshankar5270 · 4 years ago

      @@XylyXylyX Great, that clears it up. Thank you! I just wanna say, I've just completed my bachelor's and I still understand your lectures in spite of never coming across tensors in my life before. Thank you and keep up the great work! :D

  • @bhavyasingh355 · 4 years ago

    I got very confused in this lecture, and after reading the comments it was completely a mess. Can you let me know what the problem is?
    By the way, thank you for your great lectures.

    • @XylyXylyX · 4 years ago +1

      Thank you for your kind comment, despite your struggle with this lecture. This lecture and the one on tensor contraction are my two most problematic lectures. Can you tell me exactly what the confusion is? Then maybe I can address it. What I am trying to do here is actually construct a special kind of vector space called a “tensor product space”. I am doing this by constructing objects called “tensors” that map elements of a Cartesian product space of vectors and covectors to real numbers. It is worth a deep understanding of everything in this lecture, or all that follows is hopeless!

    • @bhavyasingh355 · 4 years ago

      I think around 15:23 you have used superscripts for vectors instead of subscripts, and vice versa.

  • @DrIlyas-sq7pz · 8 years ago

    Dear XylyXylyX, the Cartesian product of two sets is a set of ordered pairs. I think of it this way: if one of the sets is a space, the Cartesian product with another gives a point in a higher-dimensional space. But I wonder what V cross V would be. Can we just consider one V as a set and V cross V as ordered pairs? What would be the sense of that? What is the Cartesian product of a whole vector space?

    • @XylyXylyX · 8 years ago

      m ilyas V x V is the set of all ordered pairs where the first member of the pair is a point from V and the second member is also a point from V. If x and y are both members of V then (x,y) is a member of the set V x V.

    • @DrIlyas-sq7pz · 8 years ago

      Thank you very much. Would you mind if I ask one more thing? Is there an easy way to make sense of V x V and V@V on some manifold, instead of just abstract mathematics? I am a physicist and, to make ideas clear, I always try to think of some physical example.

    • @XylyXylyX · 8 years ago +1

      m ilyas I don't think so. We will show how electrodynamics can be understood as "n-forms" soon, and there is some visualization there, but those pictures are unreliable, in my opinion, and they are not for all tensors, just n-forms. I would say you, as a physicist, should embrace the abstract math. If you said "I'm an engineer" I wouldn't ask you to do this, but as physicists we must think more abstractly. Might as well start now! :)

  • @debendragurung3033 · 6 years ago

    When you say a tensor product (B⊗A) of the dual vector space V*xV, that maps a pair of elements of the vector space to the reals, what kind of operation is involved after the inner product? Is it (B,A)(v,w) = (B.v)(A.w), OR is it (B.v)+(A.w)?

    • @XylyXylyX · 6 years ago

      debendra gurung Hmmm.... I'm not sure your question is formed correctly. The tensor product space that maps two vectors to R is V*xV*, not V*xV. If an element of V*xV*, call it A x B, is fed two vectors v and w, then (A x B)(v,w) = < A, v >< B, w >, i.e. the product of the two real numbers, not the sum.

  • @HotPepperLala · 7 years ago

    10:00 What do you mean by v : V* -> R? A vector v \in V, it's not a map?? I think you could do an example of V (x) V.

    • @XylyXylyX · 7 years ago

      rinwhr it is important to understand that the dual of a covector is a vector so vectors are maps taking covectors to the real numbers.

    • @HotPepperLala · 7 years ago

      I think we are in the double dual right now, so I think what you really meant is to write the map v -> g_v, where g_v is an element of the double dual V^** (which you called a "vector"). So here g_v(f) = f(v), where f is in V^*. I'll leave this up in case anyone got confused.

  • @no-one-in-particular · 1 year ago

    Again, the set of tensor products is not a vector space. You need all linear combinations of products, i.e. the set which is SPANNED by tensor products. The video starts with an incorrect definition of the tensor product space. If what you are saying were true, there would be no quantum entanglement.

    • @XylyXylyX · 1 year ago

      I think at this point in the lesson series the students know this. I am certainly not suggesting that a vector space has only a finite number of elements.

    • @no-one-in-particular · 1 year ago

      @@XylyXylyX I don't think you understood my point, I didn't say anything about a finite number of elements. You claim the vector space is formed of all elements like beta tensor gamma with any choice of beta or gamma. This is not true. These are the "simple" or "pure" tensors and do not by themselves form a vector space. You need to include all linear combinations of elements of the form beta tensor gamma. In general the sum of two tensor products is not another tensor product.

    • @XylyXylyX · 1 year ago

      @@no-one-in-particular Yes, I know; that point was addressed fully in the lectures on vector spaces, and that is certainly what I meant. I really don't think that particular gaffe was confusing to anyone, however. I have had plenty of really serious gaffes, though, so thank you for pointing this out. I wasn't clear on exactly what aspect of it was confusing to you…. I thought you implied that I was saying a set of just two members was a vector space. I am quite sure that we all know that all linear combinations are to be included. This lesson is WAY past that point.

  • @jvcmarc · 6 years ago +2

    You make maths boring:
    you generalize too much, run over concepts too quickly, give no examples of usage, and don't connect it to other knowledge we might have (such as in physics). I know you want to abstract it out, but it gets too overwhelming with all the letters going on in this 16-dimensional space; it's easy to get lost, especially for someone who is starting tensors from your videos.

    • @XylyXylyX · 6 years ago +1

      Yes, I do maintain an entirely abstract approach. There are many good texts to get you started on this subject. These lectures are not for beginners; they are for people who want to go over this material a *second* time, actually. Good luck!

    • @moonglaive · 5 years ago

      It's really helpful for those of us who don't have any prior explanation of what the hell is going on with the notation as well.

  • @darkknight3305 · 6 months ago

    Thank you so much

  • @mohamedmoussa9635 · 2 years ago

    Can you have a tensor product of completely different vector spaces? e.g. elements of V*⊗W* which map V x W -> ℝ

    • @XylyXylyX · 2 years ago

      Yes! Not done a lot in elementary GR, but mathematically it is fine.

    • @mohamedmoussa9635 · 2 years ago

      @@XylyXylyX I was thinking of continuum mechanics: you have the initial and deformed configurations of the body. The gradient of the mapping between them would be one of these tensors, I think.

    • @XylyXylyX · 2 years ago

      @@mohamedmoussa9635 Are those really two different vector spaces?

    • @mohamedmoussa9635 · 2 years ago

      @@XylyXylyX I think so; would these not be tangent spaces on different manifolds?