What is a tensor anyway?? (from a mathematician)

  • Published 27 Dec 2024

Comments • 661

  • @qm_turtle 3 years ago +962

    I always loved the definition for tensors in many physics textbooks: "a tensor is a mathematical object that transforms like a tensor".

    • @Erik_Caballero 3 years ago +150

      Wait! You mean to say that the floor here is made of floor?!

    • @qm_turtle 3 years ago +34

      @@Erik_Caballero precisely that

    • @theadamabrams 3 years ago +161

      I mean, a "vector" in mathematics is "an element of a vector space", so that tensor definition seems perfect to me.

    • @iheartalgebra 3 years ago +109

      @@theadamabrams And let's not forget that a "vector space" in turn is simply a "space consisting of vectors" xD

    • @jankriz9199 3 years ago +17

my friend always used to say aloud, on hearing this definition, "ok boomer" :D

  • @AndrewDotsonvideos 3 years ago +603

    I'm gonna have to come back to this one for sure

    • @vinvic1578 3 years ago +22

      Hey Andrew so cool to see you here! You should cover this in a drink and derive video ;)

    • @cardboardhero9950 3 years ago +1

      ayyy smart people

    • @axelperezmachado3500 3 years ago +33

Wait. Wasn't a tensor just something that transforms like a tensor? We have been lied to!

    • @captainsnake8515 3 years ago +5

      @@axelperezmachado3500 physicists have subtly different definitions for vectors and tensors than mathematicians do.

    • @rickdoesmath3945 3 years ago

      By the way i loved the video on motivation

  • @hensoramhunsiyar3431 3 years ago +297

    Great exposition. A video on how this concept is related to the concept of tensors in physics and other fields would be great.

    • @k-theory8604 3 years ago +6

      Hey if spamming videos in the comments isn't allowed, I'm happy to not do it in the future, but I just made a short video about this on my channel.

    • @poproporpo 3 years ago +7

      @@k-theory8604 if it is related to the topic at hand, it’s not really shameless advertisement and we appreciate the reference. In this case it’s fine.

    • @athens9k 3 years ago +21

      Tensors as defined in this video are objects defined over a vector space. Tensors used in physics are really tensor fields on a manifold M. For every point p on a manifold, we have a natural vector space given by the tangent space TpM. On each tangent space, we can define some sort of tensor. A physicist's tensor is a smoothly varying field of tensors.
      A nice fact about manifolds is that they can be locally parameterized by Euclidean space. These parameterizations are usually called charts. We can use charts to parameterize all tangent spaces within a neighborhood of any given point. This allows us to write down the components of a smoothly varying field of tensors using the coordinates given by the chart.
Then comes the "tensor transformation law." If some point p sits in the intersection of two different charts, we have two sets of local coordinates coming from each chart. The tensor transformation law simply says that changing from one set of coordinates A to the other set of coordinates B should change the components of the tensor in such a way that the local description of the tensor in the B coordinates matches the transformed version of the tensor in the A coordinates. You can think about this compatibility condition as gluing the local descriptions of the tensor together to form a global tensor field on the whole manifold.
      The concepts are related but not identical. To summarize: a mathematician's tensor is an object defined over a vector space, a physicist's tensor is a field of mathematician's tensors on a manifold.
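To make the transformation law in the comment above concrete, here is a small numeric sketch (my own illustration, not from the video): under a linear chart change with Jacobian A, vector components pick up A, covector components pick up (A^-1)^T, and the pairing w(v) comes out the same in either chart.

```python
# Sketch (illustrative numbers): the "tensor transformation law" for a
# vector (upper index) and a covector (lower index) under a linear
# change of coordinates with Jacobian A.  The pairing w(v) is
# coordinate-free, so both charts must agree on it.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[2.0, 1.0], [0.0, 3.0]]          # chart-change Jacobian (constant here)
A_inv = [[0.5, -1/6], [0.0, 1/3]]     # its inverse

v = [1.0, 2.0]                        # vector components in chart A
w = [4.0, -1.0]                       # covector components in chart A

# Components in chart B: vectors pick up A, covectors pick up (A^-1)^T.
v_B = matvec(A, v)
A_inv_T = [list(col) for col in zip(*A_inv)]
w_B = matvec(A_inv_T, w)

pair_A = sum(wi * vi for wi, vi in zip(w, v))
pair_B = sum(wi * vi for wi, vi in zip(w_B, v_B))
print(abs(pair_A - pair_B) < 1e-12)   # True: w(v) is chart-independent
```

The coefficients transform, the geometric object does not; this is exactly the "cancellation" that makes the pairing invariant.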

    • @mikhailmikhailov8781 3 years ago +1

It's the same thing, this is just the definition.

    • @mikhailmikhailov8781 3 years ago

      It is related by being the same fucking concept.

  • @cicciocareri85 3 years ago +166

    I'm an engineer and I've used tensors and dyadic product in mathematical physics, electromagnetism and continuum mechanics. I'd like to see more of this!

    • @snnwstt 3 years ago

      I second that.
Maybe with the implication of a basis change for the vectors (like Cartesian to oblique, polar (2D), or curvilinear). Surely that will be even greater with derivatives (first and second order).

    • @mastershooter64 3 years ago +3

      lol I thought the most advanced math engineers used was basic math like vector calc and linear algebra and some integral transforms

    • @cicciocareri85 3 years ago +1

      @@mastershooter64 it depends where you study :-)

    • @mastershooter64 3 years ago

      @@cicciocareri85 which field of engineering uses the most advanced math?

    • @snnwstt 3 years ago

@@mastershooter64 The finite element technique often uses simple "reference elements" which are then ... tortured. Depending on the desired "continuity", the numerical derivatives implied by the (partial) differential equation(s) borrow heavily from tensor theory.

  • @TIMS3O 3 years ago +25

Really nice video. One thing that is worth mentioning is that the reason why one wants the relations mentioned is that they are the relations needed to turn bilinear maps into linear maps. This essentially reduces the study of bilinear maps to the study of linear maps, which one already knows from linear algebra. This property of turning bilinear maps into linear maps is called the universal property of the tensor product.
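As an illustration of the universal property described in this comment (an assumed example, not from the video): a bilinear map B on R^2 x R^3 factors through the tensor product as a linear map L, with v ⊗ w realised concretely as the Kronecker product in R^6.

```python
# Sketch of the universal property: every bilinear B: R^2 x R^3 -> R
# is B(v, w) = L(v (x) w) for a *linear* L on the 6-dimensional tensor
# product, modelled here by the Kronecker product.

def kron(v, w):                       # v (x) w as a length-6 list
    return [vi * wj for vi in v for wj in w]

M = [[1, 2, 0],
     [0, -1, 3]]                      # matrix defining B(v, w) = v^T M w

def B(v, w):                          # the bilinear map
    return sum(v[i] * M[i][j] * w[j] for i in range(2) for j in range(3))

L_coeffs = [M[i][j] for i in range(2) for j in range(3)]

def L(t):                             # the induced linear map on R^6
    return sum(c * ti for c, ti in zip(L_coeffs, t))

v, w = [2, 5], [1, 0, 4]
print(B(v, w) == L(kron(v, w)))       # True: B factors through (x)
```

Flattening M into L_coeffs is the whole trick: the bilinear data of B becomes plain linear data once the two arguments are fused into a single tensor.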

    • @schmud68 3 years ago +2

      Yeah, I thought this was very important too

  • @alonamaloh 3 years ago +17

    This is exactly how the tensor product was introduced to me in my first year Linear Algebra class in college, but I think your exposition is more clear than what I remember getting at school. At the time I couldn't understand what this was about at all. The definition through the universal property regarding bilinear maps made a lot more sense to me. You may want to mention it if you make further videos on this subject.

  • @buxeessingh2571 3 years ago +96

    I absolutely want to see a video comparing different types of tensors. I got confused in a graduate algebra class and a graduate geometry class because I was first exposed to tensors in mechanical engineering and then in modern physics.

    • @JM-us3fr 3 years ago +9

      My understanding is that the tensors in physics are just these simple tensors that Michael is talking about, but using its matrix representation instead. But just like you can talk about a whole matrix (a_ij) by only considering a single general entry a_ij, physicists prefer to only talk about a single entry in their tensors, which may also require superscripts.

    • @TheElCogno 3 years ago +6

      I think tensors in physics, at least in general relativity, are really tensor fields from differential geometry.

    • @rtg_onefourtwoeightfiveseven 3 years ago

      ​@@JM-us3fr AFAIK in physics there's an explicit restriction with how tensors transform under coordinate transformations. Is this restriction also implicit in the mathematical definition, or no?

    • @JM-us3fr 3 years ago

      @@rtg_onefourtwoeightfiveseven I’m not sure if these are the same, but I know when you want to change the basis of your vector space, and you want to represent an alternating tensor with respect to the new basis, then you can just multiply the 1st representation by the determinant of the basis transformation matrix.
      But you’re probably referring to the reference frame transformation in general relativity. I’m not quite sure how this works.

    • @schmud68 3 years ago +9

      @@rtg_onefourtwoeightfiveseven Yes it is implicit in the mathematical definition. You start with a differentiable manifold which has local coordinate charts, but you define tensor fields on the manifold independently of coordinate charts. So, they don't depend on the choice of coordinates. Let us work point-wise on the manifold. When you see the tensors in GR, what you are looking at is a choice of coordinate chart which determines a basis for the tangent/cotangent spaces (these are vector spaces associated to points on the manifold), which then allows you to express tensors as linear combinations of tensor products of these basis vectors. The coefficients of these linear combinations are the coefficients you see in Einstein notation in GR; these are, mathematically, NOT tensors.
Now, I said the tensors themselves are completely coordinate independent. What this means is that the basis vectors (determined by a coordinate transformation) will transform inversely to the coefficients so that they cancel out. This is where the Jacobian/inverse Jacobian come in for the coordinate transformations. Finally, the upper and lower indices of tensors really refer to the tangent and cotangent spaces, where the cotangent space is the dual of the tangent space. A rank (1,1) tensor is built from a linear combination of tangent vectors tensor-producted with cotangent vectors. Now, if the manifold has a Riemannian metric, then it can be used to provide a canonical isomorphism between tangent and cotangent spaces (given by raising/lowering indices with the metric in GR notation).
Hopefully this gives some ideas. There is quite a lot going on and I have swept some things under the rug, but if you want to formalise this further you need: vector bundles, tensor products of vector bundles, sections of bundles, and the universal property of tensor products.
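A small numeric sketch of the cancellation this reply describes (my own example, with made-up numbers): the components of a (0,2)-tensor such as a metric transform as J^T g J while vector components transform with J^-1, so the scalar g(v, v) survives a change of chart unchanged.

```python
# Illustrative check: metric components g -> J^T g J, vector components
# v -> J^{-1} v, and the chart-independent scalar g(v, v).

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(r) for r in zip(*X)]

J = [[1.0, 2.0], [0.0, 1.0]]          # Jacobian of the chart change
J_inv = [[1.0, -2.0], [0.0, 1.0]]

g = [[2.0, 0.0], [0.0, 1.0]]          # metric components in chart A
v = [3.0, 4.0]                        # vector components in chart A

g_B = matmul(transpose(J), matmul(g, J))
v_B = [sum(J_inv[i][j] * v[j] for j in range(2)) for i in range(2)]

def length(g_, v_):                   # the scalar g(v, v)
    return sum(v_[i] * g_[i][j] * v_[j] for i in range(2) for j in range(2))

print(length(g, v) == length(g_B, v_B))   # True: Jacobians cancel
```

The lower indices of g each contribute a J while the upper index of each v contributes a J^-1, which is exactly the index-cancellation mentioned above.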

  • @josephmellor7641 3 years ago +5

    I wanted to find this video, but I forgot your name for a second, so I searched for "tensors good place to stop." This video was the first result.

  • @glenm99 3 years ago +2

    This is incidental to the main topic, but I love the way that's worded at 15:36 ... "But now we can extract something from I, at no cost, because something from I is deemed as being equal to zero in the quotient."
    When I took algebra (so many years ago), despite knowing the statements of and being able to prove the theorems, I struggled for a long time with the practical application of this idea. And then, when you put it like that... it's so simple!
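The "something from I is equal to zero in the quotient" idea can be checked concretely. A sketch under the usual identification of R^2 ⊗ R^3 with R^6 via the Kronecker product (my own illustration): a generator of the subspace I lands exactly on the zero vector.

```python
# In the free vector space, (v1+v2)*w, v1*w and v2*w are unrelated basis
# elements; the quotient by I forces the bilinearity relation.  In the
# concrete model of R^2 (x) R^3 as R^6, the generator
# (v1+v2)*w - v1*w - v2*w of I maps to zero.

def kron(v, w):
    return [vi * wj for vi in v for wj in w]

v1, v2, w = [1, 4], [2, -3], [5, 0, 7]
v_sum = [a + b for a, b in zip(v1, v2)]

generator = [a - b - c for a, b, c in
             zip(kron(v_sum, w), kron(v1, w), kron(v2, w))]
print(generator)    # all zeros: "equal to zero in the quotient"
```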

  • @Handelsbilanzdefizit 3 years ago +93

    MachineLearning-Student: "Hey, look at my tensors!"
    Physics-Student: "No, better not. But look at my tensors!"
    Maths-Student: "No, better not."

    • @thenateman27 3 years ago +6

      Don't fear my Levi-Civita operator and my Electromagnetic tensor

    • @mikhailmikhailov8781 3 years ago +1

      @@thenateman27 Electromagnetic 2-form*

    • @minimo3631 4 months ago

@@mikhailmikhailov8781 quotient rings of antisymmetric tensors, exterior algebra of the cotangent space, potato potahto...

  • @sslelgamal5206 3 years ago +14

Nice, thank you! All the guides on the internet only look at it through a physicist's POV; the abstract mathematical construction was missing. I hope this series continues 👍👌

  • @dhaka_mathematical_school 2 years ago +12

    This lecture is insanely beautiful. Please prepare/make a couple more videos on tensor products: (1) Tensors as multilinear operators, (2) tensor component transformation (covariant and contravariant), and (3) tensors as invariant/geometric objects. Thanks so much.

    • @АндрейДенькевич 2 years ago

      @rachmondhoward2125 Insanely beautiful construction example.
      Let's (A)(D)simplicyan be positional natural A-ary SUM(D)-digit number system.
      where A,D -some natural number sequences arbitrary, but equal length.
      Then:
      d-vertex simplex is a (2)(d)simplicyan.
      square is a (3)(2)simplicyan.
      cube is a (3)(3)simplicyan.
      d-dimensional (n-1)*2 -gon is a (n)(d)simplicuan
      pentagon is a (11)(1)simplicyan.
      d-dimensional (n-1)/2 -gon is a (n)(d-1)simplicuan
      Maya/Egypt pyramid is a (2,3)(1,2)simplicyan,
      let's enumerate it:
      100 is a vertex above square.
      022 (=NOT 100) is opposite edge
020 002 are 2 vertices between which lies edge 022=002+020.
      120 (=100+020) is edge between verices 100 and 020.
      102 (=100+002) is edge between verices 100 and 002.
      122 (100+020+002) is infinity(content) located between 3 vertices 100 020 002 and 3 edges 120 102 022.
      from that moment we have 2 variants:
      a)vertex 001 connected to vertex 020.(i choose it)
      b)vertex 001 connected to vertex 002.
      001 (=100+100) is vertex connected to vertex 100.
      021 (=001+020) is edge between vertices 001 and 020.
      010 (=001+001) is vertex connected to vertex 001.
      011 (=001+010) is edge between vertices 001 and 010.
      012 (=002+010) is edge between vertices 002 and 010.
      101 (=001+100) is edge between vertices 001 and 100.
      110 (=100+010) is edge between vertices 100 and 010.
      111 (=100+010+001) is triangle between vertices 100 010 001.
      121 (=100+020+001) is triangle between vertices 100 020 001.
      112 (=100+010+002) is triangle between vertices 100 010 002.
      000 is zero is externity of pyramid .
      As you see:
      1. infinity(content) is like triangle.
2. square doesn't exist, square + volume above it up to vertex 100 is Zero.

    • @Mr.Not_Sure 1 year ago

      @@АндрейДенькевич Your construction has numerous mistakes and inconsistencies:
      1. You count interior of a k-simplex, but you don't count interiors of a square and cube.
      2. Compared with (n-1)/2-gon, your (n-1)*2-gon has 4 times more "gon"s (whatever they are), but its corresponding simplician has n times more indices.
      3. Equation "001=100+100" is false.
4. And what about the dodecahedron? It's a 3-dimensional 12-hedron with 30 edges, but corresponds neither to the (7)(3) nor the (16)(3) simplician.

    • @АндрейДенькевич 1 year ago

      @@Mr.Not_Sure you are right. In equation "001=100+100" i mean that carry rotate.
      Preface. Positional natural a-ary (az^m) d-digit number systems can represent some kind of d-dimensional polytopes in geometry.
      If a=z^m then a-ary d-digit number system equals z-ary (m*d)-digit number system, because z^m^d=z^(m*d).
      For example:
      binary d-digit number system is a d-vertex simplex.(vertices is a numbers with digital root=1, edges is a numbers with digital root=2 and so on).
      3-ary d-digit number system is a d-cube. Let d=2: if 1-chain (2 vertices + 1 edge) shift 1 times we receive 2-cube with 3^2=9 faces, i.e. square.
6-ary d-digit number system is a d-thorus. Let d=2: If 3-ring (1D triangle) shift in 3D 3 times and "press"(glue) first and last 1D triangles we receive 2-thorus with 6^2=36 faces = 3*3 square + (3+3)*3 edges + 3*3 vertices.
      Conclusion.
      All positional natural a-ary (az^m) d-digit number systems with a^d numbers are represented by 3 types of d-polytopes (simplex (binary), cube (odd-ary) , thorus (even-ary) ) with a^d faces . For simplex d means amount of vertex or hyperplanes, for any other polytopes dimension.
      For me as a programmer, it's curious to know that difference in faces between consequent such polytopes is hexagonal numbers.
      To enumerate more complex polytope may be used positional natural A-ary D-digit number system , (A)(D)simplician.
      Where A,D - some natural number sequence arbitrary but equal length. Then 2D Maya/Egypt pyramid is are positional natural (2,3^2)-ary (1,1)-digit number system or (2,3^2)(1,1)simplician. digital root of (1,1)=2 is dimension of pyramid.
      10 is a vertex above square.
      08 (=NOT 10) is one of opposite edges.
06 02 are 2 vertices between which lies edge 08=02+06.
      16 (=10+06) is edge between vertices 10 and 06.
      12 (=10+02) is edge between vertices 10 and 02.
      18 (10+06+02) is triangle, infinity(content) located between 3 vertices 10 06 02 and 3 edges 16 12 08.
      01 (=10+10) is vertex connected to vertex 10.
      07 (=01+06) is edge between vertices 01 and 06.
03 (=01+opposite vertex 02) is vertex connected to vertex 01 and 02 instead of absent edge between 01 and 02.(exception from rule?).
      04 (=01+03) is edge between vertices 01 and 03.
      05 (=02+03) is edge between vertices 02 and 03.
      11 (=01+10) is edge between vertices 01 and 10.
      13 (=10+03) is edge between vertices 10 and 03.
      14 (=10+03+01) is triangle between vertices 10 03 01.
      17 (=10+06+01) is triangle between vertices 10 06 01.
      15 (=10+03+02) is triangle between vertices 10 03 02.
      00 is square(zero).
      18 faces=5 vertices+8 edges+4 triangles+1 square=3^2*2^1 numbers.
      2D dodecahedron has 12 pentagons+30 edges+20 vertices=62 faces= 31^1*2^1 numbers of (2,31)(1,1)simplician (even thorus-like 2-polytope).
      3D dodecahedron has 62+I=63 faces =7*3*3 numbers of (7,3)(1,2)simplician (odd cube-like 3-polytope).
Simplex differs from any other polytope. Simplex is the only "not pressed" polytope.
Outside and Inside parts of this polytope are considered by me to be faces. As a result simplex has 2^d faces (not 2^d-1).
If we increment the dimension they turn from potential into real faces. Outside and Inside parts both are counted.
In other polytopes 1 or 1/2 of dimension is "pressed"("curved") in various ways.
For example in even (thorus-like) 2-dimensional in 3D (not 3-dimensional) Maya/Egypt pyramid Outside part of "pressed" dimension is square, Inside part is triangle,
      so both Outside and Inside parts (interior and exterior) are not counted.
      In odd (cube-like, in A odd numbers only) polytope only Outside part (zero) is pressed in one of hyperplanes, so Outside part is not counted, only interior is counted, .
      And so on.
      "No distinction between numbers and shape. Numbers could not exist without shape." Pythagoras (reincarnation of Euphorbos).

    • @АндрейДенькевич 1 year ago

@@Mr.Not_Sure Let's enumerate the 3D dodecahedron, which is (7,3)(1,2)simplician with 7^1*3^2=63 faces.
      It's evident:
      1. (m+n,..)(1,..)=(m,..)(1,..)+(n,..)(1,..) because of (m+n)^1=m^1+n^1.
      2. (m,..)(i+j,..)=(m,..)(i,..)*(m,..)(j,..) because of m^(i+j)=m^i*m^j.
      3. (m,m,..)(i,j,..)=(m,..)(i+j,..) because of m^i*m^j=m^(i+j).
      4. (m^n,..)(i,..)=(m,..)(n*i,..) because of m^n^i=m^(n*i).
      5. (m,n,..)(i,i,..)=(m*n,..)(i,..) because of m^i*n^i=(m*n)^i.
      6. ( m ,1,..)(i,j,..)=(m..)(i,..) because of m^i*1^j=m^i..
      Use 1..4: (7,3)(1,2)=(4+3,3)(1,2)=(4,3)(1,2)+(3,3)(1,2)=(2,3)(2,2)+(3)(3)=(6)(2)+(3)(3);
      So 3D dodecahedron equals 3D cube (3)(3) + 2D thor (6)(2).
      Lets enumerate 1D chain (3)(1). It's easy.
      Let edge be 2 because 2 is maximal number(infinity)
      Let rest two vertices be 0 1.
      Lets enumerate 2D square (3)(2)=(3)(1)*(3)(1). It's easy.
      Let 2D chain in center be located in 2 of first digit.
      Let rest two 1D chains be located in 0 1 of first digit.
      Lets enumerate 3D cube (3)(3)=(3)(1)*(3)(2). It's easy.
      Let 3D chain in center be located in 2 of first digit.
      Let rest two 2D chains be located in 0 1 of first digit.
      Lets enumerate 1D ring(triangle) (6)(1)=(3)(1)+(3)(1). It's easy.
      Let 1D chain be located in 0..2 of digit
      Let rest inversed 1D chain located in 3..5 of digit.
      Lets enumerate 2D thor (6)(2)=(6)(1)*(6)(1). It's easy.
      Let 3 1D rings be located in 0..2 of first digit
      Let 3 2D rings be located in 3..5 of first digit.
      Lets enumerate 3D dodecahedron (7,3)(1,2)=(6)(2)+(3)(3). It's easy.
      Let 3D cube (3)(3) be located in 0..2 of first digit unmodified as above.
      Let 2D thor (6)(2) be located in 3..6 of first digit in such way:
      (6)(2)=(4,3)(1,2)=(2,3)(1,2)+(2,3)(1,2)= 2D maya/egypt pyramid +2D maya/egypt pyramid.
      Let 2D maya/egypt pyramid be located in 3..4 of first digit
      Let 2D maya/egypt pyramid be located in 5..6 of first digit
      At last we must "press" 6 vertex of cube to triangles.
      As result 6 vertex of cube became interiors of 6 hedrons.
      6 rings of thor are added to close this 6 new interiors.
      We receive 12 hedrons=6 old + 6 new.
      30 edges=12 edges of cube + 3*3 edges + 3*3 square of thor
      20 vertices=(8-6) vertices of cube + 3*3 vertices of 3 1D rings +3*3 edges of 3 2D rings of tohr.

  • @somgesomgedus9313 3 years ago +2

If this construction scares you, don't worry. It's very technical and just made for having these nice properties we want the tensor product to have. What you need when you work with the tensor product is the universal property, which gives you real control over the thing. Like many objects that are constructed for a universal property, the construction is very technical, but once you know that it exists you can almost forget the explicit construction and just concentrate on the universal property!
Great video btw, I love it when I see stuff like that on YouTube!

  • @pablostraub 3 years ago +104

I would have liked the critical concept of "span" to be explained. Also, at the beginning there was a comment that "obviously" the dimension of the star product would be infinite (I was guessing 5 or 6, but not infinite), so I got lost.

    • @TheElCogno 3 years ago +51

      The way I understand it, the span of a set is the set of all finite linear combinations of elements of that set, and the formal product is infinite dimensional because any v*w is a basis vector, since it cannot be expressed as a linear combination of other elements of that type.

    • @f5673-t1h 3 years ago +25

      The dimension is the maximum number of independent vectors you can have. R^3 has at most 3: for example, [1,0,-9], [0,4,0], [1,2,3]. You put any more vectors, and you get relations between them.
      For the star product, every v*w is independent from every r*s (where either component is different). You have infinitely many options for the first component, then infinitely many options for the second, so the dimension is infinite.

    • @brooksbryant2478 3 years ago +4

      Is the formal product essentially just the Cartesian product or is there a difference I’m missing?

    • @f5673-t1h 3 years ago +5

      @@brooksbryant2478 The basis of the star product is just the Cartesian product. But the span is much bigger. You're taking every (finite) linear combination from this basis to make the star product space.

    • @pbroks13 3 years ago +14

The formal product is basically a "dummy" operation. You just stick the elements together, but you are not allowed to do anything to them. So, for example, if * is a formal product, you could NOT say that 2*3=3*2 or anything like that. 2*3 and 3*2 are their own separate things.
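A minimal model of this "dummy" behaviour (my own sketch, not the video's construction): represent elements of the free vector space as coefficient dictionaries keyed by formal pairs, so that no simplification can ever happen.

```python
# Elements of the free vector space are finite linear combinations of
# formal pairs, stored as {pair: coefficient}.  Nothing is simplified,
# so 2*3 and 3*2 are genuinely different basis elements.

from collections import Counter

def star(a, b):                 # the formal product: just a frozen pair
    return Counter({(a, b): 1})

def add(x, y):                  # vector addition of formal sums
    out = Counter(x)
    out.update(y)
    return out

print(star(2, 3) == star(3, 2))             # False: no commutativity
print(add(star(1, 2), star(3, 4)) ==
      star(1 + 3, 2 + 4))                   # False: no bilinearity either
```

Only after quotienting by the subspace I do relations like bilinearity start to hold; here, in the free space, every formal pair stays independent.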

  • @lexinwonderland5741 3 years ago +6

    PLEASE post more!! Especially loved how you went through the formal product and coset (although I would like some more background/elaboration there), and I would love to see the other proofs and the isomorphism to R6 you teased at the end!

  • @Julian-ot8cs 3 years ago +8

    I've been dying to get a good introduction to tensors from a mathematical perspective, and this video really does it for me! This was very fun to watch, and I'd love to see more!!!!

  • @renerpho 3 years ago +28

    Yes, more on this, please! Particularly on tensors in physics and how they do (or don't) relate to your formal definition.

    • @MasterHigure 3 years ago

      The short version is, the "tensors" (letters with upper and lower indices) that appear in physics (at least in general relativity; I know little about, say, material stresses) are the coefficients of the linear combinations we see here. The basis vectors and the tensor product symbol are elided.

    • @k-theory8604 3 years ago +4

      ​@@MasterHigure No, the tensors are not the coefficients.
In physics, we are often concerned with vector fields. Essentially, we imagine these as some sort of vector sitting at each point in space, which vary continuously. However, in order to formally define such a thing, one first needs to define the tangent space to a (manifold) space at every point. Then, one thinks about all of these tangent spaces together as forming one large geometric object, called the tangent bundle. Even more generally, one takes the tensor product of each tangent space with itself a number of times, and then considers that as one big collection of objects.
      Then, tensor fields in physics are just maps from the underlying space to the (generalized) tangent bundle.
      TL;DR: They are related, but the physics tensor is much more complicated. One needs the definition in this video to define the version from physics.

    • @schmud68 3 years ago

      Yeah exactly, then you also need the cotangent bundle and the notion of the universal property of tensor products. Then, after specifying a coordinate chart, you can evaluate everything using the basis provided by the chart and get the usual physics Einstein notation. Quite a journey lol

  • @azeds 3 years ago +1

    Content. Video quality. Voice
    All this deserves 100/100

  • @Fetrovsky 3 years ago +50

It all makes perfect sense, but it would've helped to define the star operation between vectors, and then what a tensor is and why that name was chosen, right around the time it was defined.

    • @kenzou776 2 years ago +1

      What is the star operation between vectors??

    • @mr.es1857 2 years ago +1

      What is the star operation between vectors?? Please develop

    • @philippg6023 2 years ago +3

The small star between two vectors is a 'dummy' operation. So * maps v and w to v * w, which is an element of V * W. Therefore the star is just notation. Then he says all these star products are defined to be linearly independent. Hence all elements v * w are a basis of V * W. (This yields, for example, immediately that V*W is infinite dimensional.) I hope this helps you.
Edit: an example: in the Cartesian product, (a, b) + (c, d) = (a+c, b+d); however, a*b + c*d is not equal to (a+c)*(b+d).

    • @Fetrovsky 2 years ago

      @@philippg6023 So, if I get you correctly, there is no "star" operation per se but it's a representation of any linear operation that has certain characteristics. Is that correct?

    • @philippg6023 2 years ago +1

@@Fetrovsky maybe you mean the correct thing, but you say it a little bit wrong. v * w is just a symbol, which has no meaning at all. Hence the star product is called a formal product.
If you have a good understanding of bases, then you might like this explanation: V*W is a vector space with the Cartesian product as a basis.

  • @jamma246 3 years ago +7

    23:11 the v is missing a subscript j.
Other comment: when introducing the formal product, I think it's a bit odd to call this a "span". I'd call it the "vector space generated by the set VxW" or the "free vector space product of V and W", or something like that. I tend to think of "span" as finding the smallest vector subspace containing a subset of some larger vector space. What's important here is that the vector space sum and scalar product on VxW are completely irrelevant to the definition of V*W (those only come back in when defining the subspace I, and the quotient tensor product). I think highlighting this a bit more at this stage (or taking more time on what you mean by the "span") might have made that easier to follow.

  • @Stobber1981 1 year ago

    I know it's a year later, but this construction of tensors really helped to back-fill the hackneyed explanations given by engineers and physicists all over YT. I especially appreciated the definition of the R2*R3 basis at the end and its comparison to the matrix space basis. I'd LOVE more takes on tensors from your perspective, please!

  • @Racnive 3 years ago +2

    Thank you! This helped demystify tensors for me.
    I appreciate starting from the generic span of infinite possibilities, specifying the basic necessary properties for this object, and showing what emerges from those conditions.

  • @Walczyk 2 years ago +2

    Hell yeah! This will be useful for all of us physicists needing a better understanding of tensor algebra

  • @elramon7038 3 years ago +1

    I just started studying physics and literally everybody is talking about tensors. I luckily stumbled across this video which happened to be really helpful for my understanding. Also if you could do videos on different uses of tensors in different fields, that would be awesome. Keep up the great work!

  • @myt-mat-mil-mit-met-com-trol 3 years ago +9

I barely remember a similar approach from my introductory notes on multilinear algebra, where category-theory-style proofs stood before the formal definition of the tensor product. Despite getting a really good mark in the subject, I never really grasped the idea, or how it is related to the way tensors are approached in other disciplines, say General Relativity. At first I promised myself to study the well-known Spivak book (Calculus on Manifolds), but I couldn't keep the promise, as I got busy with obligations which, say, never required me to fulfill it, nor asked me to use tensors. So now, more than twenty years after, barely remembering, books still on the shelves, obligations not really changed, hopefully I watch YouTube videos enough to keep my reflections alive. So you have in me a keen viewer.

    • @Evan490BC 3 years ago +1

      Read Spivak's book even now. It is the most intuitive (to me at least), abstract exposition of tensors, chains, differential forms, etc I could find.

  • @jacobadamczyk3353 2 years ago +4

    "If you're in I, in the quotient space you're equal to zero" never thought of cosets/quotients like this but very helpful!

  • @Impatient_Ape 5 months ago

    Excellent! Your efforts to emphasize the difference between the formal products and the cosets built from the quotient really help clear up a lot of what gets frequently omitted in many textbooks -- especially in multiparticle quantum mechanics. Really good job sir!

  • @andreyv3609 3 years ago

What a pleasure to finally have a clear, pure (actual) math exposition on the subject. Good job!

  • @muenstercheese 3 years ago +31

    i’ve only heard about tensors from machine learning, and this is blowing my mind. i love this, thanks for the awesome production

    • @alonamaloh 3 years ago +6

      I think the word "tensor" in machine learning is closer to the programming notion of "array": The objects described in this video have more structure, in some sense.

    • @AdrianBoyko
      @AdrianBoyko 3 years ago

      “More structured” or “more constrained”?

    • @alonamaloh
      @alonamaloh 3 years ago +3

      @@AdrianBoyko I think I meant what I said. Tensors in math are associated with the notion of multilinear functions, but in a neural network that does image recognition, the input can be a "tensor" with dimensions width x height x 3(rgb) x minibatch_size. However, as far as I can tell, that's just an array, without any further structure.

    • @drdca8263
      @drdca8263 2 years ago

      @@alonamaloh many of the operations done with them are multilinear maps. Specifically, the parts with the parameters that are trained are often multilinear maps with these, with non-linear parts between such multilinear maps.
      This seems to me somewhat tensor-y ?
      Like, yeah, it is all done with a basis chosen for each vector space, but,
      well, you *can* reason about it in a basis-free way ? Like, for an image with 3 color channels, the space of such images can be regarded as the tensor product of the space of greyscale images with the space of colors,
      or as the tensor product of greyscale 1D horizontal images with greyscale 1D vertical images with colors.
      The space of translation-invariant linear maps from greyscale images to greyscale images (of the same size) (so, convolutions) is a vector space, and I think it is the tensor product of the analogous spaces for the spaces of the horizontal and vertical 1D images.
      This all seems like tensors to me?
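A minimal NumPy sketch of the rank-1 structure described in this reply (the array sizes and color values here are illustrative assumptions, not anything from the video): a color image built as the outer (tensor) product of a greyscale image with a color vector.

```python
import numpy as np

# A tiny 2x2 greyscale "image" and an RGB color vector.
gray = np.array([[0.0, 0.5],
                 [1.0, 0.25]])
color = np.array([1.0, 0.5, 0.0])

# Outer product: tinted[i, j, c] = gray[i, j] * color[c].
# This is a rank-1 element of (greyscale images) tensor (colors),
# stored as a width x height x 3 array.
tinted = np.multiply.outer(gray, color)

print(tinted.shape)  # (2, 2, 3)
```

A general width x height x 3 array is a sum of such rank-1 pieces, which is the sense in which ML "tensors" live in a tensor product space even when no multilinear structure is being used.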

  • @caldersheagren
    @caldersheagren 3 years ago +3

    Love this, straight out of my abstract algebra class - although we did tensor modules instead of vector spaces

  • @ed.puckett
    @ed.puckett 3 years ago +2

    Thank you, your videos are so good. This is my vote for more on this subject.

  • @user_2793
    @user_2793 2 years ago +1

    Exactly what I need before looking at multiparticle states in QM! Thank you.

  • @kennethvanallen4492
    @kennethvanallen4492 1 year ago

    I much prefer the mathematician’s approach to tensors. Thank you for this explanation - it will help me communicate the concept!

  • @stabbysmurf
    @stabbysmurf 3 years ago

    Thank you for this video. I've tried to learn tensors from mathematical physics texts, and I've had an awful time. It helps a great deal to see a mathematician providing a proper structural definition with a level of precision that I can understand.

  • @maximgoncharov9027
    @maximgoncharov9027 2 years ago +1

    Nice explanation, friendly for beginners. The best way to define the tensor product, from my point of view (there are at least 4 different definitions). BTW, as a confusing example of non-equal elements of V*W, I would suggest the elements 0*w, v*0, 0*0 and 0 (these vectors are all different in V*W).
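A toy Python model of the point made in this comment (the dict-based representation is my own illustrative assumption): in the free vector space, each pair of vectors is its own generator, so 0*w, v*0 and 0*0 are all distinct, and none of them is the zero element.

```python
# Model of the free vector space on R^2 x R^3: an element is a finite
# formal sum of generators, stored as {generator: coefficient}.
# Every pair of vectors is its own generator, so nothing "simplifies".

def formal_product(v, w):
    # the formal product v * w as a single generator with coefficient 1
    return {(tuple(v), tuple(w)): 1.0}

zero2, zero3 = (0, 0), (0, 0, 0)   # zero vectors of R^2 and R^3
v, w = (2, 4), (1, 2, 3)

a = formal_product(zero2, w)         # 0 * w
b = formal_product(v, zero3)         # v * 0
c = formal_product(zero2, zero3)     # 0 * 0
zero = {}                            # the zero element: the empty formal sum

# All four are pairwise distinct in the free vector space:
print(a != b and a != c and b != c and a != zero)  # True
```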

  • @FT029
    @FT029 3 years ago +10

    Good video! It's really funny how to get desired properties (scalar mult, distributivity), you just quotient out by a bunch of stuff!
    My abstract algebra professor constantly referred to tensor product as a "universal, bilinear map", and all our homework problems only used that definition-- nothing concrete. So it's great to see some examples! I would be curious to see more on the topic.

    • 1 year ago

      I think getting desired properties via a quotient is a fairly general technique, isn't it? That's also how you can get complex numbers out of pairs of real numbers. Or how you can get interesting groups out of free groups etc.
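One standard way to phrase the complex-number case this reply mentions, in symbols (this gloss is mine, not from the video): the complex numbers arise from real polynomials by exactly this kind of quotient.

```latex
\mathbb{C} \;\cong\; \mathbb{R}[x] \,/\, (x^2 + 1),
\qquad i \;\longleftrightarrow\; x + (x^2 + 1).
```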

  • @christopherlord3441
    @christopherlord3441 1 year ago

    This is really great stuff. Approaching tensors from a physics or engineering point of view is very confusing. As a pure mathematics concept it is much clearer. Keep up the good work.

  • @descheleschilder401
    @descheleschilder401 1 year ago

    Hi there! I love your way of making things clear! Very clear and in outspoken English! You make each step easy to follow, as if it all comes about very naturally! You're a great teacher! So keep up the good work!

  • @cheeseburger118
    @cheeseburger118 3 years ago +6

    I've always been really curious what tensors were (never went deeper into math than multivariable calc). This was a little abstract for me, so I'd love to see how they are actually applied!

  • @burkhardstackelberg1203
    @burkhardstackelberg1203 3 years ago

    Another perspective to look at tensors I like, is the one from multilinear maps. And this video is a good starting point. To every vector space, there exists a dual space of mappings of those vectors to real numbers (or whichever field you choose).
    A direct product of two mapping spaces allows you to map a direct product of vectors to your field, giving you the ability to create a metric of your vector space.
    A direct product of a vector space and its dual space creates very much of what we know as quadratic matrices - mappings of that vector space to itself. But without the need of spelling out components.

  • @RichardTalcott
    @RichardTalcott 3 years ago +2

    EXCELLENT! Please MORE videos on this & related topics.

  • @TheMauror22
    @TheMauror22 3 years ago +1

    Please keep these series going! It was very cool!

  • @dominikstepien2000
    @dominikstepien2000 3 years ago +1

    Great video, I had similar introduction in Differential Forms course, but your explanation was far better than my professor's one. I would love to see more about this. I don't know much about different perspectives on tensor products, but I have heard about usage of them in physics and machine learning and I have no clue how it could be related.

  • @geoffrygifari3377
    @geoffrygifari3377 3 years ago

    Hi Michael, I've had the chance to use tensor products in physics before, and honestly it baffled me. I think you can clear up these confusions:
    1. What do you think are the properties of tensors, from mathematics point of view, that makes it useful in physics?
    2. Do you happen to know why the idea of tensor "transformation" is important at all?
    3. How can we see tensor-vector contraction in mathematics point of view (summing the products of vector and tensor components of equal entries, to get another vector)
    4. Is it correct to say the number of tensor "indices" describes how many vector spaces being combined by tensor product?
    5. Is that last part about isomorphism between tensor product of vector spaces and matrices (also can be built from row vectors) what "representation" is about?
    thank you for your time.

  • @cpiantes
    @cpiantes 3 years ago +1

    Having to go from treating 2(a,b) = (2a,2b) as a distributive operation in a vector space to having 2(a,b) ≠ (2a,2b), with the pairs acting essentially as an infinite set of indices, was mind-blowing.

  • @bonsairobo
    @bonsairobo 1 year ago

    Wow something clicked for me with quotients when you defined the subspace of vectors that you essentially want to equal zero. I had never thought of doing this, but it makes a lot of sense with the fact that elements of the normal subgroup act like the identity when applied to a coset. Very cool.

  • @dchiavez
    @dchiavez 3 years ago +3

    Great video. I have been studying tensors for continuum mechanics and I am definitely interested. It would be great if you could show the relationship between this more general definition and the other ones i saw (linear mapping, and multilinear mapping).

  • @MasterHigure
    @MasterHigure 3 years ago +1

    Strictly speaking, if you want your isomorphism between the tensor product and the matrix space to be "more natural" than the isomorphism with R^6, the 3-vectors should be row vectors. That's the safest way to keep your sanity through any base changes you might encounter, as well as to keep track of how they naturally act on other vectors they might come across. If you keep both as columns, you should instead pair the tensor product with 6x1 matrices.
    For those of us who are used to the index notation from differential geometry, and particularly general relativity, this is like not properly keeping track of which indices are upper and which are lower. That's a recipe for disaster.
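A small NumPy check of the column-times-row pairing this comment describes (the specific vectors are my own example): a column 2-vector times a row 3-vector is a 2x3 matrix, and it agrees with the outer product.

```python
import numpy as np

v = np.array([2.0, 4.0])        # element of R^2
w = np.array([1.0, 2.0, 3.0])   # element of R^3

# Treating v as a column and w as a row, the simple tensor v (x) w
# corresponds to the 2x3 matrix given by matrix multiplication:
M = v.reshape(2, 1) @ w.reshape(1, 3)

# The same matrix via the outer product:
assert np.array_equal(M, np.outer(v, w))
print(M.shape)  # (2, 3)
```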

  • @rickdoesmath3945
    @rickdoesmath3945 3 years ago +2

    10:50 This always happens in mathematics:
    Analysis: Ok, so I have integrals now! I can finally define this seminorm on the set of integrable functions. I hope it's a norm. What the f? The seminorm of this function is 0 but the function is not identically 0, even though it SHOULD BE. Wait, not all is lost: I can quotient my set by the "equal almost everywhere" relation. F yeah! This is literally L^1.
    Algebra: Look at this cute integral domain! Let's build its fraction field! I can use ordered pairs. But wait, something is weird. Look: 2/3 and 4/6 SHOULD BE equal but they ain't. Wait, not all is lost: I can quotient by the equivalence relation a/b ~ c/d when ad = bc. This is literally the smallest extension of an integral domain that is also a field.
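The fraction-field half of this comment can be sketched in a few lines of Python (a toy model under my own naming, not anything from the video): pairs with cross-multiplication as equality.

```python
# Sketch of the fraction-field construction: ordered pairs (a, b)
# with b nonzero, identified when a*d == b*c.
class Frac:
    def __init__(self, a, b):
        if b == 0:
            raise ValueError("denominator must be nonzero")
        self.a, self.b = a, b

    def __eq__(self, other):
        # cross-multiplication: a/b ~ c/d iff a*d == b*c
        return self.a * other.b == self.b * other.a

print(Frac(2, 3) == Frac(4, 6))   # True: distinct pairs, same class
print(Frac(1, 2) == Frac(2, 3))   # False
```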

  • @bandreon
    @bandreon 3 years ago

    Brilliant class! Another of those things you should know but did not really.

  • @samuelbam3748
    @samuelbam3748 3 years ago +3

    Another element which shows the difference between V*W and the tensor product of V and W really well is the vector 0*0, which is just another independent generator and NOT equal to the zero vector of V*W

    • @methatis3013
      @methatis3013 4 days ago

      If I get it correctly, the 0 vector in V*W would be 0(v*w), in other words any formal product of two vectors multiplied by the scalar 0? And the formal product 0*0 is not equal to that, but 0(0*0) is

  • @markkennedy9767
    @markkennedy9767 3 years ago +1

    I would definitely look forward to more stuff on tensors. Your exposition is really good.

  • @bilmoorhead7389
    @bilmoorhead7389 3 years ago +2

    Yes, please, more on this topic.

  • @ntesla66
    @ntesla66 3 years ago

    Fascinating. This concept of the quotient space used for the construction of a subspace is entirely new to me. You are a metallurgist and I am but a simple machinist. More please, I want to see the construction of the old one's bones.

  • @perappelgren948
    @perappelgren948 3 years ago

    @0:25: "...relation of OUR language of tensors (to) those of other fields of mathematics..." 🤣🤣🤣 This really makes me feel special! So many thanks, Prof. P, for undisguising and defusing tensors.

  • @Hank-ry9bz
    @Hank-ry9bz 9 months ago

    4:50 bookmark: examples in R2*R3
    10:55 quotient space, 18:15 basis, 24:00 basis example

  • @mathboy8188
    @mathboy8188 2 years ago +1

    Hear "tensor", think "multi-linear".
    See "tensor", think "coefficients fluidly glide between the factors".

  • @jongraham7362
    @jongraham7362 2 years ago +1

    Great video... I have been trying to understand tensors for many years... I've seen it from the Physics viewpoint and the Abstract Algebra derivation. This helps, thanks. More would be good.

  • @jmafoko
    @jmafoko 9 months ago

    Dr. Penn always makes it look easy using simple examples. I would like to hear more about the quotient space; I think that is key to this structure. By the way, it would be nice to also tie it to differential forms so that we are closer to applications

  • @wongbob4813
    @wongbob4813 3 years ago

    More please! Loved this gentle introduction to tensors :)

  • @AlisterMachado
    @AlisterMachado 3 years ago +1

    This is such a nice view on the Tensor product! Definitely hoping to see more :)

  • @lewisbulled6764
    @lewisbulled6764 3 years ago +4

    I’ve only ever used tensors in continuum mechanics (most notably the stress tensor, but also strain tensors and deformation-gradient tensors), so it was interesting to see a completely different perspective on them!

  • @GordonKindlmann
    @GordonKindlmann 3 years ago +1

    Thanks for making this great video. I am very curious how you would present the idea of covariant vs contravariant indices within a tensor.

  • @scollyer.tuition
    @scollyer.tuition 3 years ago +2

    Very nice explanation. I'd be interested in seeing more along these lines.
    Maybe you could talk about how tensor products convert bilinear maps into linear maps.

  • @noahtaul
    @noahtaul 3 ปีที่แล้ว +2

    I think this could be a good series. If you’re thinking about continuing in a physics sense, or an algebraic geometry sense, I think that would be useful, but I think it would make a good series to put together a set of videos on homological algebra. Exact sequences, hom-tensor adjoint, cohomology/homology of chain complexes, etc.

  • @nosy-cat
    @nosy-cat 2 years ago

    Beautifully done. Watched this today for the second time, and this time around I feel like I really understand what's going on. Would love to see more on this topic!

  • @LucienOmalley
    @LucienOmalley 3 years ago

    Beautiful presentation as usual... thanks for your work !

  • @IustinThe_Human
    @IustinThe_Human 2 years ago

    This video connects so many dots for me; it was amazing

  • @krelly90277
    @krelly90277 3 years ago

    Thank you for this video. Please consider making more videos on tensor theory.

  • @tannerlawson1017
    @tannerlawson1017 3 years ago

    I seldom give a like to videos, but this one absolutely deserves it. Thank you!

  • @mr.es1857
    @mr.es1857 2 years ago +1

    Thanks for your great lecture, thank you everyone for your comments.
    Two questions here:
    What is spanning, or a span?
    What is the meaning of a "co-set"?
    BR,
    Tara,

  • @cesarjom
    @cesarjom 3 years ago

    This is awesome you are exploring this topic of Tensors. Please continue with plans to develop more topics on applications of Tensors, especially in physics and of course general relativity.

  • @FranciscoMNeto
    @FranciscoMNeto 3 years ago

    Michael doing a series on tensors?!
    HELL YEAH

  • @guyarbel2387
    @guyarbel2387 3 years ago +1

    What an amazing video ! Thank you so much :)

  • @gnomeba12
    @gnomeba12 3 years ago

    Yes please. More takes on the tensor product!

  • @evankalis
    @evankalis 3 years ago +2

    I know I'd watch more videos on this topic for sure!

  • @keithvanantwerp3198
    @keithvanantwerp3198 3 years ago

    More please, nice job. Physics/continuum mechanics and then top it off by clarifying the relation to tensors in machine learning.

  • @yf-n7710
    @yf-n7710 1 year ago

    Six months ago I would have been lost here, but I actually understood everything in this video!

  • @i_amscarface_the_legend9744
    @i_amscarface_the_legend9744 3 years ago

    Finally, i understand the meaning of a tensor ! Thank u professor !

  • @Nickelicious7
    @Nickelicious7 3 years ago +2

    Love this, please do more

  • @nelprincipe
    @nelprincipe 3 years ago +2

    Indeed, life is better with more tensors in it. Give us moar!

  • @josephgrossenbacher7642
    @josephgrossenbacher7642 3 years ago

    ... & , finally , physicists begin to understand "tensors" ... !!!
    as usual ,
    very well done , Mike ....

  • @carlosvaldeonrincon799
    @carlosvaldeonrincon799 1 year ago

    This is the best channel I know on YouTube

  • @kapoioBCS
    @kapoioBCS 3 years ago +3

    I am looking forward to more videos on more advanced subjects! I just finished a course on representation theory for my master's.

  • @jimnewton4534
    @jimnewton4534 1 year ago

    Great explanation. I'm looking forward to seeing how your tensor relates to tensors in other fields. In particular in which way does a tensor transform bilinear mappings to linear mappings?

  • @mattc160
    @mattc160 3 years ago +1

    Please do more of this!!!!

  • @QWERTYUIOPASDFGH2675
    @QWERTYUIOPASDFGH2675 3 years ago

    I would love to see more of these kinds of videos and on this subject in particular!

  • @Songfugel
    @Songfugel 3 years ago

    Yes more tensors please. This is one of the topics I personally feel most often confused about particularly with quaternions in linear algebra

  • @vorfreu
    @vorfreu 3 years ago +7

    It would be great if you could make a series on tensors like the one on number theory.
    There are not many pure-mathematics perspectives on this subject.
    Also, great video!

  • @whispercat56235
    @whispercat56235 3 years ago

    Thank you very much for this, I hope to see more tensor related content from you !

  • @furutapark
    @furutapark 3 years ago

    The way I really like to think about higher dimensional tensors is as arising from looking at the automorphism group of the vector space of linear automorphisms over a finite-dimensional vector space. If you have a finite-dimensional square matrix with columns c_i you can fix other square matrices A_i and for a matrix M, you can get another matrix A(M) with columns A_i(c_i). This is a linear transformation and provides an easy-to-understand generalization of 2-dimensional matrices. This is also a "natural" way to think of it in terms of Hom-Tensor adjunction.

  • @nicolasmarin7289
    @nicolasmarin7289 3 years ago

    Bro I've seen tensor products in Quantum Computing for a bit now and I've never fully understood the exact motivation or way in which they are defined formally and I just finally understood (as a math major). Thank you so much.

    • @jacobianmigoto
      @jacobianmigoto 3 years ago

      As a physics grad student, I have no idea what this * operator is.

    • @mediwise2474
      @mediwise2474 2 months ago

      Please explain tensor products in quantum computing

  • @tomhase7007
    @tomhase7007 3 years ago

    The last isomorphism R^2 \otimes R^3 -> M_2x3(R) becomes even more natural if you view the first R^2 as row vectors rather than column vectors; then it is just matrix multiplication. Abstractly this is due to the fact that for vector spaces we have an isomorphism V^* \otimes W \to Hom(V,W), where V^* is the dual space of V and Hom denotes linear maps.
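In symbols, the identification this comment describes sends a simple tensor to a rank-one linear map (the notation is my gloss):

```latex
V^* \otimes W \;\xrightarrow{\;\sim\;}\; \operatorname{Hom}(V, W),
\qquad (f \otimes w)(v) \;=\; f(v)\, w .
```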

  • @ElPikacupacabra
    @ElPikacupacabra 3 years ago

    Thank you for posting this. Great video. Yes, I'd like more on this topic, and on abstract algebra in general.

  • @punditgi
    @punditgi 3 years ago

    Definitely want more of these videos!

  • @WindsorMason
    @WindsorMason 3 years ago +1

    "...And that's a good place--"
    Cliffhanger! We never actually stopped, so the video must have a sequel.

  • @MrsHarryStlyes
    @MrsHarryStlyes 3 years ago +1

    Very cool! I've never seen this sort of construction using quotient vector spaces in my mathematical physics studies. Love to see more on this viewpoint and/or others!

    • @MasterHigure
      @MasterHigure 3 years ago +2

      Abstract algebra is all about quotients. They are the dual of subobjects (subspaces, subgroups, whatever), and that's really saying something about their importance ^^

    • @schmud68
      @schmud68 3 years ago +2

      In my algebra course we defined tensor products as the unique vector space satisfying a universal property (universal property of tensors products). We didn't really cover much of the foundational proofs for this (like existence), but uniqueness follows quickly from the universal property assumption. Then we used quotients of tensor product spaces to create the symmetric product and exterior product
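For reference, the universal property mentioned in this reply can be stated as follows (my phrasing): every bilinear map out of V x W factors uniquely through the tensor product.

```latex
\text{For every bilinear } b \colon V \times W \to U,
\quad \exists!\ \tilde{b} \colon V \otimes W \to U \text{ linear with }
\tilde{b}(v \otimes w) = b(v, w).
```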

  • @antoniodisessa1469
    @antoniodisessa1469 3 years ago +1

    Absolutely clear. I'd hope for a deeper dive into the subject.

  • @musiclover1770
    @musiclover1770 2 years ago +1

    I've heard that tensors are also viewable as multilinear functionals (functions from V x ... x V ----> F, where F is the field over which V is defined). Is there a way to see these two views of tensors as equivalent?

  • @Evan-ne5bu
    @Evan-ne5bu 3 years ago +1

    From a person who has little knowledge of linear algebra, I have to say that the only thing I didn't quite get was what the star symbol in the formal product between two vector spaces is. Is it like an operation with very basic and "weak" properties from VxW to another vector space, which is determined by the operation itself? Or is it something else?

    • @pbroks13
      @pbroks13 3 years ago +1

      Basically just stick elements together, but you are not allowed to do anything with them. So for example 2*8 and 4*4 would be just two different elements and have no relationship to one another.

    • @Evan-ne5bu
      @Evan-ne5bu 3 years ago +1

      @@pbroks13 so this is just a way to attach elements? It's not like an operation or anything else? I have to say that's kind of disappointing

    • @billh17
      @billh17 3 years ago

      ​@@Evan-ne5bu said " I have to say that's kind of disappointing"
      v * w is like defining a structure in a programming language. After having defined the structure, we can define operations on objects of that type.
      For example (the original sketch, tidied up into valid Swift):
      ============================
      struct PreTensorProduct {
          let v: [Int]
          let w: [Int]

          init(v: [Int], w: [Int]) {
              self.v = v
              self.w = w
          }
      }

      func main() {
          let v = [2, 4]
          let w = [1, 2, 3]
          let preTensorProduct = PreTensorProduct(v: v, w: w)
          print("v * w = \(preTensorProduct)")
      }

      main()
      ============================
      The * is a 'constant' function that takes two arguments v and w and returns a new object called the "*" product (or "PreTensorProduct" in the code example above).
      The * is a 'constant' function that takes two arguments v and w and returns a new object called the "*" product (or "PreTensorProduct" in the code example above).