I love how when the "teacher' @5:30 asks "You've all watched chapter 3, right?" the Pi Creature to the left rolls his eyes trying to look normal with that stressed out facial expression, like "nope.." Very meticulous with these videos I love it.
I study in Norway, and our professor recommended this series as a supplement resource to the course. I was pleasantly surprised he's humble enough to do so.
Vet ikke helt hvorfor det gjør han 'humble', videoene har veldig bra kvalitet og er en god måte å få bedre forståelse på, og som du sa er det ikke akkurat som det er faktisk del av pensum. så ville ikke helt sagt det var humble, men var i hvert fall kult gjort.
So there's an isomorphism between vector combination problema (cool plural for problem I made up) and linear combination problema. That said with your coordinate grid (basis vectors) being an axiomatic method of viewing space or information in mind, is there an isomorphism between matrices and tensors? Probably a better question would be, what are tensors?
Chapter 9: Change of Basis Chapter 10: Eigenvectors and Eigenvalues ... ... Chapter 100: A full derivation of Einstein's Field Equations, with applicable examples in cosmology Please?
The Physicist Cuber We're talking about renormalization in quantum electrodynamics, correct? I.e., the process of making infinities magically go away, correct?
Storytime: I watched this video about 6 months ago for the first time when I stumbled across this channel and decided to watch this series (as well as most of the other ones) and I thought. that's cool, doubt I'll ever need to use it but its still cool. Until today, I'm working on a University project where we essentially have to triangulate the GPS coordinates of an object based off of compass bearings from known locations we are doing this over a small enough area that the "grid" of GPS coordinates can be considered rectangular (so the maths is still linear algebra) but sadly stuff doesn't work as nicely as with square coordinates. so after a good few minutes of thinking about how to solve this problem and, getting very frustrated, I took a 5-minute break to rest the brain and BAM!!! I remembered this video and thought "I can just change the base to a rectangular one". So here is a massive thank you for this amazing content that I watch in passing and then next thing I know. I'm thinking back to it using it in a real-world engineering problem, Keep it up dude Love the content. really wish I could support you on Patreon but.....University.
I thought "I wonder what application would make Jennifer use that system instead of the "standard" one?" and BAM I found your comment! Haha thanks for sharing!
I was a student who had 91/100 in Linear Algebra but had no intuition about what was actually behind 2-3 pages long answers. After watching this series, my beliefs about linear algebra have completely changed. I owe you a lot!
I am LITERALLY on the floor having a seizure. My essential bodily functions are shutting down rapidly. The beauty of mathematics is just too overwhelming. Please send help.
6:44 This is what helped me solve the apparent "backwards" translation: Consider Emily, who uses a (1D) grid which is 5 times larger than ours (her 1D basis vector is 5). So, all of her coordinates are 5 times smaller than "our" coordinates for the same vector. 1 * 5 = 5 vector in her language * her basis in our language = vector in our language 5 * 1/5 = 1 vector in our language * inverse of her basis (in our language) = vector in her language
Wow, that's a great image to remember how the transformation works. Thanks a lot! However, I would recommend to write it the other way around (8:35) since matrix multiplication is in general not commutative; when you write it as follows you can extrapolate from it to 2D-space one to one. her/our language = her/our coordinates 5 * 1 = 5 her basis 1D-vector in our language * 1D-vector in her language = same 1D-vector in our language her basis 2D-vectors in our language * 2D-vector in her language = same 2D-vector in our language (as a matrix rerpresenting transformation) 1/5 * 5 = 1 inverse of her basis 1D-vector in our language * 1D-vector in our language = same 1D-vector in her language inverse of her basis 2D-vectors in our language * 1D-vector in our language = same 2D-vector in her language (as a matrix rerpresenting transformation)
You can use dimensional analysis to make it even more obvious! 1 Jennifer vector * (3 our vector / 1 Jennifer vector) = 3 our vector. 3 our vector * (1 Jennifer vector / 3 our vector) = 1 Jennifer vector. I like how you simplified it to 1D :)
All of these videos are such an emotional rollercoaster for me. First the confusion, then the illusion of understanding, then frustration when I realize I didn’t, finally the absolute awe when I get it and see the beauty behind it all
she could ask the same of us. i hat and j hat being perpendicular only really makes sense once we have a dot product, so without that, any (linearly independent) choice of basis vectors should work equally well
While the linear transformation used in the video was indeed a lot more complicated from her perspective, there will be other transformations which are far easier from her perspective then from ours! (This will be a part of next video I believe)
I think people often mistake watching these videos for a full understanding that you would get from an LA course, people just get different things out of it, for example someone who just watches this series might be able to conceptually describe all these concepts, but not be able to actually do the problems associated with them like someone who's taken a linear algebra course, but that second person may have a worse conceptual grasp. they're just different, this isn't a replacement of a uni course, it's more a useful supplement to it.
It surely gives us an easy understanding of the basics, but it is not enough for our own comprehension. Managing to get the point oneself, and then watching this video will give us a much advanced and thorough view of linear algebra: the intrinsic relation between calculation and visualization(intuition). We tend to use our intuition for having the point of mathematical actions, which means that studying this topic isn't enough by only a bunch of videos. It is necessary to make the bridge between the two of ourself.
God I do a course of LA on Coursera using these videos as a teaching material. I could NEVER be able to do this if I wasn’t watching these, since the teachers do a much worse job of explaining than 3b1b does. I would encourage everyone to do this to get the best of both worlds AND support 3b1b financially (as I do)
Finally a teacher that describes every part distinctly and coherently. My teacher just showed the algebraic part of changing basis but you put everything into reality now I fully understand. You are a genius
I have seen this video many times in the past. Now I am in my Ph.D. and whenever I need to refresh my understanding, I always think of Jennifer. Thank you for introducing the ever-complicated Jennifer.
@3Blue1Brown I’m a long-time watcher of this channel, and I love your content. Normally I don’t comment, but my Linear Algebra teacher told us to watch this video to learn the concept! I’m a student at the University of Washington. That’s a testament to your teaching- well done!!
0:34 --> Coordinates of a vector and Scaling Basis Vectors, Coordinate Systems 1:38 --> Different Coordinate System 4:14 ---> Translating Between Coordinate Systems 5:08 ---> Matrices
This series has been the perfect addition to my current linear algebra course at my university. The lectures are mostly just definitions and lots of jargon and its very difficult to grasp on its own. The visual representations in these videos along with your explanations have really opened my mind up to this beautiful subject and I am very grateful.
6:47-6:57. Geometricaly, this matrix transforms our grid into Jennnifer's grid. But Numerically, it's translating a vector from her language to ours. It's the key point, Thanks.
Nah. It doesn't matter. Tell me why he's showing that picture of Jennifer's grid after using inverse change of basis matrix for the rotated matrix? No matter which grid you use. The position of that vector doesn't change. Isn't it?
@@Robocat754 the position is relative you can not tell the position unless you have a grid so if the gird change, the position must change you think it did not change because the screen did not get distorted and it is fixed to our world
I think this part is quite hard to grasp. Specially because the matrix has two meanings in there, and it coincides to be the same matrix. 1. The matrix describes her basis in the standard basis notation (and coincidentally our basis is the same as the standard one) - this is the "her basis" matrix; 2. The matrix also transforms a vector from her basis into our basis, but this is only because our basis is on the standard one - this is the "her basis to our basis" matrix; I think that if our basis wasn't the standard [1,0;0,1] then the matrix for (2), the basis "translation" from her to ours matrix, wouldn't be the same as the matrix for (1), her basis matrix. For example, if our basis were [2,0;0,1], ie. our î axis is stretched by 2 times, the the transformation from her basis into ours would need to take our stretch into account, and thus the matrix representing that transformation would not be the same as her basis matrix.
You sir, have no idea how much this helps me as a math undergrad. I recommend the Calculus series to anyone who listens to me for more than 2 minutes. Now this one on Linear Algebra is a great help for me. You're doing great work and you should know that!
This is really helpful for me when I learn image processing now. For example, if I want to flip over an image from right to left, I found it was difficult in the past to transform the coordinates because the origin of an image is at the left upper corner and i hat points down and j hat points right. After this tutorial, now I have two ways to solve this problem. First I directly look for the transforming matrix in the image coordinate system. Second I translate the image coordinate system to x, y system and look for the transforming matrix in x, y system (this is a more natural way) and finally translate back to the image system. And I find that the first transforming matrix is the same as the composite matrix of A(-1)MA in the second way. So I can really find the amazing power of linear algebra once I can understand its implication behind the numbers.
I love you man. Rigid Body Kinematics, Control theory, Vibrations, Vehicle Dynamics, the majority of my masters boils down to this. Masters professors don't have time to teach the basics or solve fundamental doubts because of time-sensitive courses and frankly because they are basics. That's where you come in, a boon to humanity compensating for all the shitty education prior to this. Utmost respect and admiration for your intellect AND impeccable presentation skills. As soon as I start to earn, I'll make it a point to come back and donate generously. Genius!
I literally watched all the advertisements before all of the videos in this series because you deserve all the advertising dollars. It's the least I can do to thank you for deconstructing all these topics in a visually satisfying way that was not accomplished in school.
Mabelcorn N. Most courses don't have the time / resources. Also this is meant as an addendum to students of linear algebra, not as standalone linear algebra course. How do you visualize that R is an infinitely dimensional Q-vectorspace?
There are other fields besides mathematics, such as electrical engineering which I studies, that are using linear algebra and who do not need to visualize such things. But yes, this will not teach how to perform calculations, but would certainly help understanding many applications where linear algerbra steps in.
@@msergejev yeah... I am learning Lin All for mechanical engineering (also in a top 10 university), but the prof only shows the calculations and does not provide us the understanding of what we are really doing. This series is a blessing for me...
Man I just realized something crazy I recently started studying abstract algebra and I just realized that a linear transformation from some vector space A to some vector space B is an isomorphism (assuming it has an inverse if not its a homomorphism) since its commutes with the operations defined on the set, namely vector addition and scalar multiplication.That really helps me when thinking about different bases it’s kind of like when you have two isomorphic sets when you think of an element in one of the you have an associated element in the other set. Thats kind of what this is like each element in one vector space has another associated vector in another vector space with a different basis.Shits crazy
Reminds me of using commutators to come up with Rubik's cube moves. You will often: perform a sequence of moves, then change a small thing, then perform the inverse of that sequence of moves, then end by inverting the small change. Both are non-commutative so it seems similar.
This sounds indeed similar! Conjugating (thats the name for "multiplying something by something else from the right and the inverse of that fromj the left") appears in more areas than just with matrices
Random story but a great lecturer of mine put quite a bit of emphasis on the "perspective" perspective of conjugation, but when I talked to my tutor about it, she had never thought of it that way... and she was an algebraist. Surprising!
In Stefan Pochmann's methods (both original Pochmann and M2/R2, both typically used for blindsolving), the "setup moves" are a change of basis, essentially. If S is a setup move, and A is an algorithm that switches the desired pieces once set up, then S^-1•A•S denotes the switch of the desired pieces in situ.
This is absolutely the best explanation I have seen on this topic. The visual is world class. I learned this topic the old way without the visual transformation before and I was totally lost.
You know you should receive a Nobel Prize for these contributions to humanity. Through just these videos you have accelerated the progress of technology worldwide. I mean it. I am one of those engineers whose work is now better and faster because of you.
The last topic is a really nice explanation for why similar matrices have same eigenvalues. The eigenvectors are essentially fixed in space for a transformation, the eigenvectors differ because co-ordinate systems are different. But the scaling factor (the eigenvalues) only depends on the transformation, it is independent of the coordinate system. Therefore, similar matrices have same eigenvalues. Essentially similar matrices correspond to writing the same linear transformation in a different coordinate system.
Wow. Once again, just wow. We live in a time where enlightenment is spoon-fed us by incredibly talented conveyors of the complex, for free on a platform that also shows cats dancing. I am most grateful! Engineer, Copenhagen
Im reading Leonard Susskind book on Quantum Mechanics. Its a wonderfully concise book but it assumes that you know linear algebra, so I went to Khan. A couple of months later this serie of videos arrived, thank you!!! Im having a great trip. I understand English, but its really cool that Spanish subtitles are available also, I would recommend it to everyone!
I've been watching the MIT OpenCourseWare lectures on linear algebra with Prof. Strang on TH-cam, then coming here and watching your video on the same material. After following the wonderfully detailed explanations from Prof Strang, your visualizations of the concepts add a new level of understanding. Thanks!
- This video should be the first one to watch for this series -> to find who Jennifer is , we will watch all videos ! - The Jennifer's concept: is like explaining a problem/idea( in professoinal field) to a child. You will need have the child's mind/space (kind of the what the knowlege the child could have - not professional's knowlege) to speak the kid language so that you can talk about that problem/idea to them easily.
Look at vectors while the basis vectors i and j are fixed most of the other vectors are fluid And have both i and j charachter They can also transform themselves if they want to So much to learn for humans from these vectors
I am currently a CS students, the concept of linear algebra has always been confusing me as i do not know what actually what this will be used for, until now that i know this channel, thank you for all the visualization and the way you show us that math is more than just computation. Thank you teacher!
After watching 6:43 for maybe the fifth time this year you finally made that click for me. THANK YOU! Our lecturer uses notation that says we're going from base B to A when it really feels like we're going from A to B. I now see why! Your videos truly are the best!
Thank you SO MUCH! I am in a Linear Algebra class in engineering college, and my professor only ever talks in abstracts and proofs, never explaining anything in layman's terms. I have been working on change of basis for a few weeks now, never fully understanding it, and this video is what finally made it all make sense. I legit almost cried when it all just clicked in my head. So, again, THANK YOU, 3Blue1Brown!
also engineering student studying Linear Alg and i skipped like 3 weeks worths of lectures since our prof is dogshit but now we got an assignment and it's 10% of the grade so I'm relying on 3b1b to help me not get an F 😔
Oh my god, you made it so clear all this story of the matrix applied to the grid or to the vectors.... Will you do some stuff about topology ? It would be great :D
@3Blue1Brown, thanks you so much. Even when I've already knew this things about linear algebra from university it is good to construct intuition of visual represented connections of all this concepts. It is even more nice to know of your plans about topology videos.
The idea of thinking matrices as linear transformations has magnificently improved my understanding of linear algebra. Thanks for your effort, wish you best.
I'm not sure if 3Blue1Brown will be covering this - I don't think it really belongs in this "essence" series - but a really cool question for viewers to ask to play around with this concept is: how does Jennifer compute "dot products" in her coordinate system? The interplay between duality and the concept of change-of-basis is beautiful and important.
I'm glad you brought this up. It was in the original script, but I cut it out to stay, as you said, in the essence. I remember being truly surprised at the fact that the dot product is, in some sense, a choice that we make, and least in the context of an abstract vector space.
Out of my head I could imagine two obvious options: One practically inherits the dot product: You would take the dot product of two of Jennifers vectors by first transforming them back to "normal" vectors and rhen taking the normal dot product. The other option would be to just take tge dot product of Jennifers vectors like normal. There's an interesting difference because with the second option, jennifers basis vectors will always be orthogonal with length 1, while they normally wont with the first
Really well-said! Both interpretations are interesting. Casting aside our "mathematical empathy" suppose we say that our dot product is "right" and hers is "wrong." Then in order for Jennifer to take the "right" dot product, she would have to do what you just said: convert her vectors from her coordinate system to ours before computing. On the other hand, if Jennifer insists that her vectors are, as you say, orthogonal with length 1, then we're at an impasse: she will simply get different answers than we. By the way, you could also ask the question, if we were to "empathize" with Jennifer and agree that her dot product is "right," then you could ask, how would we go about doing that? How could we modify our dot product formula so that we get the same answers she would get? You'll find that there's an interesting difference between the first question and the second: one requires us ultimately to take an inverse matrix. Do you see which one? It's kind of like, if I'm talking to a European (I'm an American) and I want to describe the weather, I'll convert the Fahrenheit to Celsius for them, because neither system is "wrong" per se, but they'll understand it better if I do that.
This is so good!!!! He puts things in that you notice in your second watching! For Example her a vector in her basis is PINK and in our basis is BLUE!!!! (And notice how the matrices that transforms her to ours shift from pink to blue) (And the inverse goes from blue to pink )
@@totheknee There are two ways of finding that inverse: The "Don't ask Me Anything" phrase, which could look like the person just doesn't want to answer whatever you want to ask him. The "There exists a topic which I don't want you to ask me about" phrase, which sound unnatural, but is equivalent (logically speaking) to the previous phrase (even tho It doesn't have the same connotation), which seems like the person just doesn't want to talk about "something", but anything else would be fine. So, if A^-1MA = Ask Me Anything, then the inverse of the entire thing (A^-1MA)^-1 = (A^-1)(M^-1)A means one of two things: I don't want to talk about something, or I don't want to talk about anything. I'm sorry, I'm bored :3.
6:57 actually this matrix transforms "the grid Jennifer sees" into "the grid we see", since everyone sees his own basis as the standard unit grid. In this way, the matrix transforms a vector "in her language" to "our language". This would be a more natrual way to describe it. : )
You taught us a word (A M^-1 A) that looks like a terror organization name could actually gives empathy to the others :) Thanks for creating such content sir
5:32 I love how the left pi student is looking to the side at the question like *cough* Of course I've watched chapter 3 *cough cough*. I have to admit, it's beautiful how you've described similar matrices with empathy. I would have loved it if you had gone into covectors here too; they are natural when you get to this point. I always mix up the way each transformation matrix works... is there any way to remember that more easily? I learned all of this the hard, computing-and-strict way.
I love how you color the brackets at 10:58. I remember this part as one of the more difficult parts of linear algebra, but the way you describe it makes it really easy to understand :)
You cannot believe how thankful I am for the existence of this video. I really hope wherever you are in the world, you have a peaceful life full of wonder and learning and much deserved happiness.
This series is awesome. I am trying to understand it more deeply, and have a question: at 9:50, you say, "How would Jennifer describe this same 90-degree rotation of space?" I think in Jennifer's space, this rotation is not a 90-degree rotation (maybe not even a rotation). Maybe we could say, "How would Jennifer describe a transformation which, from our perspective, is a 90-degree rotation?" Thank you!
I totally agree with you. The part of the video after 9:50 explains how Jennifer would see our 90-degree transformation. But if we want to see a Jennifer-90-degree transformation from our perspective, first we should rotate the general (x, y) vector by 90 degrees, i.e., by multiplying by that 90-degree rotation matrix, [[0, -1], [1, 0]], and then convert it to our grid with A-inverse.
Deepto Chatterjee - maybe her angles are messed up! If you look into an arbitrary vector space (over the real or complex numbers), sometimes you can develop a notion of an _inner product._ An inner product is a generalization of the notion of the dot product. It is an operation that takes two vectors in your vector space as an input and outputs a real number, subject to certain properties. A vector space together with a specified inner product is called an inner product space. In an inner product space, we _define_ the length of a vector to be the square root of the inner product of that vector with itself; symbolically, ||v|| = √(in(v,v)). And we _define_ the angle between two vectors to be the arccosine of the inner product of those vectors divided by the product of the lengths of those vectors; symbolically, θ = arccos(in(v,w)/(||v||∙||w||)). Knowing this, if Jennifer decides to use the "dot product" with respect to her basis vectors (which would be an inner product on R^2), then her basis vectors would appear to her to be orthogonal and of length 1! And our basis vectors would appear to her _not_ to be orthogonal or of length 1. Of course, Jennifer could also choose a different inner product from the "dot product" with respect to her basis vectors, in which case her basis vectors may not be orthogonal or of length 1 using that inner product.
Nope! Here, everyone sees space the same way. A 90 degree rotation to us is a 90 degree rotation to Jennifer. What's different is how she DESCRIBES space. So, she needs a different matrix to describe a 90 degree rotation.
@@rynin8019 6 months late, but from the video, what looks like a 90 degree angle in our space looks like a greater than 90 degree angle in space from her perspective. So while the rotation is equivalent in both coordinate systems, I am not sure we all "see space the same way."
Until now, I've used A^-1 X A without knowing why, even during the math classes at my university. This expression says: take any vector into our perspective, transform it with M, then change the viewpoint back to the original one. So impressive!! Thanks a lot
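A minimal NumPy sketch of that reading of A^-1 M A. The numbers are what I believe the video uses (Jennifer's basis (2, 1) and (-1, 1), a 90-degree counterclockwise rotation M, and an example vector), so treat them as assumptions for illustration:

```python
import numpy as np

A = np.array([[2.0, -1.0],   # columns: Jennifer's basis vectors, written in our coordinates
              [1.0,  1.0]])
M = np.array([[0.0, -1.0],   # 90-degree counterclockwise rotation, in our coordinates
              [1.0,  0.0]])

# The same rotation, expressed in Jennifer's coordinates:
M_her = np.linalg.inv(A) @ M @ A
print(M_her)                          # approximately [[ 1/3, -2/3], [ 5/3, -1/3]]

# Sanity check: rotating a vector gives the same geometric result either way.
v_her = np.array([-1.0, 2.0])         # some vector, written in her coordinates
print(A @ (M_her @ v_her))            # rotate in her language, then translate to ours
print(M @ (A @ v_her))                # translate to ours first, then rotate
```

Reading right to left: A translates her coordinates into ours, M applies the transformation we understand, and A^-1 translates the result back into her language.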
I wouldn't use a computer to compute the inverse. The mechanical, generic computation is complex and computationally expensive, while it's often possible to work out a much simpler and more efficient set of computations on paper -- thus, it often makes more sense to find the inverse matrix by hand and then program the computer to use it. At least, this is how I handled inverting the projection matrix used in 3D rendering.
I think the point he makes is more general. It may be easier for you to invert your matrix by hand, but what if you need to invert a 1000x1000 matrix whose values change depending on a previous set of steps? Do you then have the system print out that matrix and wait for your input to provide it with the new inverted matrix? In reality, you should work to form an equivalent statement that does not require you to invert a matrix wherever possible (find symmetric matrices, maybe) and then bring down the number of computations you'd have to do. However, it is almost never worth inverting these matrices by hand.
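A small sketch of that point, assuming NumPy and a random system purely for illustration -- in practice you solve the linear system instead of forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 1000))   # a matrix produced by earlier steps of some pipeline
b = rng.standard_normal(1000)

x_via_inverse = np.linalg.inv(A) @ b    # works, but does extra work and is less numerically stable
x_via_solve = np.linalg.solve(A, b)     # preferred: solve A x = b directly

print(np.allclose(x_via_inverse, x_via_solve))   # True, up to floating-point error
```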
6:44 I wrapped my head around it by considering active vs passive transformations. - As an active transformation, the matrix maps every vector in our coordinate system, to the corresponding vectors of Jennifer's system, without changing our point of view, so our coordinate system stays constant. The entire grid of vectors has changed on its own, thus the name 'active' transformation. It's like moving an object you're filming without changing the position of an camera. - As a passive transformation, we're changing our point of view backwards, from Jennifer's to our coordinate system, so we're kind of translating everything described in Jennifer's language to our language, without changing the true position of vectors. You can think of it as moving the camera itself while not touching the object. No matter in which way you view the transformation, the math behind it stays pretty much the same. And also I think it'd be a lot easier for us to grasp the 'backwards' phenomenon if you could introduce the above in the video.
It's quite confusing to talk about translating vectors as converting them between coordinate systems. When I think "vector translation", I think about adding vectors together, and thus shifting them. Great video though, I wish I could like it multiple times... really man, you deserve more for this awesome work!
Even the quotation at the beginning makes a lot of sense when you watch the video a second time! You craft your videos so well, Grant! You're amazing!!
Grant says at 6:48: "Geometrically this matrix [[2 1], [-1 1]] transforms our grid into Jennifer's grid, but numerically it's translating a vector described in her language to our language." I think a better interpretation is to think that the matrix [[2 1], [-1 1]] expresses her perspective in our perspective. That is, the matrix does not transform our grid into her grid; rather, it EXPRESSES her grid in our grid system. It is as if her grid is overlaid or superimposed on our grid, and we are reading off from our grid anything that she expresses with her grid. We are still in our grid. A weak analogy would be that we are wearing corrective lenses (i.e., the matrix [[2 1], [-1 1]]) which translate her worldview into our worldview. I think that here we cannot speak of a transformation like before, because here the two grid systems are coexistent and simultaneous. When we adopt this interpretation, everything seems natural and logical.
9:15, it'd be more useful if you made it specific that the 90-degree rotation is a transformation in OUR coordinate system but NOT Jennifer's; if it were in Jennifer's, there would be no need for all that formula at all.
I had the same doubt; I thought, shouldn't Jennifer multiply by the same rotation matrix if she wants the vector in her coordinate system to be rotated by 90 degrees?
This took me a while to figure out too; I think it's one part that could've been better explained. It finally hit me that the two vectors in some arbitrary basis might not be 90 degrees from each other, so the basis transformations will be encoding 'flips' of basis vectors that are NOT 90-degree rotations. To get what a 90-degree flip is, you need to find out what your basis vectors are in terms of her basis and then flip them according to the result.
A linear transformation works within one basis (coordinate system). A change of basis is a transformation from one coordinate system to another. They are similar in that you can use a linear transformation to define your new basis vectors. In a sense they are the same thing; it depends on the context.
Put another way, a change of basis is an _application_ of a linear transformation to change coordinate systems. Sometimes a problem is easier to reason about in a different coordinate system, and a specific linear transformation lets you do that.
0:00 intro 0:13 coordinates as scalars 1:17 change of basis vectors 3:23 the grid is a construct 4:11 how do you translate between coordinate systems? 7:15 what about the other way around? 8:27 in a nutshell 8:51 how to translate a linear transformation 12:22 outro
I love the detail he adds to the videos. Just look at the color transition in the matrix which represents the change in basis. Kudos to his involvement and patience to work on all these details..!!! :)
If both [1, 2] and [-1, 1] are in Jennifer's space, why is their angle not equal to 90 degrees? Does a 90-degree angle in her space not use the same definition as ours?
Great question! Almost all of the time, the angle you would compute naively from a pair of vectors' coordinates in some basis B will change when that same pair of vectors is described with a new basis B'. His next video in this series, the one on eigenvectors and eigenvalues, explains this concept well!
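A quick numerical illustration of that, using what I take to be Jennifer's basis vectors from the video as an assumed example:

```python
import numpy as np

def angle_deg(u, v):
    cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(cos))

b1 = np.array([2.0, 1.0])    # Jennifer's basis vectors, written in our coordinates
b2 = np.array([-1.0, 1.0])
print(angle_deg(b1, b2))     # about 108.4 degrees, measured with our dot product

# In her own coordinates those same vectors are just (1, 0) and (0, 1),
# so dotting the coordinate lists naively makes them look exactly 90 degrees apart.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
print(angle_deg(e1, e2))     # 90.0
```

So the angle the dot-product formula reports depends on which coordinates you feed it.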
This throws a whole new light to what I’d been taught - but had barely conceptually understood - in my mathematical methods in physics class. Thank you so much. Bless you. 💖
Thank you, for making mathematics as elegant as thinking in your first language. :) I'm a huge fan of your channel and how I fall in love with any topic you cover in your videos. I hope to be a teacher like you if and when I become one.
I freaking love ur videos. This makes me wonder what the heck was I and my professor doing in my linear algebra class. I can't even express how much I'm impressed by the intuition I gain by watching these videos. It's like a whole new world!
Wait. You call them eigenvectors in English? That's straight out of German! And "eigenvalues" translates only the last part of the word into English... Interesting. Why is that? Was it a German who standardized this part of mathematics?
The "Eigen" is used in English. This is because Hilbert coined the prefix "Eigen-" for characteristic (intrinsic, coordinate independent) properties. (faql.de/etymologie.html#eigen ) I guess he was the one who really started to figure out "Eigen-"properties on a higher, more abstract level than just linear algebra.
iSquared While mathematicians aren't what Germany is most famous for (other than physicists, but those all had to be good at mathematics too, btw), there certainly are Germans who are famous for their mathematical achievements. Ever heard of Gauß (or, as you would probably write it, Gauss)?
The empathy analogy is probably the best intuition about linear algebra I've ever had. Your course makes up for the equation-memorizing of thousands of students. Thank you.
I love how when the "teacher' @5:30 asks "You've all watched chapter 3, right?" the Pi Creature to the left rolls his eyes trying to look normal with that stressed out facial expression, like "nope.." Very meticulous with these videos I love it.
Wow, I didn't notice 😂😂😂
I was looking for this comment :)
@@paprikar same lol
He's a backbencher 😅
"Yeah... I totally watched it"
I study in Norway, and our professor recommended this series as a supplementary resource for the course. I was pleasantly surprised he's humble enough to do so.
What do you study, and where?
Not quite sure why that makes him 'humble'; the videos are really high quality and a good way to get a better understanding, and as you said, it's not as if they're actually part of the syllabus. So I wouldn't quite say it was humble, but it was a cool thing to do, at least.
Norway 🇳🇴. Math student from Blindern here.
I study in Sweden 🇸🇪🇸🇪🇸🇪
You are playing an important role in my formation. Thanks.
Glad I could help! It's always super rewarding to read a comment like this.
:)
So there's an isomorphism between vector combination problema (cool plural for "problem" I made up) and linear combination problema. That said, with your coordinate grid (basis vectors) being an axiomatic way of viewing space or information in mind, is there an isomorphism between matrices and tensors? Probably a better question would be: what are tensors?
Incredible how the memory of that great artist whose name you bear lives on, even after his death!
Always free.
Chapter 9: Change of Basis
Chapter 10: Eigenvectors and Eigenvalues
...
...
Chapter 100: A full derivation of Einstein's Field Equations, with applicable examples in cosmology
Please?
.... Chapter 200 Renormalization in QED
The Physicist Cuber But this is a mathematics channel, not a magician channel.
WTF?!
The Physicist Cuber We're talking about renormalization in quantum electrodynamics, correct? I.e., the process of making infinities magically go away, correct?
lol xD
Storytime: I watched this video about 6 months ago for the first time, when I stumbled across this channel and decided to watch this series (as well as most of the other ones), and I thought: that's cool, doubt I'll ever need to use it, but it's still cool. Until today. I'm working on a university project where we essentially have to triangulate the GPS coordinates of an object based off compass bearings from known locations. We are doing this over a small enough area that the "grid" of GPS coordinates can be considered rectangular (so the maths is still linear algebra), but sadly stuff doesn't work as nicely as with square coordinates. So after a good few minutes of thinking about how to solve this problem and getting very frustrated, I took a 5-minute break to rest the brain and BAM!!! I remembered this video and thought "I can just change the basis to a rectangular one". So here is a massive thank you for this amazing content that I watch in passing, and then next thing I know I'm thinking back to it and using it in a real-world engineering problem. Keep it up dude, love the content. Really wish I could support you on Patreon but.....University.
👏👏😇
I thought "I wonder what application would make Jennifer use that system instead of the "standard" one?" and BAM I found your comment! Haha thanks for sharing!
BAM im glad i stumbled upon this comment
BAM
BAM
I was a student who got 91/100 in Linear Algebra but had no intuition about what was actually behind the 2-3-page-long answers. After watching this series, my beliefs about linear algebra have completely changed. I owe you a lot!
Grant. Beautiful. Literally tears in my eyes. This isn't just a math education video. It's a valuable contribution to humanity.
Hell yes!
Could you be any more dramatic?
I am LITERALLY on the floor having a seizure. My essential bodily functions are shutting down rapidly. The beauty of mathematics is just too overwhelming. Please send help.
Ah c'mon, maths can be beautiful but don't exaggerate. Tearing up because of it seems a little too dramatic.
It's called Stendhal syndrome: when you see something so beautiful you enter a psychosomatic state. It's a good thing, usually.
6:44 This is what helped me solve the apparent "backwards" translation:
Consider Emily, who uses a (1D) grid which is 5 times larger than ours (her 1D basis vector is 5). So, all of her coordinates are 5 times smaller than "our" coordinates for the same vector.
1 * 5 = 5
vector in her language * her basis in our language = vector in our language
5 * 1/5 = 1
vector in our language * inverse of her basis (in our language) = vector in her language
Thanks
great
Wow, that's a great image to remember how the transformation works. Thanks a lot! However, I would recommend to write it the other way around (8:35) since matrix multiplication is in general not commutative; when you write it as follows you can extrapolate from it to 2D-space one to one.
her/our language = her/our coordinates
5 * 1 = 5
her basis 1D-vector in our language * 1D-vector in her language = same 1D-vector in our language
her basis 2D-vectors in our language * 2D-vector in her language = same 2D-vector in our language
(as a matrix representing the transformation)
1/5 * 5 = 1
inverse of her basis 1D-vector in our language * 1D-vector in our language = same 1D-vector in her language
inverse of her basis 2D-vectors in our language * 2D-vector in our language = same 2D-vector in her language
(as a matrix representing the transformation)
Thanks! I *almost* had it down intuitively at the end of the video, but was still slightly confused. This made it sink in properly :) Well worded!
You can use dimensional analysis to make it even more obvious!
1 Jennifer vector * (3 our vector / 1 Jennifer vector) = 3 our vector.
3 our vector * (1 Jennifer vector / 3 our vector) = 1 Jennifer vector.
I like how you simplified it to 1D :)
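A tiny NumPy version of the 2D case discussed in this thread, assuming Jennifer's basis from the video, (2, 1) and (-1, 1), and an example vector chosen for illustration:

```python
import numpy as np

A = np.array([[2.0, -1.0],     # columns = her basis vectors, written in our language
              [1.0,  1.0]])
A_inv = np.linalg.inv(A)       # our basis vectors, written in her language

v_her = np.array([-1.0, 2.0])  # a vector in her language
v_ours = A @ v_her             # "her basis in our language * vector in her language"
print(v_ours)                  # [-4.  1.]  -> the same vector in our language
print(A_inv @ v_ours)          # [-1.  2.]  -> translated back into her language
```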
All of these videos are such an emotional rollercoaster for me. First the confusion, then the illusion of understanding, then frustration when I realize I didn’t, finally the absolute awe when I get it and see the beauty behind it all
Why does Jennifer has to be so complicated tho
she could ask the same of us. i hat and j hat being perpendicular only really makes sense once we have a dot product, so without that, any (linearly independent) choice of basis vectors should work equally well
From her perspective, we're the ones complicating things.
for her, we are the complicated ones. it's like different units of measure, except in more than one dimension.
While the linear transformation used in the video was indeed a lot more complicated from her perspective, there will be other transformations which are far easier from her perspective than from ours! (This will be part of the next video, I believe.)
Ask women.
Sir you are undoubtedly the best teacher i ever had...
Bro, he's our stud.
My algebra teacher turned off the lights and read from slides. Course notes saved my ass, wish I'd seen this at the time
It's amazing that a free collection of about a dozen short YouTube videos is incomparably more effective than any semester-long university course.
@Swagger Kekker lol yes
I think people often mistake watching these videos for the full understanding you would get from an LA course; people just get different things out of them. For example, someone who just watches this series might be able to conceptually describe all these concepts but not be able to actually do the problems associated with them like someone who's taken a linear algebra course, while that second person may have a weaker conceptual grasp. They're just different: this isn't a replacement for a uni course, it's more a useful supplement to it.
It surely gives us an easy understanding of the basics, but it is not enough on its own for full comprehension. Working through the material oneself and then watching this video gives a much more advanced and thorough view of linear algebra: the intrinsic relation between calculation and visualization (intuition).
We tend to use our intuition to get the point of mathematical operations, which means that studying this topic with only a bunch of videos isn't enough. It is necessary to build the bridge between the two ourselves.
God, I'm doing an LA course on Coursera using these videos as teaching material. I could NEVER do it if I weren't watching these, since the teachers do a much worse job of explaining than 3b1b does. I would encourage everyone to do this to get the best of both worlds AND support 3b1b financially (as I do).
@@MKWiiLuke4TWVery well put, I think it's accurate.
Finally a teacher who describes every part distinctly and coherently. My teacher just showed the algebraic part of changing basis, but you put everything into reality; now I fully understand. You are a genius.
Same here. Our book barely described the intuition behind all this. I had to memorize it all until I found this vid.
isn't milk expensive protein? beans and rice seem cheaper...
I have seen this video many times in the past.
Now I am in my Ph.D. and whenever I need to refresh my understanding, I always think of Jennifer.
Thank you for introducing the ever-complicated Jennifer.
@3Blue1Brown
I’m a long-time watcher of this channel, and I love your content. Normally I don’t comment, but my Linear Algebra teacher told us to watch this video to learn the concept! I’m a student at the University of Washington. That’s a testament to your teaching- well done!!
0:34 --> Coordinates of a vector and Scaling Basis Vectors, Coordinate Systems
1:38 --> Different Coordinate System
4:14 --> Translating Between Coordinate Systems
5:08 --> Matrices
This series has been the perfect addition to my current linear algebra course at my university. The lectures are mostly just definitions and lots of jargon, and it's very difficult to grasp on its own. The visual representations in these videos along with your explanations have really opened my mind up to this beautiful subject, and I am very grateful.
6:47-6:57. Geometrically, this matrix transforms our grid into Jennifer's grid. But numerically, it's translating a vector from her language to ours. That's the key point. Thanks.
Yess very very important point.
Nah. It doesn't matter. Tell me why he's showing that picture of Jennifer's grid after using the inverse change-of-basis matrix for the rotated matrix?
No matter which grid you use, the position of that vector doesn't change, does it?
@@Robocat754
the position is relative
you cannot tell the position unless you have a grid
so if the grid changes, the position must change
you think it did not change because the screen did not get distorted and it is fixed to our world
I think this part is quite hard to grasp, especially because the matrix has two meanings there, and they happen to be the same matrix.
1. The matrix describes her basis in the standard-basis notation (and coincidentally our basis is the same as the standard one) - this is the "her basis" matrix;
2. The matrix also transforms a vector from her basis into our basis, but this is only because our basis is the standard one - this is the "her basis to our basis" matrix;
I think that if our basis weren't the standard [1,0;0,1], then the matrix for (2), the basis "translation" from hers to ours, wouldn't be the same as the matrix for (1), her basis matrix.
For example, if our basis were [2,0;0,1], i.e. our î axis stretched by a factor of 2, then the transformation from her basis into ours would need to take our stretch into account, and thus the matrix representing that transformation would not be the same as her basis matrix.
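A small NumPy sketch of that point; the matrices here (her basis B as in the video, and the stretched "our" basis C from this comment) are just for illustration:

```python
import numpy as np

B = np.array([[2.0, -1.0],   # her basis vectors, written in the standard basis
              [1.0,  1.0]])
C = np.array([[2.0, 0.0],    # "our" basis: i-hat stretched by 2, written in the standard basis
              [0.0, 1.0]])

her_to_ours = np.linalg.inv(C) @ B   # converts her coordinates into our coordinates
print(her_to_ours)                   # [[ 1.  -0.5] [ 1.   1. ]]  -- no longer equal to B

# If "our" basis is the standard one (C = identity), the two matrices coincide:
print(np.linalg.inv(np.eye(2)) @ B)  # equals B
```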
You sir, have no idea how much this helps me as a math undergrad. I recommend the Calculus series to anyone who listens to me for more than 2 minutes. Now this one on Linear Algebra is a great help for me. You're doing great work and you should know that!
Those animations are the most important part of your videos. You are doing a great job of explaining, but the animations deserve their own appreciation.
This is really helpful for me as I learn image processing now. For example, if I want to flip an image from right to left, in the past I found it difficult to transform the coordinates, because the origin of an image is at the upper-left corner, i-hat points down, and j-hat points right. After this tutorial, I now have two ways to solve this problem. First, I directly look for the transformation matrix in the image coordinate system. Second, I translate the image coordinate system to the x-y system, look for the transformation matrix in the x-y system (this is the more natural way), and finally translate back to the image system. And I find that the first transformation matrix is the same as the composite matrix A^(-1)MA from the second way.
So I can really see the amazing power of linear algebra once I understand the implications behind the numbers.
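A sketch of that check in NumPy (just the linear part of the flip, ignoring the translation that keeps pixel indices non-negative); the specific matrices are assumptions following the conventions described in the comment above:

```python
import numpy as np

# x-y coordinates: x to the right, y up. A left-right flip mirrors across the y-axis:
M = np.array([[-1.0, 0.0],
              [ 0.0, 1.0]])

# Image coordinates: i-hat points down, j-hat points right.
# Columns of A = the image basis vectors written in x-y coordinates.
A = np.array([[ 0.0, 1.0],
              [-1.0, 0.0]])

flip_in_image_coords = np.linalg.inv(A) @ M @ A
print(flip_in_image_coords)   # [[ 1.  0.] [ 0. -1.]]  -> row index unchanged, column index negated
```

which matches what you would write down directly in image (row, column) coordinates.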
After all the readings and stuff, you're the only teacher who teaches what it truly means!! Thanks a lot! Now I think I can say "I love math!"
It really is explained very well.
You are making what was my least favorite math subject one of my favorites!
yeah
My thirst from high school is finally being quenched in my late twenties. Maths is less scary now; I feel more confident. Thank you!
I love you man. Rigid Body Kinematics, Control theory, Vibrations, Vehicle Dynamics, the majority of my masters boils down to this. Masters professors don't have time to teach the basics or solve fundamental doubts because of time-sensitive courses and frankly because they are basics. That's where you come in, a boon to humanity compensating for all the shitty education prior to this. Utmost respect and admiration for your intellect AND impeccable presentation skills. As soon as I start to earn, I'll make it a point to come back and donate generously. Genius!
I literally watched all the advertisements before all of the videos in this series because you deserve all the advertising dollars. It's the least I can do to thank you for deconstructing all these topics in a visually satisfying way that was not accomplished in school.
Ok, these videos honestly save my life. I can't believe linear algebra isn't taught this way all the time! 😲
Mabelcorn N.
Most courses don't have the time / resources.
Also, this is meant as an addendum for students of linear algebra, not as a standalone linear algebra course.
How do you visualize that R is an infinite-dimensional Q-vector space?
There are other fields besides mathematics, such as electrical engineering, which I studied, that use linear algebra and do not need to visualize such things. But yes, this will not teach you how to perform calculations, though it would certainly help in understanding many applications where linear algebra steps in.
@@msergejev yeah... I am learning Lin Alg for mechanical engineering (also at a top 10 university), but the prof only shows the calculations and does not give us an understanding of what we are really doing. This series is a blessing for me...
@@weissachpassion It's a blessing for all of us, though for me it came about 10 years too late :)
@@msergejev hmmm, so how would you visualize linear algebra's analogies in EE? Like eigenfunctions, for example?
Amazing! Looking forward with anticipation to understanding eigenvectors. It was really the initial push to watch the series.
I could never ever ever understand/visualize change of basis until I discovered your videos. You sir are a godsend!!
ありがとうございます!
I am grateful that my professor did a terrible job so that I could discover this masterpiece.
I like your perception basis vector
I AM SO PROUD OF MYSELF FOR INTUITING THAT YOU COULD REPRESENT A CHANGE OF BASIS AS A SIMPLE LINEAR TRANSFORMATION OMG
Man, I just realized something crazy. I recently started studying abstract algebra, and I just realized that a linear transformation from some vector space A to some vector space B is an isomorphism (assuming it has an inverse; if not, it's a homomorphism), since it commutes with the operations defined on the set, namely vector addition and scalar multiplication. That really helps me when thinking about different bases. It's kind of like when you have two isomorphic sets: when you think of an element in one of them, you have an associated element in the other set. That's kind of what this is like: each vector in one vector space has an associated vector in another vector space with a different basis. Shit's crazy.
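A quick check of that structure-preservation property in NumPy; the matrix and vectors here are arbitrary, just to illustrate the two conditions:

```python
import numpy as np

T = np.array([[2.0, -1.0],    # any matrix = a linear map between coordinate vector spaces
              [1.0,  1.0]])
v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
c = 4.0

print(np.allclose(T @ (v + w), T @ v + T @ w))   # preserves vector addition
print(np.allclose(T @ (c * v), c * (T @ v)))     # preserves scalar multiplication

# If T is invertible, the map is a bijective homomorphism, i.e. an isomorphism:
print(np.linalg.det(T) != 0)                     # True here, so T has an inverse
```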
This is the part I feel is absolutely awesome... coordinates become vectors and vectors become coordinates, and they span everything.
Reminds me of using commutators to come up with Rubik's cube moves. You will often: perform a sequence of moves, then change a small thing, then perform the inverse of that sequence of moves, then end by inverting the small change. Both are non-commutative so it seems similar.
This sounds similar indeed! Conjugating (that's the name for "multiplying something by something else from the right and by the inverse of that from the left") appears in more areas than just matrices.
Random story but a great lecturer of mine put quite a bit of emphasis on the "perspective" perspective of conjugation, but when I talked to my tutor about it, she had never thought of it that way... and she was an algebraist. Surprising!
In Stefan Pochmann's methods (both original Pochmann and M2/R2, both typically used for blindsolving), the "setup moves" are a change of basis, essentially. If S is a setup move, and A is an algorithm that switches the desired pieces once set up, then S^-1•A•S denotes the switch of the desired pieces in situ.
You might find this interesting: th-cam.com/video/syyK6hTWT7U/w-d-xo.html
@@BlueGiant69202 what is the blue giant your username is namesake to? Is it Ymir?
Thanks!
This is absolutely the best explanation I have seen on this topic. The visual is world class. I learned this topic the old way without the visual transformation before and I was totally lost.
You know you should receive a Nobel Prize for these contributions to humanity. Through just these videos you have accelerated the progress of technology worldwide. I mean it. I am one of those engineers whose work is now better and faster because of you.
The last topic is a really nice explanation of why similar matrices have the same eigenvalues. The eigenvectors are essentially fixed in space for a transformation; their coordinates differ because the coordinate systems are different. But the scaling factors (the eigenvalues) depend only on the transformation; they are independent of the coordinate system. Therefore, similar matrices have the same eigenvalues.
Essentially, similar matrices correspond to writing the same linear transformation in different coordinate systems.
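A quick numerical check of that claim; M and A below are arbitrary choices for illustration (A can be any invertible change-of-basis matrix):

```python
import numpy as np

M = np.array([[3.0, 1.0],     # some transformation, written in our coordinates
              [0.0, 2.0]])
A = np.array([[2.0, -1.0],    # an invertible change-of-basis matrix
              [1.0,  1.0]])

M_similar = np.linalg.inv(A) @ M @ A   # the same transformation in the other coordinates

print(np.sort(np.linalg.eigvals(M)))          # approximately [2. 3.]
print(np.sort(np.linalg.eigvals(M_similar)))  # approximately [2. 3.] -- same eigenvalues
```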
Wow. Once again, just wow. We live in a time where enlightenment is spoon-fed to us by incredibly talented conveyors of the complex, for free, on a platform that also shows cats dancing. I am most grateful!
Engineer, Copenhagen
I'm reading Leonard Susskind's book on Quantum Mechanics. It's a wonderfully concise book, but it assumes that you know linear algebra, so I went to Khan. A couple of months later this series of videos arrived, thank you!!! I'm having a great trip. I understand English, but it's really cool that Spanish subtitles are available too, I would recommend it to everyone!
I've been watching the MIT OpenCourseWare lectures on linear algebra with Prof. Strang on YouTube, then coming here and watching your video on the same material. After following the wonderfully detailed explanations from Prof. Strang, your visualizations of the concepts add a new level of understanding. Thanks!
Your introduction makes me so happy.
- This video should be the first one to watch in this series -> to find out who Jennifer is, we will watch all the videos!
- The Jennifer concept is like explaining a problem/idea (in a professional field) to a child.
You need to take on the child's mind/space (roughly, the knowledge the child could have - not a professional's knowledge) and speak the kid's language so that you can talk about that problem/idea with them easily.
Basis vectors are a social construct. Stop trying to assign them i or j at birth
Nah, the standard basis is special. 0 and 1 are part of the definition of vector spaces.
@@MrCmon113
You frickin racist!
Vectorist not racist
Sorry, but I find transvectors creepy.
Look at vectors: while the basis vectors i and j are fixed, most of the other vectors are fluid
and have both i and j character.
They can also transform themselves if they want to.
So much for humans to learn from these vectors.
I am currently a CS student. The concept of linear algebra had always confused me, as I did not know what it would actually be used for, until I found this channel. Thank you for all the visualization, and for showing us that math is more than just computation. Thank you, teacher!
5:58 Brilliant, such a clear way of thinking about it -- so helpful!
After watching 6:43 for maybe the fifth time this year, you finally made that click for me. THANK YOU! Our lecturer uses notation that says we're going from basis B to A when it really feels like we're going from A to B. I now see why! Your videos truly are the best!
Perfect mix of mathematics and philosophy. I absolutely love it!
Thank you SO MUCH! I am in a Linear Algebra class in engineering college, and my professor only ever talks in abstracts and proofs, never explaining anything in layman's terms.
I have been working on change of basis for a few weeks now, never fully understanding it, and this video is what finally made it all make sense. I legit almost cried when it all just clicked in my head.
So, again, THANK YOU, 3Blue1Brown!
Also an engineering student studying Linear Alg; I skipped like 3 weeks' worth of lectures since our prof is dogshit, but now we've got an assignment and it's 10% of the grade, so I'm relying on 3b1b to help me not get an F 😔
Oh my god, you made it so clear all this story of the matrix applied to the grid or to the vectors....
Will you do some stuff about topology ? It would be great :D
The video I'm planning to do after this series will be related to topology. As to a whole "Essence of topology" series, maybe later.
@3Blue1Brown, thank you so much.
Even though I already knew these things about linear algebra from university, it is good to build intuition from visually represented connections between all these concepts.
It is even nicer to know of your plans for topology videos.
Oh my god, that's great. Looking forward to it.
+3Blue1Brown I am looking forward to it!! :)
The idea of thinking of matrices as linear transformations has magnificently improved my understanding of linear algebra. Thanks for your effort; I wish you the best.
I'm not sure if 3Blue1Brown will be covering this - I don't think it really belongs in this "essence" series - but a really cool question for viewers to ask to play around with this concept is: how does Jennifer compute "dot products" in her coordinate system? The interplay between duality and the concept of change-of-basis is beautiful and important.
I'm glad you brought this up. It was in the original script, but I cut it out to stay, as you said, in the essence. I remember being truly surprised at the fact that the dot product is, in some sense, a choice that we make, at least in the context of an abstract vector space.
I think that deserves a footnote video.
How does she?
Off the top of my head I can imagine two obvious options. One practically inherits the dot product: you would take the dot product of two of Jennifer's vectors by first transforming them back to "normal" vectors and then taking the normal dot product. The other option would be to just take the dot product of Jennifer's coordinate lists as usual. There's an interesting difference, because with the second option Jennifer's basis vectors will always be orthogonal with length 1, while with the first they generally won't be.
Really well-said! Both interpretations are interesting. Casting aside our "mathematical empathy," suppose we say that our dot product is "right" and hers is "wrong." Then in order for Jennifer to take the "right" dot product, she would have to do what you just said: convert her vectors from her coordinate system to ours before computing. On the other hand, if Jennifer insists that her vectors are, as you say, orthogonal with length 1, then we're at an impasse: she will simply get different answers than we do. By the way, you could also ask: if we were to "empathize" with Jennifer and agree that her dot product is "right," how would we go about doing that? How could we modify our dot product formula so that we get the same answers she would get?
You'll find that there's an interesting difference between the first question and the second: one requires us ultimately to take an inverse matrix. Do you see which one?
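If anyone wants to poke at the two options above, here is a rough numpy sketch; the basis entries are the ones I recall from the video, (2, 1) and (-1, 1), and the two vectors are made up purely for illustration:

import numpy as np

# Jennifer's basis vectors, written in our language, as the columns of A
A = np.array([[2.0, -1.0],
              [1.0,  1.0]])

# Two made-up vectors written in Jennifer's coordinates
v_her = np.array([1.0, 2.0])
w_her = np.array([3.0, -1.0])

# Option 1: "inherit" our dot product by translating to our language first
inherited = (A @ v_her) @ (A @ w_her)   # same as v_her @ (A.T @ A) @ w_her

# Option 2: apply the dot-product formula directly to her coordinates
naive = v_her @ w_her

print(inherited, naive)   # 6.0 and 1.0: they differ unless A's columns are orthonormal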
It's kind of like, if I'm talking to a European (I'm an American) and I want to describe the weather, I'll convert the Fahrenheit to Celsius for them, because neither system is "wrong" per se, but they'll understand it better if I do that.
This is so good!!!!
He puts in things that you only notice on your second watch!
For example, a vector in her basis is PINK and in our basis is BLUE!!!! (And notice how the matrix that translates her language to ours shifts from pink to blue, and the inverse goes from blue to pink.)
That A^-1 M A I see in so many places: I finally get what it means =)
This is what's called a similarity transform, I guess!
We used this to convert local coordinates to global coordinates in matrix analysis of structures.
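For what it's worth, here is a minimal numpy sketch of that local-to-global pattern; the 30-degree angle and the diagonal matrix are made up purely to show the shape of the computation, not real structural-analysis data:

import numpy as np

theta = np.deg2rad(30.0)                  # made-up angle between local and global axes
# Columns of A: the local axes expressed in global coordinates
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

M_local = np.array([[3.0, 0.0],           # made-up operator defined in local coordinates
                    [0.0, 1.0]])

# The same operator described in global coordinates: translate in, apply, translate out.
# (Which side carries the inverse depends on which language you want the answer in.)
M_global = A @ M_local @ np.linalg.inv(A)

v_global = np.array([1.0, 2.0])
print(M_global @ v_global)
print(A @ (M_local @ (np.linalg.inv(A) @ v_global)))   # same result, step by step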
The inverse of Ask Me Anything?
@@totheknee There are two ways of finding that inverse:
The "Don't ask Me Anything" phrase, which could look like the person just doesn't want to answer whatever you want to ask him.
The "There exists a topic which I don't want you to ask me about" phrase, which sound unnatural, but is equivalent (logically speaking) to the previous phrase (even tho It doesn't have the same connotation), which seems like the person just doesn't want to talk about "something", but anything else would be fine.
So, if A^-1MA = Ask Me Anything, then the inverse of the entire thing (A^-1MA)^-1 = (A^-1)(M^-1)A means one of two things: I don't want to talk about something, or I don't want to talk about anything.
I'm sorry, I'm bored :3.
Just found this channel. It changed my life, thank you!
I love your work here, man. How can we donate?
Well, he has a bitcoin address (under "About" on his YouTube channel), so you could send him some coins?
he also has a patreon: www.patreon.com/3blue1brown
Best way nowadays is donating BATs if you use the Brave browser.
I cannot tell you how grateful I am for these videos. You teach all of us what our professors failed to. God bless and Thank you so much 😄
6:57 Actually, this matrix transforms "the grid Jennifer sees" into "the grid we see", since everyone sees their own basis as the standard unit grid. In this way, the matrix translates a vector "in her language" into "our language". This would be a more natural way to describe it. :)
Thanks man, I was confused at this point but your insight cleared up my doubt.
You taught us that an expression (A^-1 M A) that looks like the name of a terror organization can actually convey empathy for others :) Thanks for creating such content, sir
I LOVE YOU, YOU JUST GAVE MEANING TO MY LIFE, THANK YOU !!!
What a great world we live in today: all you need is just a click to get the information you want. God bless you. Hats off to you, man.
Fun fact: The inverse matrix is our unit vectors ((1,0), (0,1)) expressed in Jennifer's language as columns.
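You can check that in a couple of lines of numpy (assuming I'm remembering the video's basis entries correctly):

import numpy as np

A = np.array([[2.0, -1.0],     # Jennifer's basis vectors in our language, as columns
              [1.0,  1.0]])
A_inv = np.linalg.inv(A)

# Each column of A_inv is one of our unit vectors written in Jennifer's language:
# translating it back with A recovers (1, 0) and (0, 1).
print(A @ A_inv[:, 0])   # -> [1. 0.]
print(A @ A_inv[:, 1])   # -> [0. 1.]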
You just saved my Linear Algebra life. I am finally understanding what two hours of classroom lecture could not teach me. Thank you so much!
5:32 I love how the left pi student is looking to the side to the question like *cough*Of course I've watched chapter 3*cough cough*
I have to admit, it's beautiful how you've described similar matrices with empathy. I would have loved it if you had gone into covectors here too; they are natural once you get to this point. I always mix up the way each transformation matrix works... is there any way to remember it more easily? I learned all of this the hard way, through strict computation.
Haha didn't notice that, that's very cool haha
I have watched this video twice... pausing and playing... and I've spent the whole day thinking about how wonderful your explanation is...
I love how you color the brackets at 10:58. I remember this part as one of the more difficult parts of linear algebra, but the way you describe it makes it really easy to understand :)
You cannot believe how thankful I am for the existence of this video. I really hope that wherever you are in the world, you have a peaceful life full of wonder, learning, and much-deserved happiness.
This series is awesome. I am trying to understand it more deeply, and I have a question: at 9:50 you say, "How would Jennifer describe this same 90-degree rotation of space?" I think that, described in Jennifer's coordinates, this rotation does not look like a 90-degree rotation (it doesn't even look like a rotation). Maybe we could say, "How would Jennifer describe a transformation which, from our perspective, is a 90-degree rotation?" Thank you!
I totally agree with you. The part of the video after 9:50 explains how Jennifer would see our 90-degree rotation. But if we want to see a Jennifer-90-degree transformation from our perspective, we should first rotate the general (x, y) vector by 90 degrees, i.e., multiply by that rotation matrix [[0, -1], [1, 0]], and then convert it to our grid with A-inverse.
Sida Liu No, if you look at the angles it's still 90 degrees, unless her measurement of angles is messed up now
Deepto Chatterjee - maybe her angles are messed up!
If you look into an arbitrary vector space (over the real or complex numbers), sometimes you can develop a notion of an _inner product._ An inner product is a generalization of the notion of the dot product. It is an operation that takes two vectors in your vector space as an input and outputs a real number, subject to certain properties. A vector space together with a specified inner product is called an inner product space.
In an inner product space, we _define_ the length of a vector to be the square root of the inner product of that vector with itself; symbolically, ||v|| = √(in(v,v)). And we _define_ the angle between two vectors to be the arccosine of the inner product of those vectors divided by the product of the lengths of those vectors; symbolically, θ = arccos(in(v,w)/(||v||∙||w||)).
Knowing this, if Jennifer decides to use the "dot product" with respect to her basis vectors (which would be an inner product on R^2), then her basis vectors would appear to her to be orthogonal and of length 1! And our basis vectors would appear to her _not_ to be orthogonal or of length 1.
Of course, Jennifer could also choose a different inner product from the "dot product" with respect to her basis vectors, in which case her basis vectors may not be orthogonal or of length 1 using that inner product.
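One concrete way to write such an inner product down, if you want to experiment: define it on vectors written in OUR coordinates so that Jennifer's basis vectors come out orthonormal. A small numpy sketch, with the basis entries assumed from the video:

import numpy as np

A = np.array([[2.0, -1.0],     # Jennifer's basis vectors (our language) as columns
              [1.0,  1.0]])
A_inv = np.linalg.inv(A)

def inner(v, w):
    # Inner product on R^2 (inputs in OUR coordinates) under which
    # Jennifer's basis vectors are orthonormal.
    return (A_inv @ v) @ (A_inv @ w)

b1, b2 = A[:, 0], A[:, 1]
print(inner(b1, b1), inner(b2, b2), inner(b1, b2))   # 1, 1, 0 (up to floating point)

e1 = np.array([1.0, 0.0])
print(inner(e1, e1))   # about 0.22: our basis vector no longer has length 1 here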
Nope! Here, everyone sees space the same way. A 90 degree rotation to us is a 90 degree rotation to Jennifer. What's different is how she DESCRIBES space. So, she needs a different matrix to describe a 90 degree rotation.
@@rynin8019 6 months late, but from the video, what looks like a 90 degree angle in our space looks like a greater than 90 degree angle in space from her perspective. So while the rotation is equivalent in both coordinate systems, I am not sure we all "see space the same way."
Until now, I've used A^-1 X A without knowing why, even during math class at my university. This expression says: take any vector into our perspective, transform it with X, then change the viewpoint back to the original one. So impressive!! Thanks a lot
I wouldn't use a computer to compute the inverse. The mechanical, generic computation is complex and computationally expensive, while it's often possible to work out a much simpler and more efficient set of computations on paper; thus, it often makes more sense to find the inverse matrix by hand and then program the computer to use it. At least, this is how I handled inverting the projection matrix used in 3D rendering.
I think the point he makes is more general. It may be easier for you to invert your matrix by hand, but what if you need to invert a 1000x1000 matrix whose values change depending on a previous set of steps? Do you then have the system print out that matrix and wait for your input to provide the new inverted matrix? In reality, you should work to form an equivalent statement that does not require you to invert a matrix wherever possible (find symmetric matrices, maybe) and then bring down the number of computations you'd have to do. However, it is almost never worth inverting these matrices by hand.
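The usual numerical pattern for "don't explicitly invert" is to solve a linear system instead of forming the inverse. A quick numpy sketch with a made-up random matrix:

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((1000, 1000))   # made-up large matrix
b = rng.standard_normal(1000)

# Explicitly forming the inverse: more work and typically less numerically stable
x_inv = np.linalg.inv(M) @ b

# Preferred: solve M x = b directly (one factorization, no explicit inverse)
x_solve = np.linalg.solve(M, b)

print(np.allclose(x_inv, x_solve))   # True: same answer, different cost and stability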
6:44 I wrapped my head around it by considering active vs passive transformations.
- As an active transformation, the matrix maps every vector in our coordinate system to the corresponding vector of Jennifer's system without changing our point of view, so our coordinate system stays constant. The entire grid of vectors has changed on its own, thus the name 'active' transformation. It's like moving an object you're filming without changing the position of the camera.
- As a passive transformation, we're changing our point of view backwards, from Jennifer's to our coordinate system, so we're kind of translating everything described in Jennifer's language to our language, without changing the true position of vectors. You can think of it as moving the camera itself while not touching the object.
No matter in which way you view the transformation, the math behind it stays pretty much the same. And also I think it'd be a lot easier for us to grasp the 'backwards' phenomenon if you could introduce the above in the video.
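A tiny numpy illustration of that last point, using the basis matrix from the video and a made-up coordinate pair; the arithmetic is identical, only the reading changes:

import numpy as np

A = np.array([[2.0, -1.0],     # Jennifer's basis vectors in our language, as columns
              [1.0,  1.0]])
v = np.array([-1.0, 2.0])      # just some coordinates

result = A @ v
# Passive reading: v is a vector written in Jennifer's coordinates, and
#   `result` is that same vector rewritten in our coordinates.
# Active reading: v is a vector in our coordinates, and `result` is where the
#   transformation carrying our grid onto Jennifer's grid sends it.
print(result)   # -> [-4.  1.] either way; only the interpretation differs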
This is a nice way of looking at it!
It's quite confusing to talk about translating vectors when what is meant is converting them between coordinate systems. When I think "vector translation", I think about adding vectors together, and thus shifting them.
Great video though, I wish I could like it multiple times... really man, you deserve more for this awesome work!
Yeah me too
Even the quotation at the beginning makes a lot of sense when you watch the video a second time! You craft your videos so well, Grant! You're amazing!!
Thank you for helping me actually grasp this concept. Amazing video!
Grant says at 6:48: "Geometrically, this matrix [[2 1], [-1 1]] transforms our grid into Jennifer's grid, but numerically it's translating a vector described in her language into our language."
I think a better interpretation is to think that the matrix [[2 1], [-1 1]] expresses her perspective in our perspective. That is, the matrix [[2 1], [-1 1]] does not transform our grid into her grid; rather, it EXPRESSES her grid in our grid system. It is as if her grid were overlaid or superimposed on our grid, and we are reading off from our grid anything that she expresses with her grid. We are still in our grid. A weak analogy would be that we are wearing corrective lenses (i.e., the matrix [[2 1], [-1 1]]) which translate her worldview into our worldview.
I think that here we cannot speak of a transformation as before, because here the two grid systems are coexistent and simultaneous.
When we adopt this interpretation, everything seems natural and logical.
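If it helps, the "read it off our grid" picture is literally a linear combination; here is a short numpy check with made-up coordinates (3, 1) for a vector in her language:

import numpy as np

b1 = np.array([2.0, 1.0])      # her first basis vector, read off our grid
b2 = np.array([-1.0, 1.0])     # her second basis vector, read off our grid

# Scale HER rulers by her coordinates (3, 1), read the answer off OUR grid
print(3 * b1 + 1 * b2)                                    # -> [5. 4.]
print(np.array([[2.0, -1.0], [1.0, 1.0]]) @ [3.0, 1.0])   # same thing as a matrix product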
My thoughts exactly.
9:15 It'd be more useful to make it explicit that the 90-degree rotation is a transformation described in OUR coordinate system, NOT Jennifer's; if it were described in Jennifer's, there would be no need for that formula at all.
I had the same doubt; I thought: shouldn't Jennifer multiply by the same rotation matrix if she wants a vector in her coordinate system to be rotated by 90 degrees?
This took me a while to figure out too; I think it's one part that could've been explained better. It finally hit me that the two basis vectors of an arbitrary basis might not be 90 degrees apart from each other, so a transformation written in that basis encodes 'flips' of basis vectors that are NOT themselves 90-degree rotations. To express a 90-degree rotation, you need to find out what your basis vectors are in terms of her basis and then rotate them according to the result.
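For anyone who wants to see the numbers, here is a short numpy sketch of the A^-1 R A recipe with the basis and rotation I believe the video uses (the check vector is made up):

import numpy as np

A = np.array([[2.0, -1.0],     # Jennifer's basis in our language (columns)
              [1.0,  1.0]])
R = np.array([[0.0, -1.0],     # 90-degree counterclockwise rotation, in our language
              [1.0,  0.0]])

# The same rotation written in Jennifer's language: translate in, rotate, translate out
R_her = np.linalg.inv(A) @ R @ A
print(R_her)                   # roughly [[1/3, -2/3], [5/3, -1/3]], not [[0, -1], [1, 0]]

# Sanity check with a made-up vector written in her coordinates
v_her = np.array([1.0, 1.0])
print(np.allclose(A @ (R_her @ v_her), R @ (A @ v_her)))   # -> True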
Grant, you know the (A^-1)(M)(A) explanation you added at the end. You are truly something else!!!
You're blowing my mind here, dude.
MemeLord why are you crying?
too many dead memes
You have no idea what these videos mean to me and the impact they have had on my life. Thanks
So what's the difference between change of basis and linear transformation?
A linear transformation works within one basis (coordinate system). A change of basis is a translation from one coordinate system to another. They are related in that you can use a linear transformation to define your new basis vectors. Mechanically they are the same kind of operation; which one you call it depends on the context.
Hence the intro quote
Put another way, a change of basis is an _application_ of a linear transformation to change coordinate systems. Sometimes a problem is easier to reason about in a different coordinate system, and a specific linear transformation lets you do that.
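One classic instance of "easier in a different coordinate system", sketched with a made-up matrix (this is essentially a preview of the eigenvector chapter mentioned elsewhere in the comments):

import numpy as np

M = np.array([[3.0, 1.0],      # made-up matrix whose 20th power we want
              [0.0, 2.0]])

# Change to the eigenbasis: there M acts diagonally, so applying it repeatedly
# is just raising two numbers to a power.
eigvals, P = np.linalg.eig(M)  # columns of P = eigenbasis, written in our language
M20_via_basis_change = P @ np.diag(eigvals ** 20) @ np.linalg.inv(P)

print(np.allclose(M20_via_basis_change, np.linalg.matrix_power(M, 20)))   # -> True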
0:00 intro
0:13 coordinates as scalars
1:17 change of basis vectors
3:23 the grid is a construct
4:11 how do you translate between coordinate systems?
7:15 what about the other way around?
8:27 in a nutshell
8:51 how to translate a linear transformation
12:22 outro
0:04 I usually think the opposite: Mathematics is giving different names to the same thing.
Every Linear Algebra professor should start their lessons with these videos. Outstanding!
"Space itself has no intrinsic grid"
Einstein: "Hold my Energy Stress Momentum Tensor"
Tensors are dependent on the choice of basis though :)
I love the detail he adds to the videos. Just look at the color transition in the matrix which represents the change in basis. Kudos to his involvement and patience to work on all these details..!!! :)
Why on earth does Jennifer need a different basis?
The whole empathy part AND HOW MUCH SENSE THAT MAKES is honestly moving
If both [1, 2] and [-1, 1] are in Jennifer's space, why is their angle not equal to 90 degrees? Does a 90-degree angle in her space not use the same definition as ours?
Great question!
Almost all of the time, the angle you compute by plugging the coordinates of a pair of vectors (with respect to some basis B) into the standard dot-product formula will change when that same pair is described with a new basis B'. The vectors themselves don't move; it's the naive coordinate formula that is basis-dependent.
His next video in this series, the one on eigenvectors and eigenvalues, explains this concept well!
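You can see the effect directly in numpy; here the two vectors are Jennifer's own basis vectors, whose entries I'm taking from the video:

import numpy as np

def formula_angle(u, v):
    # The angle you get by plugging coordinates into the standard dot-product formula
    return np.degrees(np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v))))

A = np.array([[2.0, -1.0],     # Jennifer's basis vectors in our language, as columns
              [1.0,  1.0]])

# In her own coordinates, her basis vectors are (1, 0) and (0, 1):
print(formula_angle(np.array([1.0, 0.0]), np.array([0.0, 1.0])))   # 90.0

# The same two vectors written in our coordinates:
print(formula_angle(A[:, 0], A[:, 1]))   # about 108.4: the formula is basis-dependent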
This throws a whole new light to what I’d been taught - but had barely conceptually understood - in my mathematical methods in physics class. Thank you so much. Bless you. 💖
Yep. Me. Loner with no life watching linear algebra on Christmas.
Man, you are next level stuff. Literally impressed by your genius. Hands down! I wish I had teachers like you. Take a bow. Massive respect.
I love you.
Thank you, for making mathematics as elegant as thinking in your first language. :)
I'm a huge fan of your channel and how I fall in love with any topic you cover in your videos. I hope to be a teacher like you if and when I become one.
3:20 I knew the French were behind all this nonsense!
You literally are one of the best teachers in the world... We need more of your videos, keep up the good work!!
The subtitles don't match the speech in this video (13)
I freaking love your videos. This makes me wonder what the heck my professor and I were doing in my linear algebra class. I can't even express how impressed I am by the intuition I gain from watching these videos. It's like a whole new world!
Wait. You call them "eigenvectors" in English? That's straight out of German! And "eigenvalues" only translates the last part of the word into English... Interesting. Why is that? Was it a German who standardized this part of mathematics?
The "Eigen" is used in English. This is because Hilbert coined the prefix "Eigen-" for characteristic (intrinsic, coordinate independent) properties. (faql.de/etymologie.html#eigen )
I guess he was the one who really started to figure out "Eigen-"properties on a higher, more abstract level than just linear algebra.
Probably not. There haven't ever been any German mathematicians of note... ever. 😀
iSquared Trolling, aren't you? What about Riemann, Gauß, Euler, Hilbert or Cantor?
+Elchi King (Maddemaddigger) Relax, I was making a harmless joke. Half of mathematics comes from Germany haha.
iSquared While mathematicians aren't what Germany is most famous for (other than physicists, but they all had to be good at mathematics too, btw), there certainly are Germans who are famous for their mathematical achievements. Ever heard of Gauß (or, as you would probably write it, Gauss)?
The empathy analogy is probably the best intuition about linear algebra I've ever had.
Your course makes up for the equation memorization of thousands of students.
Thank you.
3Blue1Brown1Pink
Exactly!
hahaha I didn't see that :P