Go to LEM.MA/LA for videos, exercises, and to ask us questions directly.
This was an extremely helpful way of explaining the reason eigenvalues and eigenvectors matter. These videos are the best! Coming from a comp sci perspective, tying in the fact that this is reducing down to a problem I can feed to a computer (so I know where this is headed) is super useful. (It's the entire reason I'm doing this hahaha)
A truly mind-blowing explanation of the subject. I had been searching for this kind of clarity for a long time!!!
one of the most profound lessons in linear algebra... I must have missed this 20 years ago
It's all good! Happy you found it now!
Thank you for all your hard work in making these videos.
Thank you so much! I wonder why, after 3(!) linear algebra courses (and several books) no one has characterized eigenvalues in this light. You rock!
Great lectures as always, Professor Grinfeld. Your teaching style always amazes me. There is a brilliant synergy between this series and the one you created for tensors. At this point, I can't help but think of coordinate transformations, the Jacobian and the Christoffel symbol. I think it will all come together in my mind at some point. I could be wrong but I need to think this through a bit more and I would love to be able to put some of the ideas from tensors in this linear algebra framework and then view it from another perspective. :)
Much appreciated, as always.
Most educators do not explain characteristic vectors and values as the simplest (characteristic) basis of the transformation. It took me a long time to find out what these things really are.
I saw Lemma has Gerald Sussman as an advisor. Wonderful man and teacher, good job. I hope Lemma gets bigger and better, seeing wide adoption. And maybe, just maybe, in the future it can integrate a modern SICP curriculum as well, huh?
Thank you Pavel ! HNY!
Thank you, you too!
this is so good and informative
When you choose a basis, is it preferred that the basis vectors are all orthogonal to each other? And are there any cases where you might find orthogonal eigenvectors?
Nicely done! Thanks!
I have to watch these videos again and again. With this video I finally understood what eigenvectors and eigenvalues are useful for. Why didn't I come up with this idea by myself? Because I'm too stupid.
That is so clear !
What is the physical meaning of an eigenvalue equal to 1?
Depends on the problem! For example, in the case of rotation in 3D, this eigenvalue/eigenvector corresponds to the axis of rotation.
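The rotation example above can be checked numerically. Here is a minimal NumPy sketch (the 90-degree rotation about the z-axis is a hypothetical example, not a matrix from the video): the eigenvector belonging to the eigenvalue 1 is the axis the rotation leaves fixed.

```python
import numpy as np

# Rotation by 90 degrees about the z-axis (example matrix, not from the video).
theta = np.pi / 2
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(R)

# Pick out the eigenvalue closest to 1 (the other two are complex, e^{±i theta}).
idx = np.argmin(np.abs(eigenvalues - 1.0))
axis = np.real(eigenvectors[:, idx])

# The rotation leaves this vector unchanged: R @ axis == axis.
# Up to sign, it points along the z-axis.
print(np.round(axis, 6))
```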
thank you!
Thank you!
Excellent
If each e_i is an eigenvector and the eigenvectors are linearly independent, does the vector v create a subspace by itself?
I'm not sure if I'm answering the right question, but yes any vector *v* creates its own subspace consisting of all multiples of the vector *v*. But this has nothing to do with eigenvalues and eigenvectors.
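A quick numerical sketch of that point (the vector chosen here is arbitrary, just for illustration): the set of all multiples of a fixed vector v is closed under addition and scaling, which is what makes it a subspace.

```python
import numpy as np

# An arbitrary example vector; its span is the set of all its multiples.
v = np.array([1.0, 2.0, 3.0])

a = 2.5 * v    # an element of span{v}
b = -4.0 * v   # another element of span{v}

# Closure: sums and scalar multiples land back in span{v}.
s = a + b      # equals -1.5 * v
t = 7.0 * a    # equals 17.5 * v

print(np.allclose(s, -1.5 * v), np.allclose(t, 17.5 * v))
```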
MathTheBeautiful
I read through a definition of a basis and I was mixing up the basis for a subspace and the linear combination of a particular vector with respect to its basis.
So, the subspace of a vector is just a line.
In this video, it looks like the basis is {e1, e2, e3} and v= α1 e1 + α2 e2 + α3 e3 is a unique vector chosen from the space.
In general, do you get the entire space by varying the coefficients on e1, e2, and e3 in all possible ways and collecting all the unique combinations of those vectors? If so, it seems like when you apply T to a particular vector you can find out how the entire space transforms because the space is generated from linear combination of the basis vectors.
And if the basis happens to be the set of eigenvectors for a space, the transformation is made easier because you can apply the corresponding eigenvalues to each of the components of a vector as a shortcut and it will be the same as applying T to each of those components.
--------------------------------------------------------------------------------
Basis Definition.
Let V be a vector space. A linearly
independent spanning set for V is called a basis.
Equivalently, a subset S ⊂ V is a basis for V if any
vector v ∈ V is uniquely represented as a linear
combination
v = r1v1 + r2v2 + · · · + rkvk , where v1, . . . , vk are distinct vectors from S and r1, . . . , rk ∈ R.
--------------------------------------------------------------------------------
A basis for V is a spanning set for V, so every vector in V can be written as a linear combination of basis elements, and that linear combination is unique.
Lemma. Let {v1, v2, ... vn} be a basis for vector space V. Every v ∈ V can be written in exactly one way as v = r1v1 + r2v2 + · · · + rnvn.
Merrill Hutchison Yes to every statement you made!
Small terminology note: a straight line is the *span* of a vector (and only if it's a nonzero vector).
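The eigenbasis shortcut described in the comment above can be sketched numerically. The matrix and eigenvectors below are hypothetical (chosen for illustration, not taken from the video): applying T directly to v gives the same result as scaling each eigen-component of v by its eigenvalue.

```python
import numpy as np

# A hypothetical symmetric matrix whose eigenvectors form a basis.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Its eigenvalues are 1 and 3, with eigenvectors (1, -1) and (1, 1).
eigenvalues = np.array([1.0, 3.0])
E = np.array([[1.0, 1.0],
              [-1.0, 1.0]])  # columns are the eigenvectors e1, e2

# Pick components alpha in the eigenbasis and build v = a1*e1 + a2*e2.
alpha = np.array([2.0, -0.5])
v = E @ alpha

# Applying T directly...
Tv = T @ v

# ...matches the shortcut: scale each component by its eigenvalue,
# then recombine in the eigenbasis.
shortcut = E @ (eigenvalues * alpha)

print(np.allclose(Tv, shortcut))
```

The same idea is why diagonalization T = E diag(λ) E⁻¹ makes repeated application of T cheap: in the eigenbasis, T acts component by component.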
Isn’t this the guy from the office?
Yesh
I'm learning about eigen values from Kane
I'm 5'10
I like your dance.
haha, me too
I watched this for four minutes out of twelve, and saw the presenter not doing any math, but simply using words. Which is great for some folks, but I want to see the points illustrated with some math. This is a peculiar presentation for a math class, and I stopped watching it because I didn't feel I was learning anything from it.
Jeez, the blinking eyes are distracting