Excellent video. As for the question at the end, in general the answer should be no. Since the eigenvalues are the same in every basis, the trace and determinant must also be the same, and the trace is not necessarily the trace of the identity, nor is the determinant necessarily 1.
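For anyone who wants to check that numerically, here is a minimal numpy sketch (the matrices A and P below are just made-up examples, not anything from the video): under a change of basis the trace and determinant don't move, so the matrix in the new basis can't be the identity unless every eigenvalue is 1.

```python
import numpy as np

# A: a sample transformation with eigenvalues 2, 3, 5 (upper triangular for convenience).
# P: an arbitrary invertible change-of-basis matrix. Both are made-up examples.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

B = np.linalg.inv(P) @ A @ P   # the same transformation expressed in the new basis

print(np.trace(A), np.trace(B))            # both ~10: trace = sum of eigenvalues, basis-independent
print(np.linalg.det(A), np.linalg.det(B))  # both ~30: determinant = product of eigenvalues
print(np.sort(np.linalg.eigvals(B)))       # still ~[2, 3, 5], so B cannot be the identity matrix
```

Whichever invertible P you pick, the printed trace, determinant, and eigenvalues stay the same.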
I cannot give enough likes for these videos. Are group theory, linear algebra, and tensor analysis sort of different branches, or do they meet at some point?
As for me, I don't know anything about group theory and couldn't spend more than an hour with tensor analysis, yet I have learned to love Linear Algebra. My point is, you can learn linear algebra independently of the other two. I'm not sure, but perhaps you can't learn the other two without Linear Algebra.
So in response to your question at the end, I'm pretty sure that if we had chosen our basis such that the basis vectors were the eigenvectors of the linear transformation, each divided by its corresponding eigenvalue, we would get the identity matrix, assuming a non-defective linear transformation? And with a defective linear transformation, perhaps we supplement (in the basis) the 'specific' eigenvectors with a number of generalized eigenvectors equal to the defect and then divide those by their respective (repeated) eigenvalues?
Wait, actually, no: the identity matrix would have a determinant of 1 and a trace of 3 (for a 3x3 matrix), which wouldn't be consistent with the eigenvalues multiplying to equal the determinant and adding to equal the trace. Hmm... going to eat dinner and ponder this further, but I'm now leaning toward an answer of no, we can't choose a basis that would produce the identity matrix for a general linear transformation.
Sweet, I was right in my 2nd reply. Just watched the video where you answer the question. Still not sure what happens in the case of a defective transformation (in terms of whether it can be represented as a diagonal matrix, but I suspect it can be by using the generalized eigenvectors).
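On the defective case: here is a quick sympy sketch with a made-up defective matrix. The generalized eigenvector gets you to the Jordan form, which is upper triangular with a 1 above the repeated eigenvalue. That is as close to diagonal as it gets; a defective transformation cannot be represented by a diagonal matrix in any basis.

```python
import sympy as sp

# A made-up defective matrix: the single eigenvalue 3 has algebraic multiplicity 2
# but only a one-dimensional eigenspace.
M = sp.Matrix([[5, 4],
               [-1, 1]])

P, J = M.jordan_form()                   # the second column of P is a generalized eigenvector
print(J)                                 # Matrix([[3, 1], [0, 3]]): a Jordan block, not diagonal
print(sp.simplify(P * J * P.inv() - M))  # zero matrix, confirming M = P*J*P^(-1)
```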
It was fun reading your comment. I went through a similar thought at first. My immediate response was to just scale the eigenvectors by the corresponding eigenvalues. So if Av = lambda*v, then let's choose u = lambda*v as the new basis (I was naively hoping the eigenvalue would now be one). The trouble is that Au = A(lambda*v) = lambda*Av = lambda*lambda*v = lambda*u. So since Au = lambda*u, we still don't have an eigenvalue of one (the eigenvalue is still lambda). If we were to instead try u = (1/lambda)*v, then Au = (1/lambda)*Av = (1/lambda)*lambda*v = v = lambda*u. In short, Au = lambda*u, so the eigenvalue still does not change. This goes back to Professor Grinfeld's comment in one of the earlier videos where he said that any multiple of an eigenvector is also an eigenvector, if I recall correctly. Nonetheless, your argument about the trace (and determinant) is more elegant compared to the brute force I used in my reasoning lol
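And a tiny numerical check of that scaling argument (the matrix and vector below are my own toy example): rescaling an eigenvector by any nonzero constant leaves its eigenvalue untouched.

```python
import numpy as np

# Toy example: v is an eigenvector of A with eigenvalue lam = 5.
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])
lam = 5.0
v = np.array([0.0, 1.0])

for c in (lam, 1.0 / lam, -3.7):         # scale by lambda, by 1/lambda, or by anything nonzero
    u = c * v
    print(np.allclose(A @ u, lam * u))   # True each time: A*u = lambda*u, the eigenvalue is still lambda
```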
Go to LEM.MA/LA for videos, exercises, and to ask us questions directly.