Dr Peyam is amazing! These are such a pleasure to watch.
Next video: A satisfies an arbitrary quadratic equation :-)
The problem P(A)=0 is trivial whenever P splits with simple roots, because then A is diagonalizable. (A-aI)²=0 is not that hard either if you can solve A²=0.
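A minimal numpy sketch of the "simple roots" picture for A² = I (the matrix P below is just an invertible example I picked): any A = P·diag(±1)·P⁻¹ satisfies A² = I.

import numpy as np

P = np.array([[2.0, 1.0],
              [1.0, 1.0]])              # any invertible matrix (example choice)
D = np.diag([1.0, -1.0])                # an involution's eigenvalues are ±1
A = P @ D @ np.linalg.inv(P)

print(np.allclose(A @ A, np.eye(2)))    # True: A² = I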
Sweet job Dr. Peyam! For those who might be scratching their heads about this, Dr. Peyam accidentally said nontrivial instead of trivial at first (unless I misheard or misunderstood him).
Thank you Dr Peyam! Awesome class!
There are also such gem matrices as ±[0,1;1,0], the "flip" matrix, which reverses the order of rows when applied on one side and of columns on the other. More generally, any permutation matrix that just swaps disjoint pairs of coordinates works, since it can be decomposed into smaller blocks of the same form. The general constructions, for any solution A: -A, [1,0;0,A], and [0,A;A,0]. Given A² = I: (-A)² = (-1)²A² = I; [1,0;0,A]² = [1,0;0,A²] = [1,0;0,I] = I; and finally [0,A;A,0]² = [A²,0;0,A²] = [I,0;0,I] = I.
Technically the scalar 1 is itself an instance of the I case, so [1,0;0,A] should be replaced with [A,0;0,B], which squares to [A²,0;0,B²] = [I,0;0,I] = I; the 0's here can be zero blocks of whatever size is needed.
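A quick numpy check of the block constructions above (A and B below are example involutions I chose):

import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])     # the flip matrix, A² = I
B = np.array([[1.0, 0.0], [0.0, -1.0]])    # a reflection, B² = I
Z = np.zeros((2, 2))

diag_block = np.block([[A, Z], [Z, B]])    # [A, 0; 0, B]
anti_block = np.block([[Z, A], [A, Z]])    # [0, A; A, 0]

I4 = np.eye(4)
print(np.allclose(diag_block @ diag_block, I4),   # True
      np.allclose(anti_block @ anti_block, I4),   # True
      np.allclose((-A) @ (-A), np.eye(2)))        # True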
After you mentioned P and P^-1, this gave me an idea which generalizes things much further (without conjugating by P). The three cases: whenever A, A1, A2 are solutions, so are -A, [A1,0;0,A2], and [0,kA;A/k,0] for any nonzero k.
Actually, I found matrices this doesn't cover, e.g. sqrt(1/2) * [1,1;1,-1], which squares to I. An additional condition is needed: (A1 + A2)/sqrt(2) is a solution whenever A1, A2 are solutions with A1A2 + A2A1 = 0, which is much harder to define recursively, since the AB + BA = 0 scenarios are a bit... tricky.
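Checking that example in numpy: with σx = [0,1;1,0] and σz = [1,0;0,-1] (two involutions that anticommute), (σx + σz)/√2 equals sqrt(1/2) * [1,1;1,-1] and is again an involution.

import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

print(np.allclose(sx @ sz + sz @ sx, np.zeros((2, 2))))   # True: they anticommute
H = (sx + sz) / np.sqrt(2)                                # sqrt(1/2) * [1,1;1,-1]
print(np.allclose(H @ H, np.eye(2)))                      # True: H² = I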
I think T = T⁻¹, right? Any linear transformation that is its own inverse is a case. In R^n the orthogonal examples are reflections (across a subspace) and 180° rotations. Reminds me of the Pauli spin matrices.
Yep
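A quick numpy check of that geometric picture in R² (theta below is an arbitrary angle I picked for the mirror line): a reflection and the 180° rotation are both their own inverses, hence satisfy T² = I.

import numpy as np

theta = 0.7   # angle of the mirror line (arbitrary example value)
reflection = np.array([[np.cos(2*theta),  np.sin(2*theta)],
                       [np.sin(2*theta), -np.cos(2*theta)]])
rot180 = np.array([[-1.0, 0.0], [0.0, -1.0]])

print(np.allclose(reflection @ reflection, np.eye(2)),   # True
      np.allclose(rot180 @ rot180, np.eye(2)))           # True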
Hi Peyam, why is this an unlisted video? Your characterization has a nice geometric description.
It will be published at some future time
Very beautiful
Or you can do it using the Jordan canonical form
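For anyone curious, a sketch of that Jordan-form argument (assuming the base field has characteristic ≠ 2):

\[
A^2 = I \;\Longrightarrow\; m_A(x) \mid x^2 - 1 = (x-1)(x+1),
\]
and since the roots \(\pm 1\) are distinct, \(m_A\) has simple roots, every Jordan block is \(1 \times 1\), and
\[
A = P \,\mathrm{diag}(\pm 1, \dots, \pm 1)\, P^{-1}.
\]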
Just a question, but first of all, good video as always.
But what if you are working over a K-vector space where K is a field in which 2 = 0?
Because then you can't show that V is a direct sum, since you are using the fact that 2x = 0 implies x = 0.
So in that case (if K is such that 2 = 0), does A·A = I imply A = I?
Mir Sami What about the matrix [0,1;1,0]?
But in that case you can directly figure out which A works because there are only 16 possibilities for A
There are only 16 possibilities for 2x2 matrices, and only when the base field is Z/2Z. Linear algebra over fields of characteristic 2 is very tricky...
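A brute-force sketch of that claim in plain Python (arithmetic done mod 2): enumerate all 16 matrices over Z/2Z and list those with A² = I, showing that A² = I does not force A = I in characteristic 2.

from itertools import product

def matmul_mod2(A, B):
    # 2x2 matrix product with entries reduced mod 2
    return [[(A[i][0]*B[0][j] + A[i][1]*B[1][j]) % 2 for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]
for a, b, c, d in product((0, 1), repeat=4):
    A = [[a, b], [c, d]]
    if matmul_mod2(A, A) == I:
        print(A)
# Prints four matrices: [[0,1],[1,0]], the identity, [[1,0],[1,1]], and [[1,1],[0,1]]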
Hey is this undergrad level or postgrad level? Because I'm in high school.
Linear algebra is mostly taught in the first year of most math-related degrees
If you have the curiosity, it doesn't matter which grade you are in... just follow your "in-drive" and everything will be all right
Symplectic Matrices are beauties. Check them out.
Please upload AB=I 🤗
both square matrices
B = A^-1
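A sketch of why B = A⁻¹ follows for square matrices over a field (the finite dimension is what makes it work):

\[
AB = I \;\Longrightarrow\; \operatorname{rank}(A) = n \;\Longrightarrow\; A^{-1} \text{ exists}
\;\Longrightarrow\; B = A^{-1}(AB) = A^{-1} \;\Longrightarrow\; BA = I.
\]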
Amazing!
A^3 when?
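Since the A³ = I case got teased here, a quick numpy example of a solution with A ≠ I (rotation by 120°, just an illustrative pick):

import numpy as np

c, s = np.cos(2*np.pi/3), np.sin(2*np.pi/3)
A = np.array([[c, -s], [s, c]])            # rotation by 120°
print(np.allclose(A @ A @ A, np.eye(2)))   # True: A³ = I, yet A ≠ I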
What is the name of a matrix with A²=I?
Involutory (idempotent would be A² = A)
@@drpeyam thx sir
Wonderful
Every matrix that is its own inverse satisfies A²=I
And conversely
@@drpeyam
I would prefer more arithmetic lectures.
The essence of numbers is the hard, deep core of maths.
Diophantus, primes, etc.
Or as C. F. Gauss said:
"If maths is the queen of science, then arithmetic is the queen of maths."
@@Handelsbilanzdefizit functional analysis is way cooler than arithmetic
Yeah, I’m not a fan of arithmetic! Analysis is way cooler
This guy has good drugs