“I am an analyst but I like linear algebra because it doesn’t have much algebra involved.” 😂
Pure gold indeed!
when I see this question, I just think that A = 0 or 1
"Find them!" "Which ones??" "ALL OF THEM!!!!"
Great video!
I’ve thought about this problem before and, although I didn’t come up with a general solution, I did come up with this:
Any matrix A such that A^2 = A is either the identity or non-invertible: if it is invertible, then A = (A^2)(A^-1) = (A)(A^-1) = I. You said ((0,1),(1,0)) works, but it doesn't, because its square is the identity
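For anyone who wants to check this numerically, a quick sketch (my own, with numpy; none of this is from the original comment):

import numpy as np

swap = np.array([[0, 1], [1, 0]])
print(swap @ swap)   # [[1, 0], [0, 1]], the identity, so swap is not idempotent

# And if A^2 = A with A invertible, multiplying both sides by A^-1
# gives A = I, matching the argument above.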
Thanks for watching!!
Hey I'm ur fan
4 Elements hi and thank you!
So cool! Being an engineering major, I never realized how beautiful Linear Algebra was until I started watching your videos. If I could do it over, I'd major in mathematics :)
Hi, if you are so interested in mathematics and you feel that it is what makes you happy, don't give up. I started majoring in mechanical engineering, and after a year and a half in engineering I decided to double major in maths because of the beauty I found in linear algebra and partial differential equations, especially Fourier transforms and series. By making a good schedule I managed to complete both majors in just an additional regular semester and a summer semester. I can still remember the happiness I felt when I made that decision and confirmed that it is doable without losing a lot of time, and how I started jumping around my room out of joy 😝 Believe me, life is too short to not follow your dreams. I still haven't graduated, but so far the schedule is working just fine. Thanks a lot for reading, and I hope I helped you make some decisions about this 😊
I started as an engineering major. Loved the maths, physics and chemistry classes. Hated almost all of the engineering classes.
Changed major to math after Diff Eq, and only had to take 2 stats classes to catch up with the major requirements.
Hello Dr. Peyam, how many hours do you give math per day?
I’d say about 6 hours per day, including Sundays
Well, sorry if I misunderstood something, but did you just claim that A = {{0,1},{1,0}} satisfies the A = A^2 property??? (Leftmost matrix on the whiteboard at the end). It clearly doesn't, since its square is the identity matrix, not itself.
It doesn’t, I misspoke
Excellent! I'm totally amazed you can make these videos practically every day while still doing your day job!!!
The semester is over
at 12:46, A = P D P^-1 looks like a sum of outer products of the first n = trace(D) columns of P, but of the form sum_n{ p q } (q being the corresponding row of P^-1), rather than sum_n{ p p^T }
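In case it helps, a small numpy sketch of that reading (my own; P and D are made-up examples): for a diagonalizable idempotent A = P D P^-1, A is the sum of (column of P) times (row of P^-1) over the eigenvalue-1 slots, outer products p q rather than p p^T, unless P is orthogonal.

import numpy as np

P = np.array([[1.0, 1.0], [0.0, 1.0]])   # any invertible P
D = np.diag([1.0, 0.0])                  # eigenvalues of an idempotent
A = P @ D @ np.linalg.inv(P)
Pinv = np.linalg.inv(P)

# Sum of outer products over indices where D has a 1 on the diagonal:
S = sum(np.outer(P[:, i], Pinv[i, :]) for i in range(2) if D[i, i] == 1.0)
print(np.allclose(A, S))                 # True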
It's nice to start watching your video and find you telling us Thanks for Watching
Does this also mean that those kind of matrices can only have 1 as an eigenvalue?
Yes, but 0 is also a possible eigenvalue
Such quality content! I'm looking forward to seeing more. I think I recall you mentioning you do work in PDEs so I wanted to ask if you could recommend any textbooks for applications of linear algebra theory / functional analysis to the finite element method / PDEs? So far I've mostly been reading S Salsa and M Holst. Thank you!
I like this because it also says that if T^2 = T and T is not the zero transformation, then it has an eigenvector with nonzero eigenvalue: if it didn't, the diagonal would be all zeros, so any similar matrix would be the zero matrix. This means there is some non-trivial v with Tv = v.
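There is also a one-line way to exhibit that eigenvector, checked numerically below (my own sketch): if T^2 = T and Tx is nonzero, then v = Tx satisfies Tv = T^2 x = Tx = v.

import numpy as np

A = np.array([[0.5, 0.5], [0.5, 0.5]])   # an idempotent that is neither 0 nor I
x = np.array([1.0, 0.0])
v = A @ x                                # v = Ax, nonzero
print(np.allclose(A @ v, v))             # True: v is a fixed vector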
Great Vid. It's so much easier to follow you now. The viewing angle is great and using a white board makes a huge difference. Thank you Dr P.
Wow Dr. Peyam! What an amazing video!!! Recently I was trying a problem that came up in class, and I am sure that the only person on all of YouTube who can solve it is you. The problem is to find all the "roots of unity" in the sense of matrices, that is, with n fixed, find all the square roots of the identity matrix of order n, all the cubic roots, etc. I think that problem could be interesting for your videos. Greetings from Colombia.
The case n = 2 will be in a future video
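In the meantime, a hedged teaser (my own sketch, not the promised video): square roots of I are the involutions, e.g. anything of the form P diag(+/-1) P^-1, and cube roots can be built from 120-degree rotation blocks.

import numpy as np

S = np.array([[0.0, 1.0], [1.0, 0.0]])            # reflection: S^2 = I
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
R = np.array([[c, -s], [s, c]])                   # rotation: R^3 = I
print(np.allclose(S @ S, np.eye(2)), np.allclose(R @ R @ R, np.eye(2)))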
Wow, that's so cool! I love linear algebra, I'm sad that my class is over! Thanks for the video Dr.P!
This was informative. Thank you, again.
Can you think of any continuous function that has the property that its infinite summation and improper integral converge to the same value?
What do you mean by infinite summation? Sum from 1 to infinity of f(n) ?
@@drpeyam yes, but more generally from m to infinity (where m is a positive integer) for both bounds. BPRP and I could only think of the 0 function and step functions as solutions.
f(n) must equal the integral from x=n to n+1 of f(x), where n is a positive integer. If this condition is met, and the sum/integral converge, it should work.
(Also, step functions do not meet the criteria, as they are not continuous)
@@thomaspeck4537 yes, my question to bprp was: is there ANY function that satisfies this property? Those were the solutions we found. After that we added the restriction that it needed to be continuous
@@thomaspeck4537 my hypothesis is that there are no non-piecewise functions that satisfy this property. You could, in theory, construct a piecewise function that satisfies the continuity constraint, but I'm not sure of any non-piecewise functions that could do this.
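For what it's worth, here is one continuous, piecewise-linear construction that meets the condition in the thread (my own sketch, not from the discussion): linearly interpolate a_n = 2^-n on each [n, n+1] and add a tent bump sized so the integral over [n, n+1] comes out to exactly f(n); the sum and the integral from 1 to infinity then both equal 1.

from scipy.integrate import quad

def f(x):
    n = int(x)                                # valid for x >= 1
    a_n, a_n1 = 2.0 ** -n, 2.0 ** -(n + 1)
    t = x - n                                 # position within [0, 1)
    linear = a_n + (a_n1 - a_n) * t           # endpoint interpolation
    tent = 1 - abs(2 * t - 1)                 # peak 1 at midpoint, area 1/2
    return linear + (a_n - a_n1) * tent       # bump restores the integral

for n in range(1, 5):
    print(n, f(n), quad(f, n, n + 1)[0])      # last two columns agree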
Hello, can you make a video proving the theorem that you can decompose V into the null space of an operator and the fixed points of that operator? Please
I proved it in the video
@@drpeyam my bad, I didn't understand it the first time I watched. Let me see if I understood it correctly: given a vector space V and a linear transformation with the property that T² = T, you can decompose V into 2 subspaces, the null space of T and the fixed-point space of T. Is that it?
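That matches the video's claim, and it can be seen concretely (a numerical sketch of mine, using a specific idempotent): any x splits as x = (x - Tx) + Tx, where the first piece is in the null space and the second is a fixed point.

import numpy as np

A = np.array([[1.0, 1.0], [0.0, 0.0]])    # A @ A == A
x = np.array([2.0, 3.0])
n, f = x - A @ x, A @ x
print(np.allclose(A @ n, 0), np.allclose(A @ f, f), np.allclose(n + f, x))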
Do we have some analogous result, like the direct sum, for nonlinear transformations? Or any other interesting properties? I’ve been DYING to find one!
I don’t think so :/
Since A is a “redundant” transformation after A is initially applied, is it true that if A^2 = A then A^N = A for any positive integer N?
Absolutely!
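The induction is one line, A^(N+1) = A^N A = A A = A, and here is a quick numerical check (my sketch):

import numpy as np
from numpy.linalg import matrix_power

A = np.array([[0.5, 0.5], [0.5, 0.5]])    # idempotent
print(all(np.allclose(matrix_power(A, n), A) for n in range(1, 8)))   # True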
F(T) is a set of eigenvectors?
Hi, Dr Peyam. Will you prove that all projection matrices are symmetric?
That’s part of the def of a projection matrix: A^2 = A and A symmetric
@@drpeyam In your video, you made the claim that V = N(T) + F(T), where F(T) is the same as the image im(T) in our case. Did you prove this claim? Does it follow from A^2 = A? If not, I don't think you can use this claim to say all A that satisfy A^2 = A have the form you mentioned.
I proved it in the video
@@drpeyam Thank you so much. Last thing I wonder: for A to be a projection, does A^2 = A alone already imply A is symmetric, or is A being symmetric required additionally? From your conclusion, it seems like the former is true, since you did not make use of the fact that A is symmetric.
It’s required additionally
Hello, why does {{0,1},{1,0}} work? I calculate that its square is the identity, not itself.
It doesn’t, I misspoke
Cool, does this have any application in other fields, maybe algebraic geometry?
I bet it does! Algebraic geometry *is* the study of geometric properties of solutions of equations, which is exactly what we’re doing here
Thought about checking YouTube before sleeping -- guess I'll be up for a few more hours ;)
*has QM flashbacks*
Hi, please upload a lecture about Sobolev spaces, with some examples
Have you ever thought about solving for the analytical solution to the non-linear pendulum problem? There exists a solution for it but it's pretty complicated
You need elliptic functions.
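Concretely (my own sketch, not from the thread): the exact period for release angle theta0 is T = 4 sqrt(L/g) K(m) with m = sin^2(theta0 / 2), where K is the complete elliptic integral of the first kind (scipy's ellipk takes the parameter m = k^2).

import numpy as np
from scipy.special import ellipk

L, g, theta0 = 1.0, 9.81, np.pi / 3       # made-up example values
T = 4 * np.sqrt(L / g) * ellipk(np.sin(theta0 / 2) ** 2)
print(T, 2 * np.pi * np.sqrt(L / g))      # exact period vs small-angle period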
A is an idempotent matrix, x is a vector.
(A^2)x=Ax
(A^2-A)x=0
A(A-I)x=0
Ax or (A-I)x is equal to the zero vector.
Therefore, Ax is in the fixed point space or null space.
Sometimes it pays to factor.
Edit: forgot zero divisors in the ring of matrices.
The step with “Ax or” is incorrect
@@drpeyam woops. You're right. I'll edit it. Thanks.
@@drpeyam I'm pretty sure it's wrong because I forgot about zero divisors but I'm not quite sure how to fix it.
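One concrete way to see the gap (my own example): with the idempotent A = diag(1, 0) and x = (1, 1), neither Ax nor (A - I)x is zero even though A(A - I)x is, which is exactly the zero-divisor issue.

import numpy as np

A = np.diag([1.0, 0.0])                   # idempotent
x = np.array([1.0, 1.0])
print(A @ x, (A - np.eye(2)) @ x)         # [1. 0.] and [0. -1.], both nonzero
print(A @ (A - np.eye(2)) @ x)            # [0. 0.]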
All of this went over my head, but I assume I'll come across it again, so I'm doing my best to follow where I can xD
umm what about the 2x2 matrix whose entries are all 1/2? It also equals its own square...
I found infinitely more solutions than that BTW, just wondering what you think about that (I solved it algebraically...)
That’s also ok, since the range is part of the nullspace here. And everything is true up to similarity, so a different choice of P gives you a different answer
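A sketch of where the infinitely many 2x2 solutions come from (my own; by Cayley-Hamilton, any 2x2 matrix with trace 1 and determinant 0 satisfies A^2 = A):

import numpy as np

A = np.full((2, 2), 0.5)                  # the all-1/2 matrix
print(np.allclose(A @ A, A))              # True

a, b = 0.25, 1.5                          # any a, and any b != 0
B = np.array([[a, b], [(a - a * a) / b, 1 - a]])   # trace 1, det 0
print(np.allclose(B @ B, B))              # True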
This video is unnecessarily lengthy. A^2 = A means that A satisfies the polynomial x(x-1). So the minimal polynomial of A is either x, or x-1, or x(x-1). In the first case, A is the zero matrix. In the second case, A is the identity matrix. In the third case, since the minimal polynomial is a product of distinct linear factors, A is similar to a diagonal matrix (using the primary decomposition theorem) with diagonal entries 0 and 1 (with possible duplication).
I agree with your approach, but that’s assuming one knows what all those terms mean :)
I solved this using the fact that the eigenvalues of A² are the squares of the eigenvalues of A, with the same corresponding eigenvectors. Let e be an eigenvalue of A; then e² = e, implying that the only eigenvalues of A are 0 and 1. Since A satisfies x(x-1), which has distinct linear factors, A is diagonalizable, so the solutions are exactly the conjugates of diagonal matrices whose diagonal entries are 0s and 1s.
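A quick numeric confirmation of the eigenvalue claim (mine):

import numpy as np

A = np.array([[0.5, 0.5], [0.5, 0.5]])    # idempotent
print(np.sort(np.linalg.eigvals(A)))      # approximately [0. 1.]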
"what I like to call F(T)... So press 'F' for F(T)" 😂😂
Great video! I really like how you explain stuff. Could you make a video about the Hahn-Banach theorem someday?
I'd like to see this tied into the projective form A(A^TA)^-1A^T and compared to the pseudoinverse with an ordinary least squares linear regression example.
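In that spirit, a short numpy sketch (my own; X is a made-up design matrix): the hat matrix H = X (X^T X)^-1 X^T from OLS is a symmetric idempotent, and it equals X @ pinv(X) when X has full column rank.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))          # full-column-rank design matrix
H = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(H @ H, H), np.allclose(H, H.T))   # idempotent and symmetric
print(np.allclose(H, X @ np.linalg.pinv(X)))        # H = X X^+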
13:44 wait a minute. the matrix [0,1;1,0] squares to the identity, not itself
Yeah
1 x 1 x 1 = 1, so L, W, D all doubled is 2 x 2 x 2 = 8. So double the dimensions and the volume goes up by 8, and it takes 8 times the horsepower to push it.
Great video btw.
A^2=A
A^2-A=0
A(A-1)=0
A=0, A=1
I think that is sort of correct, in that the determinant of the matrices is 0 or 1.
Since the ring of matrices has zero divisors, the last step is not valid.
@@EpicMathTime even if A isn't the zero matrix or the identity, wouldn't Ax or (A-I)x still be in the null space because of what was said in the video?
great video!
do you have an official e-mail?
Nope hurts brain
A^2 = A |÷A
A = 1
Check for A=0
-> 0^0 is an undefined term
-> A = 1
I know I suck 😳
.. and the answer is: T is a projection! But Dr. Peyam, what is a projection? Well...
It's a linear transformation P that satisfies P^2 = P, of course...^^
(en.wikipedia.org/wiki/Projection_(linear_algebra))
cool video though :)
Isn’t this an idempotent matrix?
A symmetric idempotent matrix is a projection operator.
Dr Peyam! Yeeeaaahh!!
I love Linear Algebra
I thought it was well known that a characterisation of any projection p is p o p = p.
He proved the other way around though
Prove Parseval's theorem, if you don't mind
Amazing!!!
Wish I could understand anything being said here but not a surprising answer.
First Like!!!!!!!!!!!!!!!!!
1^2=1
Wtf is happening
❤