"Do what you want with this information. I don't know what this is useful for, and, to be honest, I don't care, because it's just beautiful as it is." Spoken like a pure mathematician. Study math because math is beautiful!
It doesn't need to have any immediate clear uses; it just _might_ turn out to be useful for something at some point, for whatever reason. So math is a little bit like preparing a "toolbox", where things are as general and flexible as possible, just in case they turn out to be needed.
I just finished my intro to linear algebra course and I was hoping to never see anything related to linear again but this was really interesting and fun to watch! What's even better is that I actually understood the steps you were taking.
Cool exercise. It teaches us something about the domain of math and how to explore it. Just a small slip of notation there, though: x^(1/n) is not (1/n)√x it is n√x.
I've been having a bad dwelling anxiety attack and what do I find that saves me from my somber mood? This gem! Genial! The Mad Man did it!I am so happy to see these bizarre beauties on your channel!
I've only studied math until C1 for my business degree, and to be honest, it is not my favorite subject, but is awesome to see how passionate you sound in your videos, keep up the good work, your content is very interesting
I'll have to subscribe after seeing this. I haven't seen the Matrix since college so I will need to go back and review some more of Dr. Peyam's videos.
2:00 there is a fair argument you can make in favor of what you are doing. Essentially, a^b is a left-right association, but at the same time you could find a mathematical use for treating roots and powers differently, as the nth root of x is a power-base ordered phrasing, so you could actually want to use e^(n^-1 ln(x)) for roots, and e^(ln(x) n) for regular powers. In this case, it boils down to convention, as long as it's forever consistent.
I'm a math major and just finished my linear algebra sequences. And let me tell you that I've never dreamt that this could be done. It's weird lol. But beautiful
omg diagonalization is so powerful it seems the main technique in linear algebra invented by Grassman 1848 , the matrix Latin for womb by Sylvester an American Actuary 1848 , with Cayley defining the inverse in the 1860s Another beaudy by Dr Peyam always upbeat and chirpy 😜👏🏿👏🏿👏🏿
Yup, algebra is amazing. It is the most potent form of meta-mathematics that exists, studying decompositions, representations and data compression of structures. It is like a detective game, but within mathematical structures. No math would prosper without algebra ✌
Perhaps one can derive some kind of rule for similar problems? I notice that the matrix that needs to be taken a root of is simply divided by 2 and 4 at the bottom row, which possibly has something to do with the 2's in the diagonal of other matrix (the one above the root symbol). And it also happens to contain 1,2 and 3 in both matrices.
I like it when a math person assumes that we know what he talking about. Sounds like my mathematical physics teacher 40 years ago. I was the only student that liked him. Not stated is that from the power series expansion of any function, the eigenvector matrix and its inverse would be end up adjacent to each other given identity, leaving the diagonal matrix of a particular power.
I'm in 12th class currently and I don't carry much knowledge about matrices in this standard but when I saw the thumbnail of the video I just went crazy and tapped on it immediately....This is a truly wonderful clickbait
There is a little mistake of the video but it is just notation problem. 1/n root of x is equal to x^(1/n). It is actually is x^n. But the video is very entertainment I have subscribed it to your channel and liked this video. :)
The physical significance of the matrix root of another matrix is the one to one mapping of the galaxies of one universe onto its neighboring universe assuming that the mapped universe is invertible. The mapping is unique and conforms to the laws of relativity
I agree with your statement about not caring about what this is useful for. but I do think it would be worthwhile to try to obtain some intuition about what this means. what is the meaning of taking the matrix root of something. very strange but the analysis shows that it works and therefore there is probably some meaning behind it. oftentimes things like this can reveal something about the operation in question. we can view root extraction as something far more general than just an operation on vectors. i think a lot of ppl would appreciate if you'd explain a bit more about why you can just apply a function like ln or e^x to a diagonalized matrix the way you did. i know i didn't understand that bit, but my linear algebra is a bit ancient and weak 🥴
Not partical fan of these number examples since the small computational problems keeps me distracted to see the big picture. I would rather like a more generalized approach, let say a 2x2 Matrix ([a1,a2], [a3, a4]) or even nxn matrix
@@drpeyam It is completely doable to do matrix-matrix exponentials for normal nonsinguar matrices A,B such that A^B = exp(log(A) B). However, I guess the case where A^(B^-1) is just a matter of handwork. Any idea if diagonalization of B will make it doable?
No it's correct. Think of √4, it's the same as 4^(1/2) = 2. This is because 4^(1/2)*4^(1/2) = 4^(1/2+1/2) = 4^1 = 4, so it follows that (4^(1/2))^2 = 4, so it is in fact the square root of 4.
Ok, so if we consider scalars to be 1x1 matrices, then for an nxn matrix, it appears we can define the 1x1 root as well as the nxn root of it. Can this be generalized to any mxm matrix root? Or is there something special about 1 and n in producing the roots?
I believe that there is indeed something special about 1 and n in this context, since we're in an algebra (the algebra of matrices, which is essentially a vector space with an additional product operation, like in a ring), and in this algebra, we can define multiplication either by scalars (1x1 matrices if you like) and other elements of the algebra (nxn matrices). So, in that sense, I can't think of a natural way to generalize this root operation to accept other sizes of matrices
Well, one possible evolution of neural network might be a convolution, somehow, of exponentiation of matrices (i.e., connections between layers), so ,.... it might be VERY useful :D
Couldn't we compute the logarithm of a matrix A=RDR^-1 as log(A)=log(RDR^-1)=log(R)+log(D)+log(R^-1)=log(D) ? I know that this would probably hold only if the matrices commuted, but it could be nice.
@@drpeyam thanks, i ll check that. Also maybe you have some advice to the efficient way of solving the Leontiev Matrix - the linenar equasion in form X(I-A)=Y, with solution X= Y(I-A)^-1, x,y are vectors and A is very big matrix
writing at 1:39, so it's egg on my face if this gets addressed, but don't we need to be a bit more careful about saying exp(X)^Y = exp(XY) when X and Y are matrices? i thought there were some conditions about commutation that had to be satisfied nefore saying that
@@drpeyam Thanks you for the prompt reply sir. specifically I am asking if there is any published book or something. I have studied matrix functions which are extended from real valued functions but I have never seen such thing.
That's great but could eigen value be zero and can you apply such diagonalization for any square matrix? I mean, we have charateristic polynoms for eigenvales and last ones have complexity roots sometimes.
"This is math. We can do whatever we want."
As the beaten to death Thanos meme goes, "reality can be whatever I want" - and this is true in linear algebra where you can choose any basis!
Legendary quote. I’m going to put it at the top of my syllabus
@@angeldude101 Nice, I was just watching some of her videos!
challenge accepted
*let 1 = 2*
A matrix that contains a matrix as an element
Well… I mean….
"Do what you want with this information. I don't know what this is useful for, and, to be honest, I don't care, because it's just beautiful as it is."
Spoken like a pure mathematician. Study math because math is beautiful!
It doesn't need to have any immediate clear uses;
it just _might_ turn out to be useful for something at some point, for whatever reason.
So math is a little bit like preparing a "toolbox", where things are as general and flexible as possible, just in case they turn out to be needed.
This is totally batshit crazy, I love it
I just finished my intro to linear algebra course and I was hoping to never see anything related to linear again but this was really interesting and fun to watch! What's even better is that I actually understood the steps you were taking.
Exactly how I felt watching this
Cool exercise. It teaches us something about the domain of math and how to explore it. Just a small slip of notation there, though: x^(1/n) is not (1/n)√x it is n√x.
Thank you!!!
@@drpeyam Dr P is always so courteous 😜
From the category of calculus to the category of linear algebra, there is a fully faithful functor. Perhaps contravariant?
Do you pay property taxes for your forehead? That’s a lot of acres man…
Yep
I've been having a bad, dwelling anxiety attack, and what do I find that saves me from my somber mood? This gem! Genial! The Mad Man did it! I am so happy to see these bizarre beauties on your channel!
Thanks!
Omg thanks so much for the super thanks!!!
This is not madness but mathness
So he should be called Mad Maths!
I have used exponential matrices and the logarithm of matrices before. Writing some kind of matrixth root is just a nice possibility to consider.
I've only studied math up to C1 for my business degree, and to be honest, it is not my favorite subject, but it's awesome to see how passionate you sound in your videos. Keep up the good work; your content is very interesting.
WOW THAT IS CRAZY!!!
OK, thank you for blowing my brains out. Linear algebra was one of my favorite subjects in college, but this is exquisite nuts stuff.
So fitting that December is the release month of the Matrix Resurrections!
Funny because the new trailer just released a few hours ago. After all...I still know math fu...
@@citizencj3389 i know! Are you pumped to go see it?
@@devsquaredTV Yeah I just hope it is at least half as good as the first one. I still liked the other two though.
I'll have to subscribe after seeing this. I haven't seen the Matrix since college so I will need to go back and review some more of Dr. Peyam's videos.
Thank you!!!
This is insane in every definition of the word! Great job :)
"I don't know what this is useful for, and to be honest, I don't care" - every mathematician's favorite sentence
2:00 there is a fair argument you can make in favor of what you are doing. Essentially, a^b is a left-right association, but at the same time you could find a mathematical use for treating roots and powers differently, as the nth root of x is a power-base ordered phrasing, so you could actually want to use e^(n^-1 ln(x)) for roots, and e^(ln(x) n) for regular powers. In this case, it boils down to convention, as long as it's forever consistent.
I think you might want roots to still be the inverses of powers, so you need to keep the convention consistent between them.
right is always right
I'm a math major and just finished my linear algebra sequences. And let me tell you that I've never dreamt that this could be done. It's weird lol. But beautiful
I never thought the answer would be this but your explanation was so simple that I got it at almost once thank you for interesting video
omg, diagonalization is so powerful, it seems like the main technique in linear algebra. Invented by Grassmann in 1848; the term "matrix" (Latin for womb) was coined by Sylvester, an American actuary, in 1848, with Cayley defining the inverse in the 1860s.
Another beauty by Dr Peyam, always upbeat and chirpy 😜👏🏿👏🏿👏🏿
Yup, algebra is amazing.
It is the most potent form of meta-mathematics that exists, studying decompositions, representations and data compression of structures.
It is like a detective game, but within mathematical structures.
No math would prosper without algebra ✌
That is the essence of a mathematician: generalizing concepts and operations.
An intelligent observation, my algebraic friend.
"This is Math, we can do whatever we want"! Love it!
I fucking love how much this guy is enjoying himself. King.
I've taken scalar to matrix and matrix to scalar powers before, but never matrix to matrix. Very cool
Perhaps one can derive some kind of rule for similar problems? I notice that the matrix we take the root of is simply divided by 2 and 4 in the bottom row, which possibly has something to do with the 2's in the diagonal of the other matrix (the one above the root symbol). And both matrices happen to contain 1, 2, and 3.
since its all based on the diagonalised eigenmatrix maybe you can directly use that?
Seriously amazing concept
When he said "[two, minus one, minus three, second]th", I felt that.
Sounds applicable for some tensor calc in GR
Doctor Peyam I absolutely love your videos!! It's so inspiring to see such a knowledgeable man as you at work! It instantly makes me want to study :p
"I don't know what this is useful for, to be honest I don't care, because it's just beautiful as it is"
I think that's something my mother says.
The way to do this is to write X=exp(log(X)), and then use the series expansions for log and exp.
Then you will have to deal with the convergence issues though.
@@hOREP245 Of course but that just falls upon eigenvalues of the matrix.
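A minimal numerical sketch of the exp/log idea in this thread (using SciPy's `expm`/`logm` as stand-ins for the series expansions; the matrix `A` below is an illustrative choice, not one from the video):

```python
# Sketch: computing a matrix root via X = exp(log(X)), using SciPy's
# matrix exponential and logarithm in place of hand-expanded series.
import numpy as np
from scipy.linalg import expm, logm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, with positive eigenvalues 3 and 1

# exp and log are inverses on such matrices:
assert np.allclose(expm(logm(A)), A)

# A square root of A is exp((1/2) log A); it squares back to A:
sqrtA = expm(0.5 * logm(A))
print(np.allclose(sqrtA @ sqrtA, A))  # True
```

As the thread notes, convergence (and branch choice for the logarithm) depends on the eigenvalues of the matrix; this only sails through because the eigenvalues here are real and positive.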
I like it when a math person assumes that we know what he's talking about. Sounds like my mathematical physics teacher 40 years ago; I was the only student who liked him. Not stated is that, in the power series expansion of any function, the eigenvector matrix and its inverse end up adjacent to each other, giving the identity and leaving the diagonal matrix raised to a particular power.
It’s because i’ve done countless videos on this, check out my eigenvalues playlist
This is another level... thank you very much for bringing light into the cave.
mad
absolutely crazy love it
I'm in 12th grade currently, and we don't cover much about matrices at this level, but when I saw the thumbnail of the video I just went crazy and tapped on it immediately.... This is truly wonderful clickbait.
I didn't think he'd actually do it, lol!
"I'm sorry ln(DeGeneres) this is my time to shine" - 😂🤣😅🤣😂🤣😅 I can't believe how much I laughed.
🤣💀
Great job sir
There is a little mistake in the video, but it is just a notation problem: x^(1/n) equals the nth root of x, while the (1/n)th root of x would actually be x^n. But the video is very entertaining; I have subscribed to your channel and liked this video. :)
This is insane, I love it
You're incredibly entertaining to watch! Greetings from Italy ✋🍕🔥
What's next? αth derivative of a matrix function with respect to a matrix variable, where α is also a matrix?
Fractional derivative of the curve integral of homological chain complexes of Lie algebras or some other crazy shit lol
@@Wabbelpaddel something that's more likely to be taught at hogwarts, honestly
I just finished my linear algebra final and this… THIS THING! Shows up in my recommended!?
"...because right is always right"
Just a reminder that Dr Peyam is left handed.
The physical significance of the matrix root of another matrix is the one to one mapping of the galaxies of one universe onto its neighboring universe assuming that the mapped universe is invertible. The mapping is unique and conforms to the laws of relativity
Full immersion i m in love
A true mad lad, thanks for this 🤣
I agree with your statement about not caring about what this is useful for. but I do think it would be worthwhile to try to obtain some intuition about what this means. what is the meaning of taking the matrix root of something. very strange but the analysis shows that it works and therefore there is probably some meaning behind it. oftentimes things like this can reveal something about the operation in question. we can view root extraction as something far more general than just an operation on vectors.
i think a lot of ppl would appreciate if you'd explain a bit more about why you can just apply a function like ln or e^x to a diagonalized matrix the way you did. i know i didn't understand that bit, but my linear algebra is a bit ancient and weak 🥴
There’s a video on matrix exponentials that explains this, it basically applies to any function that has a power series
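A small sketch of the point made in the reply above, for readers who (like the commenter) want to see why applying f to the diagonal works: every power A^k = P D^k P^-1, the inner P^-1 P pairs cancel, so any power-series function passes through to the eigenvalues. The example matrix is illustrative:

```python
# Why f(A) = P f(D) P^{-1} for a diagonalizable A and a power-series f:
# compare exp(A) via the eigendecomposition against the defining series.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# exp(A) via eigendecomposition: exponentiate only the eigenvalues.
vals, P = np.linalg.eig(A)
expA_eig = P @ np.diag(np.exp(vals)) @ np.linalg.inv(P)

# exp(A) via the power series sum_k A^k / k!, truncated past convergence.
expA_series = np.zeros_like(A)
term = np.eye(2)
for k in range(30):
    expA_series += term
    term = term @ A / (k + 1)

print(np.allclose(expA_eig, expA_series))  # True
```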
It's a cool exercise on matrix to the power of matrix. It must have an interesting app some day.
This was recommended to me. Im proud of myself
excellent thanx a lot!!
Linear algebra final on Wednesday, this is perfect
What would be the general form of A^B, where A is the matrix
a b
c d
And B is the matrix
w x
y z
?
Left as an exercise to the reader :)
You're a literal god Dr. Peyam
Thanks so much!!!
Bravo, Maestro! Bravissimo! I never even thought of this , let alone how to do it! Live and learn, the Weird!
This is absolutely CRAZY but wonderful!!!
Why didn’t I ever think of this in 6 decades?
I want more insanity!!!
Thanks so much!!!
VERY GREAT EXERCISE SIR
YOU ARE REAL MATHS MASTER SIR
THANK YOU SIR
Right is always right?
Matrixth is my new favorite word
The root of a matrix. Now that's a good one!
At 0:28, did you mean $\sqrt[n]{x} = x^{1/n}$ rather than $\sqrt[1/n]{x} = x^{1/n}$?
Not a particular fan of these numerical examples, since the small computational problems keep me too distracted to see the big picture. I would prefer a more generalized approach, say a 2x2 matrix ([a1, a2], [a3, a4]) or even an nxn matrix.
LOL, well good luck with that
@@drpeyam It is completely doable to compute matrix-matrix exponentials for normal, nonsingular matrices A, B, such that A^B = exp(log(A) B). However, I guess the case of A^(B^-1) is just a matter of handwork. Any idea if diagonalizing B would make it doable?
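The A^B := exp(log(A) B) definition proposed in this thread can be sketched numerically; the helper name and the matrix below are illustrative choices, and this uses only one of the two possible ordering conventions:

```python
# Numerical sketch of A^B := exp(log(A) B) for a matrix A with
# positive eigenvalues (the right-multiplication convention).
import numpy as np
from scipy.linalg import expm, logm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def matrix_power_matrix(A, B):
    """A^B under the convention exp(log(A) B)."""
    return expm(logm(A) @ B)

# Sanity checks: B = I recovers A, and B = 2I gives A^2
# (log(A) commutes with scalar multiples of I, so no ordering subtlety).
print(np.allclose(matrix_power_matrix(A, np.eye(2)), A))          # True
print(np.allclose(matrix_power_matrix(A, 2 * np.eye(2)), A @ A))  # True
```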
isn't the equation in 0:35
wrong?
i think it's x^n
No it's correct. Think of √4, it's the same as 4^(1/2) = 2.
This is because 4^(1/2)*4^(1/2) = 4^(1/2+1/2) = 4^1 = 4, so it follows that (4^(1/2))^2 = 4, so it is in fact the square root of 4.
Yeah he accidentally wrote 1/n on the left side
@@bomboid
Thats it
@@ubs7239 oh yeah you're right, I'm sorry, it is the n-th root yeah (or just x^n as you said).
@@bomboid that would make it a very interesting problem!!
This is sooo crazy!!!
Ok, so if we consider scalars to be 1x1 matrices, then for an nxn matrix, it appears we can define the 1x1 root as well as the nxn root of it. Can this be generalized to any mxm matrix root? Or is there something special about 1 and n in producing the roots?
I believe that there is indeed something special about 1 and n in this context, since we're in an algebra (the algebra of matrices, which is essentially a vector space with an additional product operation, like in a ring), and in this algebra, we can define multiplication either by scalars (1x1 matrices if you like) and other elements of the algebra (nxn matrices). So, in that sense, I can't think of a natural way to generalize this root operation to accept other sizes of matrices
That was.................interesting
Ha ha ha ha. This was so giddy fun. Stuff we do with maths
I love this. Thank you very, very much.
Very interesting! What is the practical applicability?
That was really a funny example!
This is so cool. I used to philosophize about this kind of shit in high school and college. Cool to see that it is possible to do a problem like this.
Makes me wonder... can the gamma function be extended to matrices in order to get a smooth matrix factorial? 🤯
This looks like something you would watch while procrastinating at 3am.
Very interesting 👍🏼
He is crazy but in a good way!
So I guess A^B (for matrices A,B) can not be defined uniquely?
Left power and right power :)
@@drpeyam Yeah, unfortunate :-|
This is why I propose the notation A^B = exp(ln(A) B) and A ↑ B = exp(B ln(A))
@@aneeshsrinivas9088 The up arrow already has a meaning.
Up arrow means repeated exponentiation.
x ↑ 3 = x^x^x
@@poutineausyropderable7108 thats a double up arrow, not a single up arrow, the single up arrow is the same thing as exponentiation
This reminds me of Kalman filters ... if there is any interest, perhaps see if this might apply somehow to moving-target tracking. Cheers.
Sweet.
Very surprising. Thank you.
You're welcome!!
Matrices as exponents are in fact useful in machine learning.
Well, one possible evolution of neural networks might be a convolution, somehow, of exponentiations of matrices (i.e., of the connections between layers), so... it might be VERY useful :D
Me not knowing ANYTHING about a mathematical matrix and still watching:
_Interesting_
Hurting our heads so early in the Holiday season.
I think I've seen this type of linear algebra used in Kalman filtering, but I'm not an expert on it. Neat vid though
Ooooh interesting!!
Couldn't we compute the logarithm of a matrix A=RDR^-1 as log(A)=log(RDR^-1)=log(R)+log(D)+log(R^-1)=log(D) ? I know that this would probably hold only if the matrices commuted, but it could be nice.
Sadly logs don’t operate this way for matrices, in fact we don’t even have identities like exp(A+B) = exp(A) exp(B) for matrices
@@drpeyam Sadface
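A concrete counterexample for this thread, showing that exp(A+B) = exp(A)exp(B) fails as soon as A and B do not commute (the two nilpotent matrices below are a standard illustrative choice):

```python
# exp(A+B) != exp(A) exp(B) for non-commuting matrices.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# AB != BA, and correspondingly the exponential identity breaks:
assert not np.allclose(A @ B, B @ A)
print(np.allclose(expm(A + B), expm(A) @ expm(B)))  # False

# With a matrix that commutes with itself, the identity of course holds:
print(np.allclose(expm(A + A), expm(A) @ expm(A)))  # True
```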
I don’t understand such high level of math…but I fcking loved this. Instant sub
Thank youuuu
The best way to make a school exam is to suddenly give this task on the exam and see whether students have a real clue about math or not :D
Trueeee
@@drpeyam wow, look, can you please make a video about a fast way to compute the geometric sum of a matrix, like I + M + M^2 + ... = (I - M)^-1
There’s a video on the geometric series
@@drpeyam thanks, I'll check that. Also, maybe you have some advice on an efficient way of solving the Leontief matrix equation:
the linear equation of the form X(I-A) = Y, with solution X = Y(I-A)^-1, where X, Y are vectors and A is a very big matrix.
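The matrix geometric (Neumann) series mentioned in this thread can be checked numerically: I + M + M^2 + ... = (I - M)^-1 whenever the spectral radius of M is below 1, which is exactly how the Leontief solution X = Y(I - A)^-1 expands into rounds of indirect demand. The matrix below is an illustrative choice:

```python
# Matrix geometric (Neumann) series: I + M + M^2 + ... = (I - M)^{-1}
# when the spectral radius of M is below 1.
import numpy as np

M = np.array([[0.1, 0.2],
              [0.3, 0.1]])
assert max(abs(np.linalg.eigvals(M))) < 1  # series converges

partial = np.zeros_like(M)
term = np.eye(2)
for _ in range(100):       # far more terms than needed at this radius
    partial += term
    term = term @ M

print(np.allclose(partial, np.linalg.inv(np.eye(2) - M)))  # True
```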
Can we somehow decompose a 3×3 matrix into several 2×2 matrix such that the operation is unique and an inverse decompose yields the same 3×3 matrix?
writing at 1:39, so it's egg on my face if this gets addressed, but don't we need to be a bit more careful about saying
exp(X)^Y = exp(XY)
when X and Y are matrices? i thought there were some conditions about commutation that had to be satisfied before saying that
That’s a good point
" this is math , we can do whatever we do " this statement is mathematically false 😁❤ .... salute to you ❤
It remains to understand how to get the matrix derivative of a matrix.
Hi Dr. P. Where can I read more about this? Could you pls help me
Check out the playlist
@@drpeyam Thank you for the prompt reply, sir. Specifically, I am asking if there is any published book or something. I have studied matrix functions extended from real-valued functions, but I have never seen such a thing.
what is an eigenvalue?
How about tensor root of a tensor
Omg
I didn't think there was a way to do a matrix root as well.
I'm sorry, we're revoking your math license.
Great stuff. But how does one cause a matrixth root or a matrixth power of a matrix?
That's great, but could an eigenvalue be zero, and can you apply such a diagonalization to any square matrix? I mean, eigenvalues come from characteristic polynomials, and those sometimes have complex roots.
It’s fine, ln(0+) = - infinity and if you exponentiate that you get 0. And ln(-1) is complex so also ok
I'd say the best way to find applications for this kind of math is to model it in a simulation.