i suggest you make an OF acc where you post OnlyMath
Yessss!
As a leaf, I approve of this opinion.
You could define a "left matrix power" A^B = e^(B·log A) versus a "right matrix power" A^B = e^((log A)·B), and then A and B are a special pair when those two things are the same.
Another approach to defining A^B is the set-theoretic definition: it is the set of functions from B to A. So 2^3 is the set of functions from 3 = {0,1,2} to 2 = {0,1}, and there are 8 such functions. I don't see how to apply this to matrices offhand, but it may be worth some thought. On a separate note, the expression A log B with (positive definite) matrices A and B shows up in a quantity called the quantum relative entropy. If you seek the trace instead of the final matrix, i.e. seek to define tr(A^B), this may have a unique definition even when A and B don't commute, since tr(AB) = tr(BA) for all A, B.
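The trace claim above is easy to check numerically. Here is a minimal sketch (the matrices and the `funm` helper are my own, chosen to be diagonalizable with positive eigenvalues, not from the video): the left and right powers differ as matrices, but their traces agree, since XY and YX are similar whenever X is invertible (e^(XY) = X e^(YX) X^(-1)).

```python
import numpy as np

def funm(M, f):
    """Apply f to a diagonalizable matrix via its eigendecomposition."""
    vals, P = np.linalg.eig(M)
    return P @ np.diag(f(vals)) @ np.linalg.inv(P)

A = np.array([[2.0, 1.0], [0.0, 3.0]])  # positive eigenvalues, so log A is defined
B = np.array([[1.0, 0.5], [0.0, 2.0]])  # does not commute with A

logA = funm(A, np.log)
left  = funm(B @ logA, np.exp)   # "left power"  e^(B log A)
right = funm(logA @ B, np.exp)   # "right power" e^((log A) B)

print(np.allclose(left, right))                     # the matrices differ
print(np.isclose(np.trace(left), np.trace(right)))  # but the traces match
```

So tr(A^B) is at least well defined with respect to the left/right ambiguity, even for this non-commuting pair.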
I think you goofed at 11:24: you forgot the 1/11 when you applied the exp. The diagonal matrix's only nonzero value should have been (1/11)·11·log 2, which is log 2, so when you exponentiate you just get 2, not (1/11)·exp(2^11). But still a mind-blowing video, as always.
Very nice analysis. Thank you!
I read a matrix across rows, because that is how I will multiply it by the second matrix; a simple, consistent reminder of how to multiply.
And then you encounter an Arab reader, who reads right to left and inverts your logic. And then you meet the classical Chinese or Japanese reader who, obviously, decides to read each column of a matrix top to bottom, but takes the columns right to left.
Why? Because hail mary.
3:45 Solving hard calculus and linear algebra questions, yet struggling with a quadratic equation; professional deformation 😅
10:05 You just summoned the war between Fortran (column-major) and C (row-major) developers. Behold: the madness of using BLAS from C, where we have to make use of the fact that C^T = (AB)^T = B^T A^T in order to get anything done.
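A small sketch of that transpose trick (toy matrices, plain NumPy standing in for BLAS): reinterpreting a row-major buffer in column-major order gives the transpose, so handing the operands to a column-major GEMM in swapped order leaves C = AB laid out row-major, with no copies.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # row-major (C order)
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# The same bytes reread in column-major (Fortran) order are the transpose.
A_fortran_view = A.ravel(order="C").reshape(A.shape, order="F")
assert np.array_equal(A_fortran_view, A.T)

# A column-major routine fed the swapped operands computes C^T = B^T A^T...
Ct = B.T @ A.T

# ...whose column-major layout is exactly the row-major layout of C = A B.
assert np.array_equal(Ct.T, A @ B)
```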
😂😂😂
Caught me off guard with that position joke, take my sub 😂
I'm surprised you have a sub
12:32 F
I read a matrix in column form.
Also (I don't want you to take this as an offense), sometimes I don't really get what you're doing exactly. You're like a math magician, and I wonder whether I should ask why or how you did that trick, but then I think maybe it's obvious and I don't want to look like a math noob.
Keep up the good work.
Also "F"
Why can you distribute your log function to the diagonal components at 7:10? I thought you had to work with the Taylor expansion.
I think it's because if you have a matrix of the form PDP⁻¹ and a function f(X) with a Taylor series, then when you expand f(PDP⁻¹) as a series, all the Ps and P⁻¹s cancel each other out, and you can factor it as f(PDP⁻¹) = P f(D) P⁻¹.
F for my boi :(, he just casually cooks himself at 12:18, I feel you.
Anyway, I wondered why you can't just cancel the elevens (at 11:05 and likewise at 13:15) by reversing the log rule again. I don't like the high numbers it creates at the end, and there must be a definite answer, since it's math ;)
exp(A·log(B)) = exp(log(B)·A) when A·B = B·A (if A commutes with B, it also commutes with log B, so the two exponents coincide).
Tertiary matrices?
It could be that ln A = [ ]·ln[I + B]... and then use the series expansion... luckily the calculations become simple: ln A = ln 2 · [6 −15; −2 5]... so A = 2^[6 −15; −2 5].
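For the matrix in that comment one can verify the claim directly: M = [6 −15; −2 5] satisfies M² = 11M, so M^k = 11^(k−1)·M for k ≥ 1 and the exponential series collapses to a closed form, 2^M = exp(M log 2) = I + ((2^11 − 1)/11)·M. A sketch in NumPy:

```python
import numpy as np

M = np.array([[6.0, -15.0], [-2.0, 5.0]])

# M^2 = 11 M, so M^k = 11^(k-1) M for k >= 1 and the series
# exp(t M) = I + sum_k (t M)^k / k! telescopes to I + (e^(11 t) - 1)/11 * M.
assert np.array_equal(M @ M, 11 * M)

# Closed form for t = log 2, i.e. 2^M:
closed_form = np.eye(2) + (2**11 - 1) / 11 * M

# Cross-check against 2^M computed by diagonalization (eigenvalues 0 and 11).
vals, P = np.linalg.eig(M)
A = P @ np.diag(2.0 ** vals) @ np.linalg.inv(P)

print(np.allclose(A, closed_form))
```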
3:24
6:14
No need to feel bad; I also use plenty of curse words when attempting something I haven't done since the dark ages. 🤬😅😆
On a serious note, it's math. I'm not even 30 yet, and I already have a handsome amount of gray hair. While I may be naturally inclined to gray quickly, doing math has certainly accelerated it.
I'm 26, and the funny thing is I had lots of grey hair back in my high school years.
Did you make this problem up on your own? 😅
Yes, how else do you think the solution was smooth like butter 😂
Totally a new definition.
I got an ad right as he was talking about removing the filter from his YouTube videos.
10:03 gotta read it left to right, top to bottom, or you're crazy 😂
Next video is a matrix inside a matrix
Where would this be used? Also I think you need a vacation; things are getting just too weird.
Just for the fun of math my friend 😂
Natural log is ln, not log 😢😢😢
Nope
Ln is something that should be discarded straight outta high school... it's written log in complex analysis, and that takes the win as the right way to write the logarithm.
F
F
psychopathy version 🤣🤣🤣
Noice ❤
12:43 F
F