Watching this in 2019. Revising for Machine Learning. That guy is brilliant
same here
Yep, same
same here
same here!
Brilliant teaching, a gift to the world. Thank you prof. Strang and OCW
Thank you kindly
We won't get a teacher like him possibly in our life time. 100 years fast forward people would be amazed to listen to his lectures and would marvel at the quality of education 100 years ago.
I have an exam in like.. 5 hours and am exhausted, yet, as soon as I heard the joke about Clinton and Monica, I spat out my morning coffee and laughed so hard my entire dorm woke up. Thank you so much for not only teaching me everything I need to know, but also making it fun and making it stick!
You know, at first you start a new course, and it's kinda cool and chill, and you like it. And then, you go like *WAIT I BLINKED WWW..... WTF IS THIS THING???*
*LINEAR ALGEBRA INTENSIFIES*
My thoughts exactly
Lol... once he started about 3x3 matrix spaces....
i was lost for a while oops
Thats when its nice to be able to rewind :)
Exam prep. lecture:
At 1:00, he covers subspaces of 3x3 matrices; call it M.
At 3:00, basis for M. At 15:00, basis for eigenspaces.
At 24:00, he covers rank 4 matrices. At 29:00, feasible exam problem.
At 39:00, graphs.
at 42:50, he drops a sick Monica Lewinsky joke
Oh, and also at 40:45, he invents LinkedIn
He's the Mr. Rogers of linear algebra.
I can't explain in words how much your lectures have changed the way I study. I hope to see you one day and thank you personally.
Excellent teacher. The little touches like reinforcing important concepts as he's moving on to new ones that make use of them is very helpful.
Watched this lecture over and over, and every word starts making sense. Great for background knowledge of ML. Love this man so much!
Watched these lectures in late 2019 before graduating. Now I am here 3 years later to freshen up my knowledge and to put all small building blocks in order. Precious material all of that.
I love this man and the way he teaches and connects the subject with reality... Last 5 min... 💚
I've been in love with Math my whole life. This course is so satisfying to me.
Love the Gilbert Strang ability to make linear algebra very illustrative and theorems so simple. One more time, thank you! )
Do not get too excited, he is not wearing a different shirt, he is wearing a coat, you can see the over-loved shirt under it.
The joke at the end was likely very effective in making students interested and curious about the topic. What a genius!
Worth watching. My distance to Clinton is now three :D
Lol
Then by watching a video of Bill Clinton your distance immediately drops to one
@@The5thBeatle2010 u must be fun at the parties
@@The5thBeatle2010 Very good counter-example!
gilbert strang: my distance to Clinton is 2.
random physics-student: well, that doesn't mean anything to me.
I recommend watching the final video in the 3B1B “essence of linear algebra” series before you watch this one
oh I can't stop this class, I watch it all night.
30:00 R4 is being mapped to R1 (number of rows) so there are 3 (independent columns) dimensions collapsing to 0. They give us the 3 independent variables.
This is a masterpiece by DR. Gilbert Strang of MIT. In control engineering, which is part of electrical engineering linear algebra is a must. All students in this field of study, graduate and undergraduate must know linear algebra.
Thanks professor Strang, you are such a lovely guy and a gift to the world, take care
Regarding the mono audio issues: if you are on Windows 10 you can search "Ease of Access" -> "Audio" and turn on "Mono audio".
It will work properly that way. (Remember to turn it off afterwards.)
ty
My right ear enjoyed this
It's meant to be listened to "right".
insert left nullspace joke here
At 31:09, the dimension of that subspace is 3, because there are 3 independent vectors: [-1 -1 1 1]', [-1 1 -1 1]', [-1 1 1 -1]'
23:26 pivot column multiplied by one row
I did not have 1 degree of separation from that woman.
Oh no he didn't alright. He had 0 degrees of separation.
hahah nice
calm down jamal
This spans all my Spaces. Thanks to @lexfridman for pointing me in this direction again 30+ years after first taking and passing the course but not understanding anything.
I never went to this school, and yet I get way better lectures thanks to MIT and the professors there.
Thank you so much MIT and Professor Strang!
As a mathematician I'm wondering why it's not mentioned at all, that the "nullspace" is also called "kernel", ker(A) = nullspace(A), where A is a matrix. The column space of a matrix A on the other hand is also called "Image" of A.
Good lecture style. Thanks.
Probably because at this point the emphasis isn't on viewing matrices as linear transformations, but as objects of special interest in their own right.
I like Professor Strang's sweater. Reminds me of Mr. Rogers' Neighborhood.
People strolling in late to one of the greatest linear algebra lectures ever given.
That differential equation analogy was impeccable
This guy really is a good teacher.
Reply to a question below on why we care about the 4 fundamental subspaces in the study of Ax=b:
In Professor Strang's words (not in this lecture): "understanding the four fundamental subspaces elevates the understanding of Ax=b to a vector space point of view, specifically, to a vector subspace level of understanding."
It will pay off when learning different types of matrix factorizations, for example the Singular Value Decomposition (SVD), where A = UΣV^T: the first r columns of U are orthonormal vectors spanning the column space, the first r columns of V are orthonormal vectors spanning the row space, and Σ is the diagonal matrix of the singular values.
Similarly, all other matrix factorizations (QR, QΛQ^T, etc.) benefit from their description in terms of subspaces.
This understanding/insight is super powerful, even critical, in optimization, machine learning, ill-posed problems, etc.
Please don't disregard this vector subspace approach; it only seems not to shine when you learn it for the first time, but believe me, it's pure gold, and the clarity with which Professor Strang explains it is invaluable.
...he said it himself: it bends your mind when extended to sets of matrices and to sets of functions (the differential equation example, a prelude to Fourier analysis), but the dividends you gain are huge.
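The subspace picture described above is easy to poke at numerically. A minimal NumPy sketch (my own check, not from the lecture): the SVD of a rank-2 matrix exposes bases for the column space, row space, and nullspace.

```python
import numpy as np

# A rank-2 matrix: its SVD A = U @ diag(s) @ Vt exposes the four subspaces.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])   # row 3 = row 1 + row 2, so rank is 2

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-8))         # numerical rank
print("rank:", r)                 # 2

# First r columns of U span the column space C(A);
# first r rows of Vt (i.e. columns of V) span the row space C(A^T).
col_basis = U[:, :r]
row_basis = Vt[:r, :]

# Remaining rows of Vt span the nullspace: A times them is (numerically) zero.
null_basis = Vt[r:, :].T
print(np.allclose(A @ null_basis, 0, atol=1e-8))   # True
```

Here the matrix `A` is just a made-up example with an obvious row dependency, not one from the lecture.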
And watch the next video lecture
Watching this in Denmark for exam preb. Could just have seen this course first, an amazing job. Thank you very much!
Thanks professor strang. Your lectures are amazing.
Prof Strang can be quite the comedian - that comment about Lewinsky was hilarious.
great professor ... amazing teaching method and he clearly has developed an astonishing intuition over the concepts, Bit Thumbs up from me!
This lecture is the hardest one in 18.06.
What a relief that is to hear!
Yo Mit thank Gilbert on my behalf hes the best teacher i have gotten to learn from thus far 😊
watching this in 2016. What's hillary's distance to Monica. LOL
LOL. It's funny.( ̄▽ ̄)"
Who is Monica ?
en.wikipedia.org/wiki/Monica_Lewinsky
ty
monica is like a mexican who does the job that americans feel to good for
My Monica distance is too small 😂😭 Professor Strang comin’ at us with all that humor towards the end was fun 🧡🙌🏽✨
My right ear learned a lot from this lecture.
Thank you for making rank 1 matrices so clear!
At 14:20 the dimension formula is more intuitive if stated as dim(S) + dim(U) - dim(S∩U) = dim(S+U), i.e. "the dimension of S added to the dimension of U, minus the dimension of the intersection of S and U, is the dimension of S+U" ... so much easier to visualize stated that way, n'est-ce pas?
Shreyas e^(-ix) = (c1) cos(x) + (c2) sin(x) when c1 = 1 and c2 = -i
thanks, can you also describe how this second order differential equation can be written out as Ay=0 while solving for y?
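The coefficients in the reply above (c1 = 1, c2 = -i, so that e^(-ix) = cos x - i sin x) are easy to sanity-check numerically; a quick sketch of my own, not from the lecture:

```python
import cmath
import math

# Euler: e^(-ix) = cos(x) + (-i)*sin(x), i.e. c1 = 1 and c2 = -i.
for x in (0.0, 0.7, 1.5, math.pi / 3):
    lhs = cmath.exp(-1j * x)
    rhs = 1 * math.cos(x) + (-1j) * math.sin(x)
    print(abs(lhs - rhs) < 1e-12)   # True at every sample point
```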
Thank goodness for browser extensions. Setting this video to Mono makes it much easier to listen to. :D
Is it me or this is more addictive than GOT
It’s definetely you
was the great seeing the humor come alive in this one.
When Professor Strang says that A = uv^T, I suspect that u is the pivot column and v^T is the pivot row. Am I right? But nonetheless, the lecture is fantastic as usual and these truly are a gift. Thanks a lot Professor Strang and MIT.
i suppose you're right
Yes, for a rank 1 matrix it's the pivot column and pivot row, but I think when you write a rank 4 matrix as a sum of rank 1 matrices, those rank 1 matrices do not come from the pivot rows and pivot columns of the rank 4 matrix.
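Both points in this exchange can be checked numerically. A small NumPy sketch (my own, not from the lecture): an outer product u v^T always has rank 1, and the rank-1 pieces of a higher-rank matrix can be taken from its SVD rather than its pivots.

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # column vector (3x1)
v = np.array([[4.0, 5.0, 6.0]])       # row vector (1x3)

A = u @ v                             # outer product: every row is a multiple of v
print(np.linalg.matrix_rank(A))       # 1

# A rank-2 matrix as a sum of two rank-1 pieces, taken from its SVD
# (these pieces need not be pivot rows/columns of B).
B = np.array([[1.0, 0.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(B)
B1 = s[0] * np.outer(U[:, 0], Vt[0, :])
B2 = s[1] * np.outer(U[:, 1], Vt[1, :])
print(np.allclose(B, B1 + B2))        # True
```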
The best joke of all occurs at 19:00.
“Five minutes of 18.06 is enough to take care of 18.03.”
And it’s not a joke, provided you have a thorough understanding of 18.06.
Could somebody please explain the fast way they used to find the dimension of the subspace S (all v's in R^4 with v1+v2+v3+v4=0) at 30:20?
I believe it is because you can write -v1 = v2 + v3 + v4, therefore they are not linearly independent.
This describes the null space of a matrix with v1 as the pivot variable and the other three as free variables.
v1 can always be described in terms of the others, so it's dependent on those three. However, the other three are independent and thus span a space. Hence, the number of dimensions of the null space is three.
The subspace S is actually the nullspace of A=[1 1 1 1]. Therefore the dimension of S is the dimension of N(A). dim N(A) = n - r, and the rank of A is obviously 1 since it has only one row, therefore dim N(A) = 3.
You can also solve the equation Av=0. Since A is already in its rref form [I F], the special solutions to Av=0 are column vectors of [-F I]^T, which is [-1 1 0 0; -1 0 1 0; -1 0 0 1]^T.
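The counting in the reply above can be verified with a few lines of NumPy (my own check, not from the lecture): the rank of A = [1 1 1 1] is 1, so dim N(A) = 4 - 1 = 3, and the three special solutions really do solve Av = 0.

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 1.0]])   # 1x4, rank 1
n = A.shape[1]
r = np.linalg.matrix_rank(A)
print("dim N(A) =", n - r)             # 3

# The three special solutions [-F; I] described above, as columns:
special = np.array([[-1, 1, 0, 0],
                    [-1, 0, 1, 0],
                    [-1, 0, 0, 1]]).T
print(np.allclose(A @ special, 0))      # True: each column solves Av = 0
print(np.linalg.matrix_rank(special))   # 3: the columns are independent
```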
Think about it this way: Once you have values for 3 variables, the fourth is not "free". In a way, the fourth variable is subject to what the other variables' values are. So, there are only 3 independent variables. It doesn't matter if they equal 0, 1, e, pi, or 42. In a way, the equation itself gives you some information and makes one of the variables redundant.
In some contexts, you'll have this concept of "degrees of freedom". This is what it means.
I have the same problem understanding this part, and thanks for the question you ask and the discussion here!
Why does this have only 300k views (30k views per year) from all over the world? There are millions of engineering, math, and science students who need linear algebra each year.
Wish we all had a professor like him
If the rank is only one, it can't get away from us.
He's so sweet, funny, and a great teacher
For nearly three quarters of this lecture I was totally lost about every single word of it.
@glyn hodges hey man, could you please tell me how the dimension is 6 for symmetric and upper triangular. Not getting it
@glyn hodges yep I see that now. Thanks a lot mate. It's quite interesting but quite complex at the same time
At 19:03, 5 minutes of 18.06 is enough to take care of 18.03. That's mathematics prof trash talk for ya.
At 31:44
in order to find the dimension of subspace S, Prof. Strang takes A as [1 1 1 1], of which S is the null space.
My question is: couldn't we have taken A as the 4x4 identity matrix, for which S would also have acted as the null space?
In that case rank(A) is 4, hence there are no free variables, and thus its null space (which would be S) has dimension 4 - 4 = 0.
So why are we restricted to taking A as [1 1 1 1], in which case the dimension of S comes out to be 3 since rank(A) = 1?
I guess not:
the identity matrix times the column vector (v1,v2,v3,v4) gives back the same column vector (v1,v2,v3,v4), not the zero vector, hence taking the identity matrix as A doesn't work.
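The reply above is right, and a tiny NumPy check (my own, not from the lecture) makes it concrete: a vector in S is not sent to zero by the identity, and dim N(I) = 4 - rank(I) = 0, so N(I) = {0} ≠ S.

```python
import numpy as np

I = np.eye(4)
v = np.array([-4.0, -6.0, 7.0, 3.0])   # components sum to zero, so v is in S

print(I @ v)                   # gives back v itself, not the zero vector
print(np.allclose(I @ v, 0))   # False: v is NOT in N(I)

# dim N(I) = n - rank(I) = 4 - 4 = 0, so the nullspace of I is only {0}.
print(4 - np.linalg.matrix_rank(I))    # 0
```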
You've got a new shirt, professor :D.
The distance between Monica and Clinton is actually two nodes connected by zero edges. And the linear equations between Hillary and Monica are beyond the realm of mathematics.
This guy is the Mr. Rogers of linear algebra.
Facebook was founded in 2004, years after this was recorded... I wonder if he knew this would become the biggest application of graphs.
Completely confused by 10:57. Can somebody explain what's the difference between union and his sum "+"?
Union means a matrix that's in either S or U, whereas their sum can give a matrix that's in neither of them. You can check it with a simple example.
@@vijaybm9305 Got it, it's actually _really_ summing up all S and U combinations. Thanks
Listen with your headphones at the beginning, it would give you a classroom experience.
After 15 year
In 2020
You are amazing, boss......
20 years. The lectures were recorded in 2000
What a great professor
Don't know if it's an error (maybe I'm wrong), but at 14:00 Prof. Strang says that dim(S ∩ U) + dim(S + U) = 3 + 6 = 9, yet prior to that he said dim(S + U) = 9, so I got confused, because now we have 3 + 9 = 3 + 6.
Watching Andrew Ng's Machine Learning course, came here to understand what a low rank matrix factorization is.
this guy is a superman in mathematics
At 17:35, shouldn't just one of cos(x) or sin(x) be the basis, since either of sin or cos can be generated from the other (like cos(x) = sin(π/2 - x))?
If we have only the two functions cos(x) and sin(x), you should be thinking of one being a scalar multiple of the other. If substitutions in x were allowed too, then you would be right, but we go with the general idea of one being some scalar multiple of the other without touching x,
which is not true here.
Monica joke is just awesome!!
For the subspace in R^4 with all the components of a vector adding up to zero, how is that a subspace? What if I had the vector with components [-4,-6,7,3]? Those add up to zero, but if I were to add it to another member, wouldn't this fail to be a subspace?
If you have two vectors such that the components of each vector add up to zero, then the components of the sum also add up to zero.
For example, take [-4,-6,7,3] and [1,0,-1,0]. In both cases, components add up to zero.
The sum of the two vectors is [-3,-6,6,3], the components of which also add up to zero.
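The closure argument above is a one-line check in code. A small sketch of my own (not from the lecture): the membership test for S is "components sum to zero", and it survives both addition and scaling.

```python
import numpy as np

def in_S(v, tol=1e-12):
    """Membership test for S = {v in R^4 : components sum to zero}."""
    return abs(np.sum(v)) < tol

a = np.array([-4.0, -6.0, 7.0, 3.0])
b = np.array([1.0, 0.0, -1.0, 0.0])

print(in_S(a), in_S(b))    # True True: both are in S
print(in_S(a + b))         # True: S is closed under addition
print(in_S(3.0 * a))       # True: S is closed under scalar multiplication
```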
Pratham Bhat Thank you!
I have a doubt at 12:47: if I have a matrix in M which is [0,0,0; 0,0,0; 1,0,0], which is neither symmetric nor upper triangular, then how come on doing S+U I get all 3x3 matrices?
It's the sum: if you wanted to get the matrix you describe, you could take (using a simple basis) S=[0,0,1; 0,0,0; 1,0,0] and U=[0,0,1; 0,0,0; 0,0,0] and get M = S + (-1)U.
Recall that the scalars can be negative
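The decomposition in the reply above can be verified directly; a quick NumPy sketch of my own (not from the lecture):

```python
import numpy as np

M = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])    # neither symmetric nor upper triangular

S = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])    # symmetric: S == S.T
U = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])    # upper triangular

print(np.allclose(S, S.T))             # True
print(np.allclose(np.tril(U, -1), 0))  # True: nothing below the diagonal
print(np.allclose(M, S - U))           # True: M = S + (-1)*U is in S + U
```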
My distance to clinton updated to 3 from infinity after this lecture
I really like this video on Matrix Spaces; Rank 1; Small World Graphs.
How can the dimension of the new 3x3 matrix space be 9? Can anyone explain this to me?
The space consists of combinations of 3x3 matrices. There are 9 basis matrices (one for each entry), and combinations of those 9 basis matrices fill up the space, so it has dimension 9.
@@kevinnejad1072 Thank you very much , now I understand .
I kind of got lost on this video
Same
Well then watch it again, until you "get it". That's the only way.
Why 6 dimensions for symmetric? We have 3 for the main diagonal and that's all. Isn't it?
Same question..
@@souviksarkar4585 I really don't understand the dimension for symmetric and upper triangular
What does the matrix space actually mean? And how should I understand the space of all 3x3 matrices, R^{3x3}?
The jokes near the end: Hilarious.
Why? Why talk about matrix space?
By 'all 3 by 3 matrices', does he mean matrices with 3 rows and 3 columns?
lol maybe that's the reason MIT is so prestigious, they have awesome teachers!
That can't be right. So many of those replying in this thread seem so much more clever than Professor Strang. Why isn't MIT hiring them as professors?
God I love the Strang.
Can anyone please explain how the dimension of the symmetric matrices is 6 (at 6:43)? I understand we have 3 diagonal basis matrices, but I don't see how we get the other 3... [100;000;000], [000;010;000], [000;000;001] is all I can think of now... how can the other basis matrices be symmetric?
a[100;000;000]+b[000;010;000]+c[000;000;001]+d[010;100;000]+e[001;000;100]+f[000;001;010]
Together we have 6 :))
Well, in a symmetric 3x3 matrix there are 6 elements you can choose independently. You can think of it as choosing the three on the diagonal and the three in the strict upper triangle (the upper triangle without the diagonal), because the 3 in the lower triangle then have to match the ones in the upper triangle.
There are many sets of 6 matrices that form a basis for this space, but the simplest set would be:
[100;000;000], [000;010;000], [000;000;001] (the ones you mentioned, defining the diagonal)
[010;100;000], [001;000;100], [000;001;010] (filling the remaining upper triangle and the lower triangle in a symmetric manner)
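Flattening those 6 matrices into vectors in R^9 and checking the rank confirms they are independent; a short NumPy sketch of my own (not from the lecture):

```python
import numpy as np

basis = [
    np.array([[1, 0, 0], [0, 0, 0], [0, 0, 0]]),  # the three diagonal entries
    np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]]),
    np.array([[0, 0, 0], [0, 0, 0], [0, 0, 1]]),
    np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]]),  # mirrored off-diagonal pairs
    np.array([[0, 0, 1], [0, 0, 0], [1, 0, 0]]),
    np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]]),
]

# Every basis matrix is symmetric...
print(all(np.array_equal(B, B.T) for B in basis))   # True

# ...and flattened into R^9 they are linearly independent, so dim = 6.
stacked = np.stack([B.ravel() for B in basis])      # shape (6, 9)
print(np.linalg.matrix_rank(stacked))               # 6
```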
my right ear feels wholly educated
What will be matrix A if I want to express y''+y=0 in the form Ay=0 ?
I'm watching this with airpods and i was wondering why his voice pans in my right ear until i realised his recorder is on his right hip. Still, great lectures!
love the ending...
Monica joke :) , what a funny and smart professor
You think you know linear algebra till you watch these lectures 😉
That joke of Monica-Clinton graph won't age lol
06:58 Does anybody know why the dimensions of the symmetric matrices and the upper triangular matrices are both 6?
Thank you very much!! I'd like to clarify: a 3 by 3 matrix has 9 components, a11, a12, a13, a21, a22, a23, a31, a32, a33. Do you mean that a12, a13, a21, a23, a31, a32 each get a 1?
In a symmetric matrix you have six independent values, 3 on the diagonal: a11, a22, a33, and 3 that have reflections: a12=a21, a13=a31, a23=a32. In upper triangular it is six because there are 6 values that can vary and the lower 3 are fixed to zero.
Does the rule for the dimensions of the symmetric and upper triangular spaces apply only to 3x3 matrices?
Thank you.......you've shared the worth lesson...
Even a high school student can understand his lectures.
Is there some mistake in 3+9=6? I think dimensions can't be added directly.
It's 6+6 = 3+9
13:40
Though I think it makes more sense as 6+6-3 = 9
the sum of dimensions of the subspaces minus the dimension of their intersection is the dimension of their sum
dim(s1) + dim(s2) - dim(s1∩s2) = dim(s1+s2)
my intuition:
the added dimensions of the subspaces minus the part you counted double, because it's in both subspaces, is the dimension of the sum of the subspaces.
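The counting in the reply above (6 + 6 - 3 = 9) can be verified by flattening basis matrices into R^9 and computing ranks; a NumPy sketch of my own, not from the lecture:

```python
import numpy as np

def dim_span(mats):
    """Dimension of the span of a list of 3x3 matrices, via rank in R^9."""
    return np.linalg.matrix_rank(np.stack([m.ravel() for m in mats]))

def E(i, j):
    """Unit matrix E_ij: a 1 in entry (i, j), zeros elsewhere."""
    return np.eye(3)[:, [i]] @ np.eye(3)[[j], :]

# Basis for symmetric S: 3 diagonal + 3 mirrored off-diagonal pairs.
sym = [E(i, i) for i in range(3)] + \
      [E(i, j) + E(j, i) for i in range(3) for j in range(i + 1, 3)]
# Basis for upper triangular U: the 6 entries on or above the diagonal.
upper = [E(i, j) for i in range(3) for j in range(i, 3)]

dS, dU = dim_span(sym), dim_span(upper)
d_sum = dim_span(sym + upper)      # span of all of them together is S + U
print(dS, dU, d_sum)               # 6 6 9
print(dS + dU - d_sum)             # 3 = dim(S ∩ U), the diagonal matrices
```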
Now all the learners of this course know what happened with Monica, LOL. I think by the end of the 2010s many university students didn't know these 1990s things.