Thank you professor! I am amazed by the fact that professors from top institutes like MIT explain the bare basics without any expectation that we already know those topics. On the other hand, our university professors just avoid the whole thing by saying "it isn't part of the syllabus, you are expected to know this already". A huge salute and thanks to Professor Strang and the MIT team for publishing these videos free of cost.
Yes, that one line says it all: "you are supposed to know this, learn it on your own."
And here this great professor gives knowledge from the basic level to a very advanced level.
this is what makes good institutes and good professors
true
That smile at the end, he knew he'd done a good job
Along with that slick chalk trick
awesome smile 😊
He's happy :)
I don't know how, but whenever I need it, he repeats it. Thanks, Mr. Strang.
Hey, that's true, mate. So did you watch the whole series?
You probably had terrible professors too, my comrade in fate.
@@9888565407 lol let's hope he has the same TH-cam account he had 8 years ago
25:56 "I'm a happier person now." I love his interludes. Thank you, professor, a lot.
Gilbert Strang, you are truly an outstanding teacher! I am currently doing my Master's thesis in Finite Element Analysis and started watching these video lectures just for fun, since I already had some Linear Algebra back in my bachelor's. Your little side note at the end of a lecture about multiplying a system by A-transpose actually helped me crack a problem I'm dealing with right now. My finite element system had more equations than unknowns (because I'm fixing some internal degrees of freedom, not the nodes themselves) and I just couldn't figure out how to solve such a system. I completely forgot about this trick of multiplying by a transpose!! THANK YOU SO MUCH!! My final system now has "good" dimensions and the stiffness matrix has full rank!!!
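For anyone curious what that trick looks like in practice, here is a minimal NumPy sketch (the matrix and right-hand side below are made up for illustration, not taken from the thesis): multiplying an overdetermined system Ax = b on the left by A^T gives the square normal equations A^T A x = A^T b.

```python
import numpy as np

# Made-up overdetermined system: 4 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Multiply both sides by A^T: the normal equations A^T A x = A^T b.
AtA = A.T @ A                  # 2 x 2, square, invertible here
Atb = A.T @ b
x_hat = np.linalg.solve(AtA, Atb)

# Sanity check against NumPy's built-in least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_ref)
print(x_hat)
```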
And his rigorous mathematical proof, I believe from 1971, that completeness is a necessary condition for FEM convergence is actually something I'm using right now! This guy played such a great role in FEM.
This lecture is a tour de force: every sentence he says, including the ancillary comments, is so well crafted that everything clicks with ease. Least squares opens the gates to the twin fields of Optimization and Inverse Theory, so every bit of insight he shares has deep implications in those fields (and many others). It's no exaggeration to say that the whole lecture is an aha! moment. Very illuminating, thank you Professor Strang.
To find this course on the web is tantamount to finding a massive gold treasure.
I really encourage you to buy Introduction to Linear Algebra, which Prof. Strang wrote. If I say these videos are rank r, then I can definitely say the book is the orthogonal complement of these videos, which together make up the full dimension of Linear Algebra.
Dude I read this comment and literally TODAY I recommended this book to a guy in a Facebook group and he already ordered it. 👌
Mr Strang makes me feel, for the first time in my life, that linear algebra is interesting!
Hi
00:00:00 to 00:02:50 : Introduction
00:02:51 to 00:13:45 : What is Orthogonality?
00:13:50 to 00:20:49 : What is Orthogonality for Subspaces?
00:20:50 to 00:26:00 : Why RS(A) ⊥ NS(A)?
00:26:01 to 00:34:00 : What is the Orthogonal Complement?
00:39:45 to End : Properties of A^T.A
In many linear algebra courses that I have seen, the student is simply told about the various relationships between the fundamental subspaces. But in this course these ideas are convincingly yet accessibly presented. This is very important because it allows students to really understand such key ideas of linear algebra to the point where they become intuitive, instead of simply memorizing properties and formulas. Another great lecture by professor Strang!
This series is phenomenal. Every lecture a gem. Thank you Mr Strang!
The dot product of orthogonal vectors equals zero. All of a sudden it clicked when I remembered my conclusion as to what a dot product actually was, that is, "what amount of one vector goes in the direction of another." Basically, if vectors are orthogonal, then no amount of one will go in the direction of the other. Like how a tree casts no shadow at noon.
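That "shadow" intuition is easy to check numerically. A tiny NumPy sketch (the vectors are chosen arbitrarily): the dot product of orthogonal vectors is zero, so the projected component, the shadow of one vector on the other, is zero too.

```python
import numpy as np

x = np.array([1.0, 0.0])   # the "tree"
y = np.array([0.0, 3.0])   # an orthogonal direction: the noon sun overhead

print(np.dot(x, y))        # 0.0 -> no amount of x goes in the direction of y

# The component ("shadow") of x along y is dot(x, y) / ||y||,
# so orthogonal vectors cast no shadow on each other.
shadow = np.dot(x, y) / np.linalg.norm(y)
print(shadow)              # 0.0
```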
Thank you for this comment! That's a great conclusion.
it was very enlightening
See th-cam.com/video/LyGKycYT2v0/w-d-xo.html to get the concept of the dot product.
This helped, a lot
Oh my this is enlightening, I've never thought it that way
"the origin of the world is right here"
okkkkk, cameras are rolling, this is lecture 14. What an intro line!
so much fun, so much love. Thank you Professor Strang and MIT for inspiring more people around the world. I truly enjoy learning linear algebra with Professor Strang :) we know he's done it!
Dr. Strang, thank you for another classic lecture on orthogonal vectors and subspaces. Professor Strang, you are the grand POOBAH of linear algebra.
I remember falling asleep in all my linear algebra classes @ UWaterloo. It's not until now that I'm starting to like linear algebra!
"like linear algebra" Good one!
Lisa Leung
Is that in Ontario, Canada
same here for me, I study at EPFL, but Mr. Strang seems to have a natural gift for the subject
I have never felt the platonic injunction to "to carve nature at its joints" more strongly than after watching this lecture.
The greatest lecturer I have met in my life.
Really a great lecture. He explains things so simply that they seem obvious. I never learned it as clearly as this in my college.
I am learning so much more from these lectures than from any teacher I have ever had
Linear algebra, it's been almost 3 years, but I think I've finally got you. *sob* Wish I could go back in time.
Man, I think a part of me died in the math class I took at the start of university. I feel like I'm resurrecting a part of my soul.
It is a pure joy watching these lectures. Many thanks to Prof. Gilbert Strang and MIT OCW.
Really enjoying this series! Thank you professor Strang and MIT. This is absolute service to humanity!
I love his explanations. My Linear Algebra prof will just give us definitions, state theorems, and prove them, and if we're lucky we'll get an example, but never a solid explanation.
Great summary at the end: A^T A is invertible if and only if A has full column rank! Just loved the lecture...
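That summary is easy to verify numerically. A small NumPy sketch (both matrices are made up for illustration): with independent columns, A^T A has a nonzero determinant; make the columns dependent and it goes singular.

```python
import numpy as np

# Independent columns: A has full column rank, so A^T A is invertible.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(np.linalg.matrix_rank(A))    # 2 -> full column rank
print(np.linalg.det(A.T @ A))      # nonzero -> A^T A is invertible

# Dependent columns (second = 2 * first): rank drops, A^T A goes singular.
B = np.array([[1.0,  2.0],
              [3.0,  6.0],
              [5.0, 10.0]])
print(np.linalg.matrix_rank(B))    # 1
print(np.linalg.det(B.T @ B))      # ~0 -> B^T B is singular
```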
Omg, the orthogonality between the nullspace and the row space adds up so well with the v = [1, 2, 3] example the prof gave in the previous lecture. I've seen much less entertaining TV series than 18.06; this course should be on Netflix lol.
I swear, these lectures with the Schaum's Outline of Linear Algebra can really help anyone learn the subject.
Thank you very much, sir. I am watching lectures and enjoying them. I have benefited from you because we do not have a teacher in Yemen because of the war situation, so you became my teacher
"Let me cook up a vector that's orthogonal to it" - the goat professor strang 8:25
This man cooked up some vectors AND insulted MIT's floor integrity. Legend
It's an extraordinary and amazing one.. No other lecturer is as good as Gilbert...
Thank you sir........
You are the king Mr Strang! Thanks
Dr. Strang*
excusez-moi! :)
At 36:22, it is fascinating how he got a bit into orbital mechanics, saying there are 6 unknowns (rightly known as the state vector, a 6 by 1 vector with position (x, y, z) and velocities (xdot, ydot, zdot)).
"blackboard extends to infinity..." yeah, MIT does have infinitely long blackboard...
* slides out the 45th layer of blackboard *
Second best part about watching these lectures is the comment section
The man puts his Soul into his lectures 🤔🙏🏼😀👍
This is a 90-degree chapter, Strang meant business from the word go!
The lessons are great!!!! I'm enjoying every class. Thank you Gilbert Strang.
You're the best teacher in the world.
That's a VERY VERY ESSENTIAL lecture for machine learning.
I used to do the transpose trick but didn't know where it came from; now I may die in peace.
Thanks Prof Strang, MIT!
East or West, Prof Strang is the best!
The cameraman must have become a pro in Linear Algebra by absorbing such a high level of teaching.
38:30 - Mr. Strang gives a hint about the Maximum Likelihood Estimate (MLE).
Recorded in 1999, still relevant in 2021. "Comes back 40 years later" - Yep still relevant
At 19:50 he said "When is a line through the origin orthogonal to whole plane? Never" but I think if we take any line through origin and a plane whose normal vector is parallel to that line then they both will be orthogonal. For example x-axis and y-z plane. Help me out please.
By that he means any line passing through the origin that is in the plane (i.e. a subspace of the plane) cannot be orthogonal to the whole plane. Of course, if this line is parallel to the normal of the plane as you stated, then yes, it will be orthogonal to every vector in that plane.
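A quick numerical check of both halves of that answer (vectors chosen arbitrarily): the x-axis is orthogonal to every vector in the yz-plane, but a line lying inside a plane can never be orthogonal to the whole plane, since it would have to be orthogonal to itself.

```python
import numpy as np

x_axis = np.array([1.0, 0.0, 0.0])

# The x-axis IS orthogonal to every vector in the yz-plane:
for v in [np.array([0.0, 1.0, 0.0]),
          np.array([0.0, 0.0, 1.0]),
          np.array([0.0, 2.0, -5.0])]:
    print(np.dot(x_axis, v))       # 0.0 each time

# But a line lying IN the plane can't be orthogonal to the whole plane:
# it would have to be orthogonal to itself, and it isn't.
in_plane = np.array([0.0, 1.0, 1.0])
print(np.dot(in_plane, in_plane))  # 2.0, not 0
```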
He's talking about a line through the origin (a subspace) that is also in the plane.
Thanks for your question
Thanks for the question and answer... it really helps!
"The one thing about Math is you're supposed to follow the rules."
Gil uses his 'god' voice at ~8:00
He's absolutely amazing!!!
I am delighted too. His lectures are full of enlightening moments, at least for us laypeople.
Can't wait for the next one!
“Let me add the great name, ‘Pythegorious’!” I love it 😂😂😊
"I shouldn't do this, but I will"
Amazing, thk u MIT and Prof. Gilbert Strang.
cuteness level off the charts @49:35
I can't stop watching the spinning pens 15:05
The best linear algebra lecture.
Beautiful lecture and amazing lecturer !!!
Great lecture, Thanks prof. Strang.
8:18 let him cook!!!!
Here I was, thinking I was gonna breeze through this lecture when BAM! I got hit with subtle logic 👨🏽🏫
He is such a great teacher!! Thank you Professor Strang!!
Brilliant lecture
25:00 Why was the transpose of the rows taken to form a combination of the row space? Are we even able to multiply the transpose of row 1 by x?
I lost it there too, man. Idk if you've managed to figure out why.
@@rjaph842 Wasn't the point proving the case that the left nullspace and the column space are orthogonal too?
I think it's a little mistake that Prof. Strang didn't notice, probably because he had just taught the property that two orthogonal vectors have (i.e. x^T y = 0).
But that requires x and y to be column vectors. Here a row vector is not a column vector, so there is no need to transpose it in order to multiply by another column vector; simply row vector * x (where x is a column vector) = 0 is fine. Nevertheless, I really like Prof. Strang's style. Thank you Prof. Strang.
Thank you, sir. You are a great teacher.
This man is my hero 🙌🏽✨
thank you g. strang.
I really like your video on Orthogonal Vectors and Subspaces.
25:39 Isn't there an error with the symbol T (for transpose)? Why transpose the rows? Please explain!
Here we treat the rows as vectors, and it is just a convention to write a vector vertically. So you have to transpose it if you mean to write it down horizontally.
RetroAdvance Thanks for this confirmation - I suspected this reason was much more likely than Prof. Strang making a mistake here (as I have seen this convention in other textbooks). However, it's still confusing, as he sometimes draws an array of rows [row_1 row_2 ... row_m] vertically, implying that those are horizontal rows. Does this convention of treating all variables as vectors typically apply only to variables written in text?
Ken Feng Yes, he often writes things down without strict mathematical rigor for didactic reasons. [row1 row2 row3] is probably just Strang's intuitive formulation to get the point across, so don't take it too seriously. As long as it is clear what he means by that, it is fine.
But a vector is different from its transposed version in terms of its matrix representation:
v (an element of R^n) is an n x 1 matrix;
v transposed is a 1 x n matrix.
Beto ba Yona It is just a small mistake.
@@RetroAdvance no, it is a mistake
41:32 recap point for A^T A x = A^T b
At 25:00 Mr. Strang wrote (row 1) transpose x equals 0, but I don't really understand.
I was thinking about removing the "transpose" thing, and I was so confused.
Me too. Did u ever understand it?
@@minagobran4165
All row vectors are written with the transpose symbol to indicate they are row vectors and not column vectors.
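If it helps, here is a tiny NumPy sketch of the convention in action (the matrix is made up): each row of A, read as a column vector row_i, satisfies row_i^T x = 0 for any x in the nullspace, which is exactly what the board notation (row 1)^T x = 0 means.

```python
import numpy as np

# Made-up rank-1 matrix whose row space is spanned by (1, 2, 5).
A = np.array([[1.0, 2.0,  5.0],
              [2.0, 4.0, 10.0]])

# x = (2, -1, 0) is in the nullspace: A x = 0.
x = np.array([2.0, -1.0, 0.0])
print(A @ x)                 # [0. 0.]

# Each row, read as a column vector row_i, satisfies row_i^T x = 0.
for row in A:                # NumPy yields each row as a plain 1-D vector
    print(np.dot(row, x))    # 0.0 for every row
```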
God bless you professor!
Thank you prof, I'm writing an exam tomorrow morning.
“So ninety degrees, this is a ninety-degree chapter.”😘
Loved this lecture
HES THE GOAT. THE GOAAAAAAT
I love this guy!!!
This shit is so good
Really good videos!! This series is helping me so much!! I'm from Brazil and I'm loving these videos!!
thanks a lot
Thank you, thank you.
19:52 A mistake here. A line through the origin can be orthogonal to a whole plane.
He was talking about 2D, so the line has to lie in the plane
@@ledkicker2392 You're right, thank you a lot.
Was it a burp at 48:48?
now we're asking the real questions
Sorry had an indigestion right before class
He's absolutely amazing! I do wonder, however, whether a non-MIT-like class of students would be able to appreciate his teaching style.
Teresa Longobardi You're right, those from Princeton stand no chance.
3:13 "going way back to greeks..." :) well, sir, i think greeks are still in the world.
Thank you.
41:32 my determination level
@j4ckjs Thanks for the reply. It's just a confusing definition and if you google a bit on orthogonality, you find many other definitions contradicting this one. I think he should have pointed it out more clearly, but at least now I will be cautious when it comes to this topic. I guess that's good enough...
At 25:17 he says (row1)^T * x = 0. This is wrong. Row1 is 1xn and x is nx1, so row1 * x = 0. row1^T is nx1, and you can't multiply an nx1 vector by another nx1 vector.
Row vectors are written as v^T. It's just a convention to distinguish them from column vectors.
Looking at the comment section, I feel pretty dumb for barely feeling any intuition in whatever Prof. Strang said.
thanks professor
Damn, I wish I had Gilbert Strang as a professor... oh wait... I wasn't smart enough to get into MIT's engineering program....lol
At 25:17, is it necessary to transpose the rows of A before multiplying by x (since the dimensions already match)?
He didn't transpose the rows of A but the vectors named 'row-sub-i', which, like any vectors, are always written in column form.
In other words, it is a convention that, if we want to write a vector that corresponds to a row of a matrix A (the rows are not vectors by themselves), we write it as the proper vector, which is the corresponding column of A transpose.
This makes our notation consistent. Anytime we write a vector name (e.g. 'a', 'row', 'q', 'spectrum', 'x', 'v'...), we can always replace it with some matrix column. So, if we want to multiply another vector or matrix with it from the left, we must first transpose it.
And it is not a mere convention! It is an essential property of matrices: the columns are vectors, not the rows. If we could, at our own leisure, claim whenever we want that rows are also vectors, then the whole concept of a transposed matrix would be corrupted, even the concept of a matrix itself.
@@nenadilic9486 I’m not sure that’s completely correct. I’ve seen him show rows as vectors at times to compare what a row vector looks like compared to the column vectors.
Now I know why the figure was this way in the 10th lecture.
Where are the next lectures for A^TA ???
But is anyone on Earth better off because of this? No.
44:20 Why can't we solve it? We couldn't if there were more unknowns than equations.
Ax = b, then A^T A x = A^T b. Awesome!
Too much suspense! I'm starting the next video now