I have ADHD, but you managed to captivate me for so long, holy shit. Goated video.
I'm in first year rn and I'm tryna learn Linear Algebra.
The hardest thing to do in life is to learn something from a textbook without even knowing HOW you're going to use it.
You don't know what information is important, you don't know why something is the way it is, and you basically end up stuck.
This really helped teach me linear algebra imo.
I find it impossible to learn stuff without first knowing the motivation and application of it lol.
Who is this guy? This is the best Linear Algebra for ML I could find! Better than all my professors.
Great course. It never ceases to amaze me how many so-called machine learning videos never tell you how much math you need to actually build neural networks, genetic algorithms, etc.
You never fail to impress me as an educator. This is such a good refresher. Kudos!
Best introductory video about what Linear Algebra is that I ever found!
Wow, I'm 11 minutes in and this is the best explanation of linear algebra I've ever seen
00:07 Math for machine learning is essential due to its role in optimization.
02:25 Matrix multiplication is the core operation in linear algebra for manipulating arrays.
06:10 Linear algebra is like regular algebra with addition and multiplication but involves linear transformations of spaces.
08:02 Linear algebra is like programming, with matrices as functions and shapes as data types.
11:49 Matrix multiplication as function composition
13:46 Linear algebra is crucial for machine learning optimization
17:34 Linear algebra is crucial for machine learning due to its ability to make fast calculations and its compatibility with weighted sums.
19:13 Linear functions make reasoning easy with weighted sums
22:28 Linear algebra viewed as programming
24:08 Low rank approximation and refactoring in linear algebra
27:28 Canonical decomposition process for mapping functions and matrices
29:13 Matrix operations and mapping inputs to outputs
32:27 Breaking down matrices into three key pieces
34:14 Singular Value Decomposition in low rank approximations
37:26 JPEG uses low rank approximation for compression and can be useful for foreground-background separation.
39:06 Explore additional resources based on your background and goals.
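The 08:02 point (matrices as functions, shapes as data types) can be sketched in a few lines of numpy. This is my own illustrative example, not code from the video: a matrix of shape (m, n) acts like a function from length-n vectors to length-m vectors, and a shape mismatch is the linear-algebra analogue of a type error.

```python
import numpy as np

# A 2x3 matrix acts like a function from R^3 to R^2:
# its shape (2, 3) is its "type signature".
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])

x = np.array([1.0, 2.0, 3.0])  # a valid "input type": length-3 vector
y = A @ x                      # length-2 output, like f: R^3 -> R^2
print(y.shape)                 # (2,)

# Passing a wrong-shaped input is the linear-algebra "type error":
try:
    A @ np.array([1.0, 2.0])   # length-2 vector doesn't match n=3
except ValueError as err:
    print("shape mismatch:", err)
```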
Amazing video. I don’t have a strong background in maths, so putting things in terms of programming really helped me start to comprehend this subject. Thank you so much.
Solid! I can see and feel your passion through the screen, bro. Excited to go through this playlist. I just got hired as a junior data scientist but struggle with the math portion of machine learning, especially linear algebra and calculus.
This was great, can't wait for more. I love your explanatory style, for me it threads the ideal boundary between too detailed and not detailed enough. Thank you!
Thanks Sergey! That's exactly the boundary I try to walk, so it's really gratifying to hear that I did it right.
Awesome! That's the first time I've actually gotten the logic of using matrices in ML. Keep up the good work!
Great video on highlighting the importance of LA in the field of ML 👏
always good to refresh my linear algebra!!
I think you have such a new way of presenting these ideas and concepts. This is insight that some people acquire through ages of learning and experience. But I still feel that these ideas need to be expanded upon, and fleshed out more for the average or advanced student. Please consider providing a further in depth series, going into each of LA, calculus, and prob/stats portions of the MATH4ML series.
Charles is impatient to let you know: you can get this too. Pure magic.
Hmm, lots of assumptions about prior knowledge. It would be good to spell out the prerequisite knowledge necessary to understand this. Thanks, good video.
At 21:36 you say that elements outside the kernel remain outside under linear combination. That is not necessarily true; that is why we work with linear independence.
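To illustrate this correction, here is a minimal numpy counterexample of my own: two vectors that are each outside a matrix's kernel can have a sum that lands inside it.

```python
import numpy as np

# A has kernel spanned by (1, 1): A @ v == 0 exactly when v is a multiple of (1, 1).
A = np.array([[1.0, -1.0]])

v1 = np.array([1.0, 0.0])  # outside the kernel: A @ v1 = [1.]
v2 = np.array([0.0, 1.0])  # outside the kernel: A @ v2 = [-1.]

print(A @ v1, A @ v2)      # both nonzero
print(A @ (v1 + v2))       # [0.] -- the sum lands IN the kernel
```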
Thank you a lot for this math playlist
Perfect lecture, comprehensive explanation... I fell in love with W&B 💖💖
At 2:00 you say that many types of data can be represented as arrays. Could you elaborate on which types of data can and cannot be represented as arrays?
On Slide 65, you show 4 resources for Linear Algebra...I started with the Essence of Linear Algebra by 3Blue1Brown (really good)...is that good enough, or should I be looking at all four of these or just two of these?
Brilliant explanation, very nice interpretation of matrix multiplication as a form of function composition.
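The composition idea mentioned here can be checked numerically in a few lines of numpy (my own sketch, not code from the video): applying B and then A to a vector gives the same result as applying the single composed matrix A @ B.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))  # acts like a function g: R^3 -> R^2
B = rng.standard_normal((3, 4))  # acts like a function f: R^4 -> R^3
x = rng.standard_normal(4)

# Composition: g(f(x)) equals the single matrix (A @ B) applied to x.
step_by_step = A @ (B @ x)
composed = (A @ B) @ x
print(np.allclose(step_by_step, composed))  # True
```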
I'm just starting this course. To anyone who has completed it: is it enough for me to get started with the actual machine learning content, or will I need more math after this course?
This is great overall, especially from a computer science point of view, but there's no space between thoughts, just cut after cut, from one concept to the next. I have to keep rewinding.
The descriptions of the A and C matrices are very unclear. I hope you can add some examples.
this channel is epic
Thank you, this was very well explained!
12:09 Could the matrix X be named transformation_matrix?
5:30 ur right and i love it
Instant Subscribe.
Is it "optimisation by programming" or "programming by optimisation"?
Very helpful insight. Thanks 👍
Frye, can you please state the prerequisites for this series? I am starting my journey in machine learning.
Hello! I think basic knowledge of math and Python should be enough.
@@WeightsBiases ok, thanks
so excited!
Good video! Really insightful
Loved this
Fascinating. Happy to subscribe
Thanks for watching!
This is so coooool!!💪👍💪👍💪👍
GOOD ONE
Thanks for this
Certainly not for beginners. Still good though
I understand now
15:39
8:09
Horrible explanation of the SVD, not gonna lie. It's so convoluted; what you say makes a complex problem even more complex, when a convoluted concept really doesn't have an easy answer. I can see why you draw the similarities to code refactoring, but again, the idea is not as nuanced as that.
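For anyone who found the SVD segment hard to follow, here is a minimal numpy sketch of the core idea from the low-rank approximation portion (around 34:14): factor a matrix into U, s, Vt, then keep only the top-k singular values to get a rank-k approximation. This is my own illustration, not the video's code.

```python
import numpy as np

rng = np.random.default_rng(42)
M = rng.standard_normal((6, 5))

# Thin SVD: M == U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Truncate to rank k: keep only the k largest singular values.
k = 2
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(np.linalg.matrix_rank(M_k))  # 2: the approximation has rank k
print(np.linalg.norm(M - M_k))     # approximation error (small if s[k:] are small)
```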