------------ TIME STAMP -------------
In the first course on Linear Algebra we look at what linear algebra is and how it relates to data.
Then we look at what vectors and matrices are and how to work with them.
COURSE 1
MATHEMATICS FOR MACHINE LEARNING: LINEAR ALGEBRA
INTRODUCTION TO LINEAR ALGEBRA AND TO MATHEMATICS FOR MACHINE LEARNING
0:00:00 Introduction: Solving data science challenges with mathematics
0:02:27 Motivations for linear algebra
0:05:57 Getting a handle on vectors
0:15:03 Operations with vectors
0:26:32 Summary
VECTORS ARE OBJECTS THAT MOVE AROUND SPACE
0:27:37 Introduction to module 2 - Vectors
0:28:27 Modulus & inner product
0:38:28 Cosine & dot product
0:44:21 Projection
0:51:09 Changing basis
1:02:34 Basis, Vector space, and linear independence
1:06:47 Application of changing basis
1:10:16 Summary
MATRICES IN LINEAR ALGEBRA: OBJECTS THAT OPERATE ON VECTORS
1:11:36 Matrices, Vectors, and solving simultaneous equation problems
1:17:08 How matrices transform space
1:22:49 Types of matrix transformation
1:31:28 Composition or combination of matrix transformations
1:40:28 Solving the apples and bananas problem: Gaussian elimination
1:48:29 Going from Gaussian elimination to finding the inverse matrix
1:57:07 Determinants and inverses
2:07:44 Summary
MATRICES MAKE LINEAR MAPPINGS
2:08:43 Introduction: Einstein summation convention and the symmetry of the dot product
2:18:37 Matrices changing basis
2:29:52 Doing a transformation in a changed basis
2:34:30 Orthogonal matrices
2:41:10 The Gram-Schmidt process
2:47:18 Example: Reflecting in a plane
EIGENVALUES AND EIGENVECTORS: APPLICATION TO DATA PROBLEMS
3:01:28 Welcome to Module 5
3:02:20 What are eigenvalues and eigenvectors
3:06:45 Special eigen-cases
3:10:17 Calculating eigenvectors
3:20:25 Changing to the eigenbasis
3:26:17 Eigenbasis example
3:33:43 Introduction to PageRank
3:42:27 Summary
3:43:42 Wrap up of this linear algebra course
----------------------------------------------
The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data.
It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.
Course 2
MULTIVARIATE CALCULUS
WHAT IS CALCULUS?
3:45:39 Welcome to Multivariate Calculus
3:47:29 Welcome to Module 1
3:48:33 Functions
3:52:51 Rise Over Run
3:57:48 Definition of a derivative
4:08:30 Differentiation example & special cases
4:16:19 Product rule
4:20:27 Chain rule
4:25:50 Taming a beast
4:31:29 See you next module!
MULTIVARIATE CALCULUS
4:32:09 Welcome to Module 2!
4:33:13 Variables, constants & context
4:41:09 Differentiate with respect to anything
4:45:53 The Jacobian
4:51:42 Jacobian applied
4:58:05 The Sandpit
5:02:48 The Hessian
5:08:27 Reality is hard
5:13:04 See you next module!
MULTIVARIATE CHAIN RULE AND ITS APPLICATIONS
5:13:28 Welcome to Module 3!
5:14:04 Multivariate chain rule
5:16:43 More multivariate chain rule
5:22:21 Simple neural networks
5:28:13 More simple neural networks
5:32:25 See you next module!
TAYLOR SERIES AND LINEARISATION
5:32:59 Welcome to Module 4!
5:33:35 Building approximate functions
5:37:03 Power Series
5:40:41 Power series derivation
5:49:50 Power series details
5:56:04 Examples
6:01:24 Linearisation
6:06:41 Multivariate Taylor
6:13:08 See you next module!
INTRO TO OPTIMISATION
6:13:36 Welcome to Module 5!
6:21:51 Gradient Descent
6:30:58 Constrained optimisation
6:39:32 See you next module!
REGRESSION
6:41:40 Simple linear regression
6:51:52 General non-linear least squares
6:59:05 Doing least squares regression analysis in practice
7:05:24 Wrap up of this course
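The optimisation and regression sections above boil down to one recipe: differentiate a squared-error cost and step downhill. A minimal numpy sketch of that idea (my own illustration, not code from the course; the toy data and learning rate are made up):

```python
import numpy as np

# Fit y = a*x + b by gradient descent on the mean squared error.
# Toy data and learning rate are made up for illustration.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.05, size=x.size)

a, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    err = (a * x + b) - y             # residuals
    a -= lr * 2.0 * np.mean(err * x)  # d(MSE)/da
    b -= lr * 2.0 * np.mean(err)      # d(MSE)/db

print(a, b)  # should land close to the true slope 2.0 and intercept 1.0
```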
-------------------------------------------------------
The third course, Dimensionality Reduction with Principal Component Analysis,
uses the mathematics from the first two courses to compress high-dimensional data.
This course is of intermediate difficulty and will require Python and numpy knowledge.
COURSE 3
Mathematics for Machine Learning: PCA
STATISTICS OF DATASETS
7:06:12 Introduction to the Course
7:09:59 Welcome to module 1
7:10:41 Mean of a dataset
7:14:41 Variance of one-dimensional datasets
7:19:36 Variance of higher-dimensional datasets
7:24:52 Effect on the mean
7:29:38 Effect on the (co)variance
7:33:08 See you next module!
INNER PRODUCTS
7:33:35 Welcome to module 2
7:35:24 Dot product
7:40:07 Inner product definition
7:45:09 Inner product: length of vectors
7:52:17 Inner product: distance between vectors
7:55:59 Inner product: angles and orthogonality
8:01:41 Inner product of functions and random variables (optional)
8:09:03 Heading for the next module!
ORTHOGONAL PROJECTIONS
8:09:38 Welcome to module 3
8:10:19 Projection onto 1D subspaces
8:18:02 Example: projection onto 1D subspaces
8:21:28 Projections onto higher-dimensional subspaces
8:30:01 Example: projection onto a 2D subspace
8:33:53 This was module 3!
PRINCIPAL COMPONENT ANALYSIS
8:34:26 Welcome to module 4
8:35:35 Problem setting and PCA objective
8:43:20 Finding the coordinates of the projected data
8:48:49 Reformulation of the objective
8:59:15 Finding the basis vectors that span the principal subspace
9:06:55 Steps of PCA
9:11:02 PCA in high dimensions
9:16:51 Other interpretations of PCA (optional)
9:24:33 Summary of this module
9:25:16 This was the course on PCA
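The "Steps of PCA" listed above (centre the data, take the covariance eigenvectors, project) can be sketched in a few lines of numpy - a minimal illustration of the standard algorithm, not the course's own code:

```python
import numpy as np

def pca(X, k):
    """Project an N x D data matrix onto its top-k principal components.

    Minimal sketch: centre, eigendecompose the covariance, project.
    """
    Xc = X - X.mean(axis=0)                 # centre the data
    cov = (Xc.T @ Xc) / len(X)              # D x D covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :k]             # top-k eigenvectors as columns
    return Xc @ W                           # N x k compressed representation

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```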
Can you add it to the description so it's more accessible?
No differential equations? I thought machine learning requires differential equations.
I really wish they'd add this to the description box.
At least pin this comment.
A small correction - 8:21:28 Projections onto higher-dimensional subspaces
Thank you so much for the video! For anyone wondering, the image is mirrored: check out 1:25:40, his (actual) right hand is the left hand in the video (on the side of the wrist watch)
Oh and it's also explained explicitly at 2:47:35 :)
Really appreciate this as a refresher that was needed after 40 odd years. Thanks heaps for the straightforward approach and the clarity.
Thank you! :) I clicked solely because of the very honest title of this video. MATHEMATICS for Machine Learning. That's how you generate the right "expectation" of students for a certain lecture/topic.
Yup, the keyword "machine learning" makes me motivated. This video is legendary anyway.
Is this worthwhile for learning machine learning?
Deep Learning Specialization
th-cam.com/play/PLtS8Ubq2bIlUOQoopGBa_F2mQvdk6QeBw.html
Kindly also share machine learning course...
This is extremely useful! Thanks a lot, it helps me a lot with my MSc.
this video is incredible. thank you all very much.
Awesome lecture, great refresher for long forgotten theory
First lesson - 1:02:35
Second - 1:44:50
Awesome, brilliant! It makes things so comprehensible and so playful.
I learned a lot from this, thank you so much!
what lesson do you take before this, I find this hard to understand.
im in 10th grade, thank you so much for making this, i took notes for the entire 9 hour lecture and finished today.
Can you share the notes ?
yoooo shut upp, dude, ur in 10thh, dayum bruv, appreciate that!
😭😭😭 Thank a lot guys. GOD BLESS YOU.
awesome professor thanks a lot
Thank you so much, this is really helpful.
Do you have any suggestions to help me understand this math? I find it hard and confusing from the first 30 minutes.
Wow, I'm impressed by the way you explain things. I have learned a lot about the complex parts. Please put up more videos on AI.
Thank you sir. 😊 (Your videos on TH-cam and classes are great!)
Glad you like them!
these are NOT his videos -- he stole them from coursera
Thank you for this course, I find it very interesting.
These are the best 9 hours you'll ever spend on linear algebra.
Good job. Thank you so much
How'd you understand the lesson? I find these lessons hard to understand, do you have any suggestions?
Thank you very much.
Thanks for the great explanation. It helped me a lot.
Thanks for the lecture 😊
This channel is really a savior
1:30:12 Why do some sources say the opposite? Even ChatGPT.
Counter-clockwise should be positive, not negative, but there you said the positive one is the clockwise movement.
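For anyone else stuck on this: the standard maths convention is that a positive angle rotates counter-clockwise; the confusion usually comes from rotating the axes instead of the vector (which looks like the opposite rotation), or from screen coordinates where the y-axis points down. A quick numpy check of the standard convention (my own sketch, not from the video):

```python
import numpy as np

def rotation(theta):
    """Standard 2D rotation matrix: positive theta is counter-clockwise."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating the x-axis unit vector by +90 degrees lands on the +y axis (CCW).
v = rotation(np.pi / 2) @ np.array([1.0, 0.0])
print(np.round(v, 6))  # [0. 1.]
```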
great lecture. thanks a lot.
Very good. Congratulations 🎈
@7:24:47
In the bottom right of the screen var(D) is calculated. If I hear it correctly, the speaker refers to it as the covariance matrix. Is that correct? Without a second index it can only be the variance, as is also written on the screen.
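A quick numpy check of the distinction (my own sketch; the data layout - one data point per column - is an assumption, not necessarily the course's):

```python
import numpy as np

# Toy 2-D dataset: each column is one data point (layout assumed).
D = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 1.0, 4.0, 3.0]])

var_per_dim = D.var(axis=1)  # one variance per dimension, no second index
cov = np.cov(D, bias=True)   # 2 x 2 covariance matrix (divide by N)

# The variances sit on the diagonal of the covariance matrix;
# the off-diagonal entries are the covariances between dimensions.
print(var_per_dim)
print(np.diag(cov))
```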
Great course
7:55:25 it should be the square root of 4 = 2
Please add timestamps.
Underrated comment.
There are timestamps in the comments.
up
Thnx u
R u blind
A little confused by the formula at 4:44:39, can you explain the alternative approach again? Thanks!
excellent presentation, except perhaps the last part.
4:45:53 Jacobian
02:25:00
Amazing
Why is the third course exclusive to this compilation? I don't see it on the Imperial College London channel.
Thanks 🙏
AWESOME PROFESSOR 🎉🎉🎉
Thank you.
This technique of writing backwards behind a glass wall is distracting; it feels so unreal, and the handwriting gets worse and smaller over time as fatigue sets in.
But many people like it. And this course is free 🙂 so no need to complain.
I think they just mirror the video in editing.
Guys... timestamps are essential. Come onnn!
2:42:28 the funniest part, sir seems scared for a moment 🙂
He suddenly changes the vector meaning from a house [20 (cost), 40 (area), 45 (heating)] to spatial r, s, etc. Where are i and j for the house example? So the price or heating has two orthogonal components? He skips over that point cleverly and sticks to the familiar high-school route of spatial vectors where i and j are x and y - obviously. It's not obvious for price and heating, but he avoids that or says "it's the same thing".
Then later r1s1 + r2s2 = r.s - lol, he's mad.
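To be fair to the lecturer, the algebra really is the same whichever quantities the components measure: the basis vectors for the house example are just "one unit of cost", "one unit of area", and so on, and the component-wise dot product formula still holds. A small numpy illustration (the house numbers are the ones quoted in the comment):

```python
import numpy as np

# A "house" vector: components measure cost, area, heating - not positions.
r = np.array([20.0, 40.0, 45.0])

# The implicit basis vector for the first feature: "one unit of cost".
e_cost = np.array([1.0, 0.0, 0.0])

# The dot product is purely component-wise, so projecting onto a basis
# vector just reads off that component, exactly as with spatial i and j.
print(r @ e_cost)  # 20.0, the cost component
print(r @ r)       # 4025.0, the sum of squared components, |r|^2
```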
Sir, please make one like this on statistics, oriented towards data science rather than for statisticians.
Is there any site where we could pay for a certificate to certify the knowledge obtained in this course?
Thank you!
Awesome course, i love it!
Timestamps relieve headaches 😅😅
13:14 Could anyone explain what we are trying to understand here?
I'm confused before even 20 minutes in.
i love the math,
Thanks a LOTTTTTTTTT🙏🙏🙏🙏🙏🙏
A 9+ hour video needs a concept/lesson index in the runtime clock. Put one in, and I might consider watching.
He put timestamps in the comments.
What an honour is your consideration
Please add timelines of topics 😊
44:06... it should be r.s, not s.s
So basically 11th and 12th grade maths is required.
5:15:37
Where are we supposed to find examples to work on?
How did he write backwards through the whole video!
Course materials please, thank you.
Seriously frustrating to see no applications.
Which specialization is being addressed in the video? Is it a specialization on EdX or Coursera?
I don’t know about EdX but I recently completed this specialization (free track) on Coursera. Next up is Machine Learning Specialization. But I’m taking a diversion into some Generative AI courses.
@@kevinmcfarlane2752 do you remember the name of this specialization which you completed on Coursera?
@@kevinmcfarlane2752 Do you have any suggestions for Generative AI course ?
02:34:33
21:44 , that's funny I'm only up 2 Js so far 😂
Is this video enough to learn machine learning?
Anyone please give a reply 😢
Yes
@@Jishanthegodev do you understand the lessons taught in this video? I find them hard to understand. Do you have any recommendations?
@@joelausten First, start with the new video from @freecodecamp, "linear algebra for ml, dl, gen ai", and solve easy-to-medium questions to get a good understanding of the topics and build confidence.
Don't try to grasp all of this in one go; give linear algebra a good amount of time, divide the topics into sections, and complete it in 1-2 months.
Then pick "statistics for ds" from @datatab (on YT - both inferential and descriptive) and understand the topics.
After that pick calculus. For calculus I didn't find any good YT video, so I suggest you learn from books that have a good number of questions.
I'm a second-year college student, can I understand it?
Where can I find the exercises they mention throughout the lectures?
on coursera
@@alokshandilya104 needs payment.
I find this lesson so confusing, do you have any recommendations on what to learn first?
Can anyone tell me what I should learn in maths for coding? I am working on statistics and probability. Do you need to know algebra, trigonometry, etc.? For simple interest and the like we can just use a formula from ChatGPT or Google. I am new to coding, in my 2nd year of a BA. Can someone tell me if I need a BTech or IT background, or whether just skills and certificates matter? I am doing certificate courses in web dev and DSA.
You don't need much math for web dev; high-school-level maths is more than enough.
@@sugarmy9683 My question was about advanced-level coding, like cybersecurity, ML, AI, deep learning, cloud, etc.
@@rohanrana7067 For ML, DL, or data analytics, this video is more than enough for you.
Is he left handed and the image is mirrored? Or is he right handed and able to write on glass in reverse?
The image is mirrored: check out 1:25:40, he's using his right hand, which is the one with the watch
Does anyone know of a course like this for statistics?
th-cam.com/video/rm9SYX4MDu0/w-d-xo.html&ab_channel=CarlosFernandez-Granda
Carlos teaches statistics for Data Science and for me his way of teaching just works.
I've just done one on Coursera. Probability and Statistics. It’s also part 3 of their Math for Machine Learning Specialization.
th-cam.com/video/LJa4_yGOmwo/w-d-xo.htmlsi=TFWaw8gZagTs20OK
Where is the Python code?
Cool
dont tell me he was nancy pi when he was younger
Is this guy left handed or right handed? my take: left
Thanks Coursera, ehmmm I mean... Thanks My Lesson!
Already lost at the start.
Howww, I want to understand, but I can't, even from the beginning.
holy shit what a course
15:27
38:28
51:07
8:09:12
one question though, isn't that a panda?
Lmao he was my lecturer during my time at imperial
the car looks like alien's spaceship lmao
I don't like that you spent so much time selling me on why I should want to learn about vectors. It's almost like I'm supposed to dislike it.
Your 'b' looks like 6
Lol
Up !
you're so cool
Man, this eased my problems.
hall of fame worthy............
Thank you
Thanks🤝