So grateful for the videos that you make. I have burnt a hole in my pocket and spent hours on various courses just for the sake of effective learning, but most of the time I end up coming back to CampusX videos. Thank you so much.
So, did it work out?
Same
Unbelievable... nobody taught me PCA like this. Sir, 5/5 for your teaching 🙏🙏 God bless you ❤️
I'm interested in group study with you; reply to me, bro.
04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
08:30 PCA is a technique that transforms higher-dimensional data into lower-dimensional data while preserving its essence.
12:45 Feature selection involves choosing the most important features for predicting the output.
17:00 Feature selection is based on the spread of data on different axes.
21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
25:30 PCA finds new coordinate axes to maximize variance.
29:45 Variance is a good measure to differentiate the spread between two data sets.
33:54 Variance is important in PCA to maintain the relationship between data points when reducing dimensions.
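(Not from the video, just an illustrative sketch of the summary above: PCA reducing a dataset's dimensionality while keeping most of its variance, using scikit-learn. The toy dataset and component count are my own choices.)

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy 3-feature dataset that really lives in 2 dimensions, plus a little noise
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 3)) + 0.01 * rng.normal(size=(100, 3))

# Project onto the 2 new coordinate axes that capture the most variance
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                      # (100, 2)
print(pca.explained_variance_ratio_.sum())  # close to 1.0: essence preserved
```

Because the data is essentially 2-dimensional, two principal components retain almost all of the variance, which is exactly the "preserving its essence" point from the summary.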
It's like I'm watching a Netflix series: at first you're introduced to different terms, methods, and their use cases, and in the last 10 minutes of the video everything adds up and you realize what these strategies are and why they are used. Amazing.
You have done good research on every topic, bro; nice explanation. I am so happy I found this channel, and at the same time I feel bad for not finding it earlier.
Exactly, same here.
Dude, your teaching is awesome...
Can't have a better understanding of PCA than this. Saved so much time and energy. Thanks a lot.
I am loving this channel more and more every time I see a video here. The way the content is presented and created is really awesome. Keep inspiring and motivating us; I am learning a lot here.
Wow, I regret not finding this channel earlier. It's very clear, like a story; I could explain it to a 6-year-old and make them understand ❤️👏
Totally an awesome playlist for learning Data Science/Mining or ML. Thank you so much, sir! Means a lot!!
No words to express how precious your teaching is....
Thank You Sir.
To have this level of teaching, one should have a deep understanding of both the theoretical and practical aspects. You have proved it again. Thank you for providing such valuable teaching.
One of the finest explanations of PCA I have ever seen. Thank you, sir!
First of all, the playlist is amazing; you have done a really good job explaining the concepts and intuitions behind the algorithms. I was wondering, could you create a separate playlist for the ARIMA, SARIMAX, and LSTM algorithms? I really want to see those algorithms in a future class.
Sir, the way you explained the Curse of Dimensionality and its solutions in the previous video was just mind-blowing..... YOU ARE GOD
Excellent sir
I have listened to different video lectures on PCA,
but I didn't understand it properly.
Yours is the best one.
Thank you so much
Never have I seen a better explanation of PCA than this!
Awesome explanation, and the best part is how he drops important info in between topics. For example, this video has such a good interpretation of a scatter plot that I wouldn't find even in a dedicated scatter-plot video. So perfect.
Amazing explanation... No one can explain PCA as easily as you have done. Better than IIT professors.
Beautifully explained !!! Probably the best analogy one could come up with. Thank you, sir.
Your teaching style is amazing; you are a gem.
I'm interested in group study with you; reply to me, bro.
Very nicely explained topics. One of the best teachers on ML.
Best video for PCA. I'll definitely recommend it to my friends 🙂
I thank God for blessing me with this teacher.
YOU ARE THE BEST TEACHER
You are outstanding to me, sir... I am not able to understand until I watch your videos.
Wow, how simply you did it.
Bro explained in just 25 minutes what my professors couldn't in a week.
Bro, why isn't this video going viral? Thank you, sir ❤
The variance of grocery shops is greater than that of the number of rooms, but you have shown the reverse.
I have a doubt: if one variable is in the range 0 to 1 and another variable is in the range 0 to 1000 (so it will have more variance/spread), why does choosing the 2nd variable just by looking at variance make sense? It may just be a matter of units, like km vs. cm. For this problem we use scaling. Am I right?
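(A quick numeric check of the point in this comment, with made-up data: on raw data, the wide-range feature dominates PCA's first component; after standardizing, both features get comparable weight.)

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Two independent features on very different scales: [0, 1] vs [0, 1000]
X = np.column_stack([rng.uniform(0, 1, 500), rng.uniform(0, 1000, 500)])

# Raw data: the variance of the 0-1000 column swamps the other,
# so PC1 points almost entirely along the second feature
pc1_raw = PCA(n_components=1).fit(X).components_[0]
print(np.abs(pc1_raw))  # weight on feature 2 is ~1, on feature 1 is ~0

# After scaling, each feature has unit variance, so PC1 weights are comparable
X_scaled = StandardScaler().fit_transform(X)
pc1_scaled = PCA(n_components=1).fit(X_scaled).components_[0]
print(np.abs(pc1_scaled))  # both weights ~0.707
```

So yes: variance comparisons across features are only meaningful after bringing them to the same scale, which is why standardizing before PCA is the usual advice.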
Sir, your videos are really amazing; I have learned a lot from them. But I have a doubt about feature construction and feature extraction. They both look similar, so can you please tell me the one major difference between the two?
I'm interested in group study with you; reply to me, bro.
Damn, you are the Messiah of ML teaching.
Thanks for the great explanation. Please keep explaining in this way.
Very nice explanation. May I know which hardware you use to write on the notepad?
But in the geometric intuition of PCA, if I rotate the axis clockwise, then the variance of rooms will decrease, right? And if I do the same process by taking washrooms on the x-axis and rooms on the y-axis, then washrooms will get selected, right??
You are so good at this, I'm like "where were you all this time?"
Best course for ML
Hi bro, please make videos on feature selection techniques.
Dear sir, I am confused about the variance formula and your interpretation. Kindly recheck.
What is the difference between feature extraction and feature construction, as both reduce the number of features?
Is it possible to have an example of classifying pictures into two categories,
where the dimensions are reduced with PCA and then the classification with KNN works better? Please.
Thanks for the explanations!
Amazing Explanation
Very beautiful 👍👍🙏❤️🔥
Amazing explanation... Can you share the OneNote for Windows 10 notes for this entire series, "100 Days of Machine Learning"?
Such an underrated channel for ML.
Sir, I just wanted to ask: can we write our own machine learning algorithms from scratch instead of using sklearn and TensorFlow? Please make a video about that. I have been following your whole series. Sir, do reply. Thanks for your efforts.
Yes, you can write them, yaar...
Yes you can...
I'm interested in group study with you; reply to me, bro.
@@vikramraipure6366 Actually, currently I am working on some other project, so... I am sorry.
thanks for the proposal!
My suggestion is to use the sklearn library for existing algorithms. If that doesn't work, create your own algorithm.
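(On the from-scratch question above: PCA itself is short enough to implement directly from its definition, eigendecomposition of the covariance matrix. A rough sketch with my own variable names, which you can sanity-check against sklearn's PCA.)

```python
import numpy as np

def pca_from_scratch(X, n_components):
    """Project X onto its top principal components via covariance eigendecomposition."""
    X_centered = X - X.mean(axis=0)          # center each feature at zero
    cov = np.cov(X_centered, rowvar=False)   # d x d covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # sort axes by variance, descending
    components = eigvecs[:, order[:n_components]]
    return X_centered @ components           # change of basis onto the new axes

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
Z = pca_from_scratch(X, 2)
print(Z.shape)  # (50, 2)
```

By construction the first returned column has the largest variance, the second the next largest; up to a sign flip per component, the projection matches what `sklearn.decomposition.PCA` produces.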
great example
Nice Presentation sir
U rock dude! Really appreciate that
amazing explanation
hats off to you sirrrr
Wowww!!!! Best video
Sir, will we get your notes?
you are the god
Excellent
Bro, please make a playlist for statistical applications in Data Science.
Just Wow 🔥 😍
That is what we do.
Amazing explanation!!
Great content!!!
Clever explanation
best explanation
24:24 Aha! So PCA finds an alternate coordinate system and uses the change-of-basis matrix to transform the data.
When God decided to come to Earth and teach people some ML concepts, that was the same day this man was born.
Does this work the same as SVM? 🤔
Yes
@@campusx-official Thank you, sir... your class is awesome 🙏
solid
thanks
Thank you, sir
I'm interested in group study with you; reply to me, bro.
Done... Give me your mobile number.
I will call you when I'm free.
Day 3:
Date: 11/1/24
❤❤❤❤❤❤❤❤❤❤❤❤❤❤
1:05 By the time India had only just become independent, PCA had already been invented abroad ☠