4 important points from the video (a short code sketch follows the list):
1. PCA helps address the problem of overfitting.
2. PCA reduces a high-dimensionality dataset to a low-dimensionality one.
3. The number of PCs can be less than or equal to the number of attributes, although it also depends on other factors such as the dimensionality of the data.
4. PCs should be orthogonal, i.e. independent of each other.
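A minimal sketch of these points, assuming numpy and scikit-learn are available; the dataset and variable names here are hypothetical, purely for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical dataset: 100 samples, 5 attributes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Keep only 2 principal components -> low-dimensional representation.
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)          # shape (100, 2)

# Number of PCs is at most the number of attributes (and at most the number of samples).
print(pca.components_.shape)              # (2, 5)

# The PCs are orthogonal: their dot product is (numerically) zero.
print(np.dot(pca.components_[0], pca.components_[1]))
```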
Thanks bro
Thanks
You saved me a lot of time
5th point: today's video is going to be amazing 😂😂
Thank you for saving our career ❤️
01:09 PCA helps in overcoming the problem of overfitting caused by too many attributes and features during the training phase.
02:18 Principal Component Analysis (PCA) helps reduce overfitting
03:27 Principal component analysis helps in reducing overfitting by reducing dimensions and finding principal components
04:36 Principal components can be found using views to analyze the data from different perspectives.
05:45 The model generated two principal components: PC1 and PC2.
06:54 Principal components can be generated from multiple attributes and reduce the dimensionality
08:03 Give highest importance to PC1 and reduce priority for other principal components.
09:07 Principal Component Analysis (PCA) explained in a nutshell
Huge respects sir !!! You are surely 100% better than those University lecturers !! Because of u I can easily clear my concepts of ML, ERTOS, ICS ! Thank you so much for the help !!! I really appreciate that you are doing this with no returns and just giving away free education !! Hats off !!!!
How come I end up finding the best teachers on YouTube one day before my exam? Haha
Because we start searching for videos only one day before the exam😂
@@vishnum9613 right 😂 6 hours remaining and it's 3:24 am😂
😂 Because we are not worried about stuff until it is very close to us
But why do you guys wait for the last moment??
It's a talent possessed only by back benchers 😂🤣🤣
He is the best, man!!
Amazing learning videos. During every exam paper he is there to help.
Thanks sir, more power to you!
I never knew Rohit Sharma was this good at ML. Way to go champ
100% satisfaction is guaranteed on a topic while watching your videos sir.
Thank you so much
Best tutorial found on YouTube...!!
Brother, thanks man, you teach in such a simple way that it's a joy... please, brothers, like this video and subscribe too... thanks, man.
good explanation buddy
You seem to be enjoying a kick to your own stomach (hurting your own livelihood).
At least he is supporting better content without being arrogant! It's not a matter of hurting his own livelihood.
Wow, a comment from LMT ❤️
It was such a confusing topic and you made it so easy... thanks a ton, sir.
PCA:
Need: overfitting; too many attributes and features need to be reduced before training.
PCA reduces overfitting (in overfitting the model tries to reach every point) by taking data from high dimensionality to low dimensionality.
Views: from the top, PC1; from another viewpoint, PC2. PC1 gets higher priority. PC1 and PC2 must have the orthogonal property, i.e. they are independent of each other.
Thanks!
Question: When we project our data points onto PC1, all the points get cast onto that line, and the same goes for PC2: all the points get cast onto PC2. Then how are they independent? The same point can be found on PC1 as well as PC2 (my assumption).
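One way to see the independence, as a minimal sketch assuming numpy/scikit-learn (the toy data is hypothetical): every point does get a coordinate on both PC1 and PC2, but because the directions are orthogonal the two sets of coordinates are uncorrelated, so PC2 carries no information already captured by PC1.

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy 2-D data with correlated attributes (hypothetical values, for illustration only).
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
X = np.column_stack([x1, 2 * x1 + rng.normal(scale=0.3, size=200)])

scores = PCA(n_components=2).fit_transform(X)   # column 0 = PC1 scores, column 1 = PC2 scores

# Every point has a projection on both PCs, yet the projections are
# non-redundant: their correlation is (numerically) zero.
print(np.corrcoef(scores[:, 0], scores[:, 1])[0, 1])
```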
Awesome Sir,
A vigorous teacher, Quality Unmatched.
And Iqbal Bazmi sahab, come over to the room sometime.
Not only engineering... it's also for geography ❤️
Sir, please continue making videos, your channel is literally a gold mine. You teach far better than our US university professors.
If you're studying from US university professors, then what are you doing on YouTube, brother?
I seriously don't know how you have so few subscribers; you are a life saver and obviously a good teacher/explainer 🙏🙏🙏
Keep up the good work
According to Andrew Ng's machine learning course, PCA should be used to increase the speed of the learning algorithm, not to prevent overfitting; use regularisation to prevent overfitting.
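A minimal sketch of that advice, assuming scikit-learn (the data and the alpha value are hypothetical): regularisation penalises large coefficients to curb overfitting, while PCA is applied separately to shrink the input dimension and speed training up.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 50))                        # many attributes (hypothetical data)
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=150)

# Regularisation to control overfitting: L2 penalty on the coefficients.
ridge = Ridge(alpha=1.0).fit(X, y)

# PCA to speed things up: train on fewer derived features.
fast_model = make_pipeline(PCA(n_components=10), Ridge(alpha=1.0)).fit(X, y)

print(ridge.score(X, y), fast_model.score(X, y))
```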
When searching for a topic, your videos don't show up for some topics, but if even one video turns up, that's it, the matter is settled and there's a special kind of happiness on my face 😁😁
This is my favourite YouTube channel because it always reduces my exam stress and tension 😊😊
Sir, I watched each video on your channel for my 8th-sem final papers; they are helping me a lot, and I'm from RGPV. Thank you so much.
Wow, man
How many marks did you get?
@@PranavKumar1991 Above 75
If you watch sir at 1.5x, you'll just love the energy.
I am already loving it ❤️ ....at 1.5x though 😂
Sir, you are really superb.. 👍 Please continue all this. 👏👏👏👏👏👏⚘⚘⚘⚘
You explained a very complicated idea with very simple tips. Thanks brother. Stay at peace.
Happy Teachers' Day 💐💐
Well, I must appreciate your work... I just want to thank you for saving time and coming straight to the point... I want a video on regularization... please 😄 😄 😄
Sir, you look like Desi Gamers' "Amit bhai" 😂
yas!
This video is far better than my college professor's lectures.
Thank you Sir, videos from this series helped me get a clear understanding of the concepts. Keep making such videos, they sure help a lot.
Sir/bhaiya, you explain things so well... and for free, too.
you are the best teacher
THANK YOU SIR, I just love the energy with which you teach. Thank you so much sir, you are a great teacher.
Appreciate your effort. Your videos are very informative and easy to understand.
Great video. A few points need clarification:
1. Why will only PC1 be considered? Does that mean only one view is ever considered?
2. What is the "view" exactly?
3. How does being orthogonal make them different? (Is it the orthogonality property?)
4. (Most important) Just by having a different view, how are the features reduced? Aren't we still putting all the features into training? (This explanation was abstract; a bit more technical detail would have done wonders.)
Salute, sir!!! You explained this very nicely compared to university teachers; I could clear my ML concepts. Thank you sir, huge respect for you, keep it up, sir!
I came just to understand PCA, but I loved your way of explaining and now I am a new subscriber.
Very easily explained and easy to understand, amazing! Keep up the good work :)
5 Minutes Engineering, Gate Smashers and Sanchit sir are the life savers 🌚🌚 lots of love....!!
true hehe 😹
Sir, I've watched as many of your videos as possible, so I think no one is better than you.
And now we need videos on the "Find-S algorithm" and "candidate elimination".
🙏 If it's possible, sir, this is my humble request: please make these videos 🙏
Very good explanation in short time... 👍👍
Best and simplest possible explanation.
Huge respects sir !! Thank you sir
Why didn't I watch this before? Very helpful, thank you.
Yes sir... as time is short, please concentrate only on the important topics.
Your vedio is very very much helpful for my samester exams.
Thanku so much sir...
*Video. not Vedio.
Very nice and precise explanation. You did a lot of homework on PCA to make it this precise. Thank you.
Great explanation sir ..thank you sir ☺️
Well done bro. The videos are amazing.
Amazing Effort !! 😄 😄
Nice content, it really helps me a lot..... I request you to keep uploading more and more videos. Once again, thank you so much.
Good to see such an informative video.
There's a small silly mistake: it's "Principal", not "Principle".
"Principle" is indeed correct.
Ok
Pin it sir
Wow....nice
Superb explanation.
Very good ML content; people are paying so much money for ML courses without looking at this content.
Thanks brother, it was very helpful; you gained a subscriber.
Please do a video on SVD (Singular Value Decomposition). I really love your videos, they are very useful. Thank you so much.
Sir, you deserve the Nobel Prize. Your way of explaining is so amazing.
Excellent Explanation, Thank U sir
Brother, that was a treat....
Superb Explanation !
When you build a model from data, the model summary gives R-sq and R-sq(adj) values, and a difference between them of more than about 20% suggests the model is overfitting: as you keep adding independent variables (attributes), R-sq keeps increasing, but if R-sq(adj) keeps decreasing, the added terms are not improving the model, only inflating it. We can either drop the terms that are not improving the model, or perform principal component analysis, which reduces the dimensions without dropping the attributes. A basic rule for selecting principal components (PCs) from all the PCs produced is to look at their eigenvalues and keep those PCs whose eigenvalues are >= 1.
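A minimal sketch of that eigenvalue >= 1 selection rule (the Kaiser criterion), assuming numpy/scikit-learn and standardised attributes; the dataset here is hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 8))                 # hypothetical dataset with 8 attributes

# The eigenvalue >= 1 rule is normally applied to the correlation matrix,
# i.e. to PCA on standardised attributes.
X_std = StandardScaler().fit_transform(X)
pca = PCA().fit(X_std)

eigenvalues = pca.explained_variance_         # one eigenvalue per principal component
keep = eigenvalues >= 1.0
print(eigenvalues)
print("PCs kept:", int(keep.sum()))
```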
Awesome, really awesome explanation sir, thank you so much.
Thank you so much sir,....you are great.😍
Thanks a lot for this video💯💯💯💯💯💯💯💯
Love From IIT Dholakpur Sir
Thank you so much sir...very useful video sir...😊
Out of multiple PCs, the importance or focus given to a particular PC is based on its variance, not simply on it being the 1st PC.
Brother, you seem to have good knowledge. Will you work with me on DIP?
Mind blowing sir
Awesome work!
sir thank you very much, you explain very well.
Good and easy explanation. You could have clarified more about why PC1 is selected over PC2; there is a reason for it.
Best video, sir, it helped me a lot.
I understood everything thank you so much 👌👌
Thank you for explaining very well. I just like how simply you can explain any complex topic.
There's only one heart; how many times will you win it, sir? 😂
THANKS MANNNNNN
Thank you sir.. You are doing a great job
I am going to like all your videos.
Thank you for the PCA explanation. I'm new to this field and to AI, and I find many things in the syllabus hard to understand, but you explain them in an easy way. Keep it up, and if you can, please also make videos on bagging and bootstrapping, PCA, and LDA.
❤️thank you sir..great explanation
Can you please upload a practical on PCA?
Love you brother 🌻🌻🌻
Thanks for sharing.... Good efforts
Thank you sir ☺️🙏
You said we have to give importance to PC1 first out of the many PCs. My question is: why is PC1 more important, and why does the importance decrease as we go down to PC2, PC3 and so on?
Because PC1 captures the highest variance of the data, and PC2 captures less than PC1. The variance decreases as we go from PC1 to PC2 and so on.
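A minimal sketch of that ordering, assuming scikit-learn (the toy data and scaling factors are hypothetical): the explained-variance ratios come out sorted from PC1 downwards.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 4)) * np.array([5.0, 2.0, 1.0, 0.5])   # hypothetical attribute scales

ratios = PCA().fit(X).explained_variance_ratio_
# PC1 explains the largest share of variance, PC2 the next largest, and so on.
print(ratios)                        # monotonically decreasing
print(ratios.argmax() == 0)          # True: PC1 comes first
```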
Finally someone asked the right questions.
Wow, Shridhar, well done, you've made a good video. Do you still remember me, or have you forgotten me?
Brother Shridhar has become a big man now ;-) do you remember me, brother?
It was a nice and simple explanation.
awesome i loved it...
great explanation
Preventing overfitting is a bad use of PCA. The main reasons for PCA are better visualization, speeding up the process, and reducing memory. Source: Andrew Ng's Machine Learning course.
You should know how to project a point onto a unit vector. Here you have to find a direction (an eigenvector) from the covariance of your given data points. The projection onto eigenvector v1 can then be your feature in 1-D space.
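A minimal numpy sketch of that idea (the toy points are hypothetical): eigen-decompose the covariance matrix, take the top eigenvector v1 as the direction, and project each point onto it to get a 1-D feature.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=100)
X = np.column_stack([x, 1.5 * x + rng.normal(scale=0.4, size=100)])   # hypothetical 2-D points

Xc = X - X.mean(axis=0)                       # centre the data
cov = np.cov(Xc, rowvar=False)                # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)        # eigh returns eigenvalues in ascending order

v1 = eigvecs[:, -1]                           # unit eigenvector with the largest eigenvalue
feature_1d = Xc @ v1                          # projection of every point onto v1
print(v1, feature_1d[:5])
```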
Great Explanation :)
I would like to add that the reference views you asked us to look from are not actually like that. We don't look from any random side; we look either standing on the x1 dimension or standing on the x2 dimension (the same way we look at the x component or y component of a vector, i.e. the projection of a 2-dimensional object onto 1 dimension).
Thanks also from the agricultural side.
The following are measurements on the test scores (X, Y) of 6 candidates for two subject examinations:
(50, 55), (62, 92), (80, 97), (65, 83), (64, 95), (73, 93)
Determine the first principal component for the test scores using Hotelling's iterative procedure.... !! Sir, how do I do this???
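Not the exact hand calculation asked for, but a minimal numpy sketch of Hotelling's iterative (power) method on those six score pairs: repeatedly multiply a trial vector by the covariance matrix and normalise until it converges to the first eigenvector, i.e. the direction of the first principal component.

```python
import numpy as np

# Test scores (X, Y) of the 6 candidates, as given in the question.
scores = np.array([[50, 55], [62, 92], [80, 97], [65, 83], [64, 95], [73, 93]], dtype=float)

S = np.cov(scores, rowvar=False)          # 2x2 sample covariance matrix

# Hotelling's power iteration: v_{k+1} = S v_k / ||S v_k||
v = np.array([1.0, 1.0])                  # arbitrary non-zero starting vector
for _ in range(100):
    w = S @ v
    v_new = w / np.linalg.norm(w)
    if np.allclose(v_new, v, atol=1e-10):
        break
    v = v_new

eigenvalue = v @ S @ v                    # largest eigenvalue = variance along PC1
print("PC1 direction:", v)
print("Eigenvalue:", eigenvalue)
```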
I have gone through all your videos and you explain everything exceptionally well, but this time I didn't get what you were trying to say in this video. I have watched it multiple times, but it is still difficult to understand.