Basics Of Principal Component Analysis Part-1 Explained in Hindi ll Machine Learning Course

Share
Embed
  • Published on 21 Jan 2025

Comments • 329

  • @artyCrafty4564
    @artyCrafty4564 a year ago +172

    4 important points from the video --
    1. PCA solves the problem of overfitting.
    2. PCA reduces a high-dimensionality dataset to a low-dimensionality one.
    3. The number of PCs can be less than or equal to the number of attributes, although it also depends on other factors, such as the dimensionality.
    4. PCs should be orthogonal, i.e., independent of each other.
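[Editor's note: the four points above can be sketched in a few lines of NumPy. The toy dataset and variable names below are illustrative assumptions, not from the video.]

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy dataset: 100 samples, 3 attributes, where the 3rd attribute is
# nearly a linear combination of the first two (a redundant dimension).
X = rng.normal(size=(100, 2))
X = np.column_stack([X, X @ np.array([0.7, 0.3]) + 0.01 * rng.normal(size=100)])

Xc = X - X.mean(axis=0)                  # centre the data
cov = np.cov(Xc, rowvar=False)           # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # sort PCs by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Point 3: at most as many PCs as attributes (here 3).
# Point 2: keep only the top-2 PCs, so 100x3 data becomes 100x2.
scores = Xc @ eigvecs[:, :2]

# Point 4: PCs are orthogonal (dot product of distinct PCs is ~0).
print(scores.shape, eigvecs[:, 0] @ eigvecs[:, 1])
```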

    • @Saif1412-y1v
      @Saif1412-y1v a year ago

      Thanks bro

    • @SujitKUmar-gy5xr
      @SujitKUmar-gy5xr 7 months ago

      Thanks

    • @naveenrawat1549
      @naveenrawat1549 a month ago

      You saved me a lot of time

    • @mradnanyasir
      @mradnanyasir a month ago

      5th point: today's video is going to be amazing 😂😂

  • @sereto7867
    @sereto7867 2 years ago +27

    Thank you for saving our career ❤️

  • @prathmeshphatake1948
    @prathmeshphatake1948 3 months ago +10

    01:09 PCA helps in overcoming the problem of overfitting caused by too many attributes and features during the training phase.
    02:18 Principal Component Analysis (PCA) helps reduce overfitting
    03:27 Principal component analysis helps in reducing overfitting by reducing dimensions and finding principal components
    04:36 Principal components can be found using views to analyze the data from different perspectives.
    05:45 The model generated two principal components: PC1 and PC2.
    06:54 Principal components can be generated from multiple attributes and reduce the dimensionality
    08:03 Give highest importance to PC1 and reduce priority for other principal components.
    09:07 Principal Component Analysis (PCA) explained in a nutshell

  • @kaustubh7304
    @kaustubh7304 4 years ago +36

    Huge respect, sir!!! You are surely 100% better than those university lecturers!! Because of you I can easily clear my concepts of ML, ERTOS, ICS! Thank you so much for the help!!! I really appreciate that you are doing this with no returns, just giving away free education!! Hats off!!!!

  • @basudhasakshyarika1592
    @basudhasakshyarika1592 3 years ago +268

    How come I end up finding the best teachers on YouTube one day before my exam. Haha

    • @vishnum9613
      @vishnum9613 3 years ago +41

      Because we start searching for videos only one day before the exam 😂

    • @lvl-x_Esport
      @lvl-x_Esport 3 years ago +18

      @@vishnum9613 Right 😂 6 hours remaining and it's 3:24 am 😂

    • @doctorstrange4127
      @doctorstrange4127 2 years ago +6

      😂 Because we are not worried about things until they are very close to us

    • @iamrichaf1616
      @iamrichaf1616 2 years ago

      But why do you guys wait for the last moment??

    • @0_Somebody_1
      @0_Somebody_1 2 years ago +1

      It's a talent possessed only by back benchers 😂🤣🤣

  • @johnwicckk
    @johnwicckk 2 years ago +6

    He is the best, man!!
    Amazing learning videos. During every exam paper he is there to help.
    Thanks sir, more power to you!

  • @DoomedVortex
    @DoomedVortex a year ago +2

    I never knew Rohit Sharma was this good at ML. Way to go champ

  • @prasanthkumar6393
    @prasanthkumar6393 2 years ago +4

    100% satisfaction is guaranteed on a topic while watching your videos, sir.
    Thank you so much

  • @faizejafri1014
    @faizejafri1014 5 years ago +4

    Best tutorial found on YouTube...!!

  • @manujpande8544
    @manujpande8544 5 years ago

    Brother, thanks! You teach in such a simple way that it becomes a joy... brothers, please like and subscribe... thanks, man.

  • @Lastmomenttuitions
    @Lastmomenttuitions 5 years ago +71

    good explanation buddy

    • @pradumnasoni1652
      @pradumnasoni1652 5 years ago +4

      You're enjoying a kick to your own livelihood, are you?

    • @Bhatonia_Jaat
      @Bhatonia_Jaat 4 years ago +5

      At least he is supporting the better content without being arrogant! There's no question of a kick to anyone's livelihood here.

    • @raghav042
      @raghav042 3 years ago +1

      Wow, a comment from LMT ❤️

  • @pranay6708
    @pranay6708 5 years ago +6

    It was such a confusing topic and you made it so easy... thanks a ton, sir.

  • @lunapotter5593
    @lunapotter5593 2 years ago +4

    PCA:
    Need: too many attributes and features before training cause overfitting, so they need to be reduced.
    PCA reduces overfitting (an overfit model tries to reach every point) by taking high-dimensionality data to low dimensionality.
    Views: from the top, PC1; from another viewpoint, PC2. PC1 gets higher priority. PC1 and PC2 must have the orthogonal property, i.e., be independent of each other.

  • @Baetu123
    @Baetu123 3 years ago

    Thanks!

  • @alimehmood8654
    @alimehmood8654 4 years ago +13

    Question: When we project our attributes onto PC1, all the points get projected onto that line, and the same goes for PC2: all the points get projected onto PC2 as well. Then how are they independent? We can find the same point on PC1 as well as PC2 (my assumption).

  • @muhammadiqbalbazmi9275
    @muhammadiqbalbazmi9275 5 years ago +10

    Awesome, sir.
    A vigorous teacher; quality unmatched.

  • @benojiryasmin9174
    @benojiryasmin9174 3 years ago +8

    Not only engineering... it's also for geography ❤️

  • @sam9620
    @sam9620 4 years ago

    Sir, please continue making videos; your channel is literally a gold mine. You teach far better than our US university professors.

    • @siddheshbandgar6927
      @siddheshbandgar6927 3 years ago

      If you're studying under US university professors, then what are you doing on YouTube, bro?

  • @creator025
    @creator025 5 years ago +3

    I seriously don't know how you have so few subscribers. You are a life saver and obviously a good teacher/explainer 🙏🙏🙏
    Keep up the good work

  • @vaibhavdiwan1569
    @vaibhavdiwan1569 4 years ago +5

    According to Andrew Ng's machine learning course, PCA should be used to increase the speed of the learning algorithm, not to prevent overfitting; use regularisation to prevent overfitting.

  • @ashwinbankar9
    @ashwinbankar9 a year ago

    While searching a topic, your videos don't always turn up for some topics, but if even one video shows up, that settles it; a special kind of happiness comes over the face 😁😁

  • @sakshibagade7092
    @sakshibagade7092 5 months ago

    This is my favourite YouTube channel because it always reduces my stress and exam tension 😊😊

  • @rickyraina8266
    @rickyraina8266 5 years ago +5

    Sir, I watched every video on your channel for my 8th-sem final papers. They are helping me a lot, and I'm from RGPV. Thank you so much

  • @mr.curious1329
    @mr.curious1329 3 years ago +2

    If you watch sir at 1.5x, you'll just love the energy.
    I am already loving it ❤️ ...at 1.5x though 😂

  • @ASh-hb1ub
    @ASh-hb1ub 4 years ago

    Sir, you are really superb..👍 Please continue all this.👏👏👏👏👏👏⚘⚘⚘⚘

  • @samiuddin6696
    @samiuddin6696 3 years ago

    You explained a very complicated idea in very easy steps. Thanks brother. Stay at peace.

  • @a-archanabichkule
    @a-archanabichkule a year ago +2

    Happy teacher's day 💐💐

  • @sonalisingh2136
    @sonalisingh2136 5 years ago +3

    Well, I must appreciate your work. I just want to thank you for saving time and coming straight to the point. I want a video on regularization... please 😄 😄 😄

  • @RAKESH-ie1vb
    @RAKESH-ie1vb a year ago +14

    Sir, you look like Desi Gamers' "Amit bhai" 😂

  • @sahil2pradhan
    @sahil2pradhan 2 years ago

    This video is far better than my college professor's lectures.

  • @pratikpande5917
    @pratikpande5917 5 years ago +6

    Thank you, sir. Videos from this series helped me get a clear understanding of the concepts. Keep making such videos; they sure help a lot.

  • @nagraj0308
    @nagraj0308 4 years ago

    Sir/bhaiya, you explain things so well... and for free

  • @apurvaghodeswar9264
    @apurvaghodeswar9264 2 years ago +1

    you are the best teacher

  • @sshubam
    @sshubam 2 years ago

    THANK YOU SIR, I just love the energy with which you teach. Thank you so much sir, you are a great teacher.

  • @saqib317
    @saqib317 a year ago

    Appreciate your effort. Your videos are very informative and easy to understand.

  • @hayatt143
    @hayatt143 4 years ago

    Great video. A few points need clarification:
    1. Why is only PC1 considered? Does that mean only one view is always used?
    2. What exactly is a view?
    3. How does being orthogonal make them different? (Is it the orthogonal property?)
    4. (Most important) Just by taking a different view, how are the features reduced? Aren't we still putting all the features into training? (This explanation was abstract; a bit more technical detail would have done wonders.)

  • @ompandya30
    @ompandya30 a year ago

    Salute, sir!!! You explain very nicely compared to university teachers; I could clear my ML concepts. Thank you sir, huge respect for you, keep it up!!

  • @adityakumarmishra8734
    @adityakumarmishra8734 4 years ago

    I came just to understand PCA, but I loved your way of explaining, and now I am a new subscriber.

  • @jeniajeba7230
    @jeniajeba7230 5 years ago +2

    Very easily explained and easy to understand. Amazing! Keep up the good work :)

  • @ritikarauthan3304
    @ritikarauthan3304 2 years ago +1

    5 Minutes Engineering, Gate Smashers and Sanchit sir are the life saviours 🌚🌚 lots of love....!!

  • @kaushalendrarathour9909
    @kaushalendrarathour9909 4 years ago

    Sir, I've watched as many of your videos as possible, and I think no one is better than you.
    Now we need videos on the "Find-S algorithm" and "Candidate Elimination".
    🙏 If possible, sir, this is my humble request: please make these videos 🙏

  • @banditasahoo9663
    @banditasahoo9663 4 years ago

    Very good explanation in a short time... 👍👍

  • @madhushreearun1089
    @madhushreearun1089 5 years ago +2

    Best and simplest possible explanation.

  • @Hayat26474
    @Hayat26474 7 months ago

    Huge respect, sir!! Thank you, sir

  • @MrDeepak8866
    @MrDeepak8866 5 years ago +1

    Why didn't I watch this before? Very helpful, thank you

  • @maxpayne880
    @maxpayne880 5 years ago

    Yes sir... as time is short, please concentrate only on the important topics

  • @JyotiSingh-rz2gg
    @JyotiSingh-rz2gg 4 years ago

    Your video is very helpful for my semester exams.
    Thank you so much, sir...

  • @mohammedrehman4109
    @mohammedrehman4109 4 years ago

    Very nice and precise explanation. You did a lot of homework on PCA to make it precise. Thank you

  • @seducation9982
    @seducation9982 a year ago

    Great explanation sir... thank you sir ☺️

  • @minhaaj
    @minhaaj 4 years ago

    Well done, bro. The videos are amazing.

  • @bhavya2301
    @bhavya2301 4 years ago

    Amazing effort!! 😄 😄

  • @nishiraju6359
    @nishiraju6359 4 years ago

    Nice content, it really helps me a lot... Request you to keep uploading more and more videos... Once again, thank you so much

  • @Fatima-pp5ue
    @Fatima-pp5ue 3 years ago

    Good to hear such an informative video

  • @5MinutesEngineering
    @5MinutesEngineering  5 years ago +10

    There's a small silly mistake: it's "Principal", not "Principle"

  • @SB_Roy_Vlogs
    @SB_Roy_Vlogs a year ago +1

    Wow.... nice

  • @freetube7767
    @freetube7767 5 years ago +1

    Superb explanation.

  • @akashpal3415
    @akashpal3415 5 years ago

    Very good ML content. People are paying so much money for ML courses without looking at this content

  • @thedeepakmor
    @thedeepakmor 4 years ago

    Thanks brother, it was very helpful. You gained a subscriber

  • @nalisharathod6098
    @nalisharathod6098 4 years ago +3

    Please do a video on SVD (Singular Value Decomposition). I really love your videos, very useful. Thank you so much

  • @naeemchaudry733
    @naeemchaudry733 3 years ago +3

    Sir, you deserve the Nobel Prize. Your way of explaining is so amazing.

  • @sumeetkaur902
    @sumeetkaur902 3 years ago

    Excellent explanation, thank you sir

  • @sharadpkumar
    @sharadpkumar 10 months ago

    Bro, that was great fun....

  • @l2mbenop346
    @l2mbenop346 5 years ago +1

    Superb explanation!

  • @shashankparihar4984
    @shashankparihar4984 3 years ago

    When you build a model on data, the model summary gives R-sq and R-sq(adj) values; a difference between them of more than about 20% indicates the model is overfitting. As you keep adding independent variables (attributes) to the model, R-sq keeps increasing, but if R-sq(adj) keeps decreasing, the added terms are inflating the model rather than improving it. We can either drop the terms that are not improving the model, or perform principal component analysis, which reduces the dimensions without dropping the attributes. The basic rule for selecting principal components (PCs) from all the given principal components is to look at their eigenvalues and select those PCs whose eigenvalues are >= 1.
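[Editor's note: the "eigenvalues >= 1" rule mentioned above (the Kaiser criterion) is stated for PCA on the correlation matrix, i.e. standardized attributes, where an eigenvalue of 1 means a PC explains as much variance as one original attribute. A minimal sketch with a made-up dataset:]

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: 200 samples, 4 attributes; the last two are noisy copies
# of the first two, so roughly 2 PCs should carry most of the variance.
base = rng.normal(size=(200, 2))
X = np.column_stack([base, base + 0.3 * rng.normal(size=(200, 2))])

# Standardize, then take eigenvalues of the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Kaiser rule: keep only PCs whose eigenvalue is >= 1.
kept = eigvals[eigvals >= 1.0]
print(len(kept), "of", len(eigvals), "PCs kept")
```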

  • @sudarshandev6369
    @sudarshandev6369 3 years ago

    Awesome, truly awesome explanation, sir. Thank you so much

  • @deepsant2372
    @deepsant2372 5 years ago +2

    Thank you so much sir... you are great. 😍

  • @vedant6460
    @vedant6460 2 years ago

    Thanks a lot for this video 💯💯💯💯💯💯💯💯

  • @devr4j
    @devr4j 7 months ago

    Love From IIT Dholakpur Sir

  • @gayathri5216
    @gayathri5216 4 years ago

    Thank you so much sir... very useful video, sir... 😊

  • @ajaymehta8289
    @ajaymehta8289 5 years ago +2

    Out of multiple PCs, the importance of (or focus on) a particular PC is based on its variance, not directly on its being the 1st PC.

    • @jayantbhardwaj4694
      @jayantbhardwaj4694 5 years ago

      Bro, you seem to have good knowledge. Will you work with me on DIP?

  • @silparaniswain5492
    @silparaniswain5492 a year ago

    Mind-blowing, sir

  • @ShalabhBhatnagar-vn4he
    @ShalabhBhatnagar-vn4he 4 years ago

    Awesome work!

  • @manishn2442
    @manishn2442 2 years ago

    Sir, thank you very much; you explain very well.

  • @aloktiwari1109
    @aloktiwari1109 5 years ago

    Good and easy explanation. You could clarify further why PC1 is selected over PC2; there is a reason for it.

  • @rubina-hq3gc
    @rubina-hq3gc 3 years ago

    Best video, sir. It helped me a lot

  • @weekendvibes468
    @weekendvibes468 3 years ago

    I understood everything, thank you so much 👌👌

  • @MrSoumyabrata
    @MrSoumyabrata 4 years ago

    Thank you for explaining very well. I just like how simply you can explain any complex topic.

  • @siddharthpatil1879
    @siddharthpatil1879 3 years ago

    I have only one heart; how many times will you win it, sir? 😂

  • @vickyrajray2952
    @vickyrajray2952 a year ago +1

    THANKS MANNNNNN

  • @tapanjeetroy8266
    @tapanjeetroy8266 5 years ago

    Thank you sir.. You are doing a great job

  • @nagraj0308
    @nagraj0308 4 years ago

    I am going to like all your videos

  • @aejazvelani4213
    @aejazvelani4213 4 years ago +1

    Thank you for the PCA explanation. I'm new to this field and to AI, and many things in the syllabus are hard to understand, but you explain them in an easy way. Keep it up. Also, if you can, please make videos on bagging and bootstrapping, PCA, and also LDA

  • @azmatsiddique3564
    @azmatsiddique3564 5 years ago +1

    ❤️ Thank you sir.. great explanation

  • @girijaprasadpatnaik2113
    @girijaprasadpatnaik2113 3 years ago

    Love you brother 🌻🌻🌻

  • @poojamankar
    @poojamankar 5 years ago

    Thanks for sharing.... Good efforts

  • @animationcrust1993
    @animationcrust1993 4 years ago +1

    Thank you sir ☺️🙏

  • @VarunDeep04
    @VarunDeep04 5 years ago +3

    You said we have to give importance to PC1 first out of the many PCs. My question is: why is PC1 more important, and why does the importance decrease as we go down to PC2, PC3, and so on?

    • @jimmymathew8540
      @jimmymathew8540 5 years ago +3

      Because PC1 has the highest variance of the data, and PC2 has less than PC1. The variance decreases as we go from PC1 to PC2 and so on.....

    • @analogica6332
      @analogica6332 5 years ago +1

      Finally someone asked the right questions.
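[Editor's note: the reply above can be checked numerically. The explained-variance ratio of each PC is its eigenvalue divided by the sum of all eigenvalues; PC1 is "more important" simply because that ratio is largest. The stretched toy dataset below is an illustrative assumption.]

```python
import numpy as np

rng = np.random.default_rng(2)
# Stretch the data so one direction has much more spread than the other.
X = rng.normal(size=(500, 2)) * np.array([5.0, 1.0])

Xc = X - X.mean(axis=0)
# Eigenvalues of the covariance matrix are the per-PC variances.
eigvals = np.sort(np.linalg.eigvalsh(np.cov(Xc, rowvar=False)))[::-1]
explained = eigvals / eigvals.sum()
print(explained)   # PC1 carries most of the variance, PC2 the rest
```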

  • @loknathbehera6557
    @loknathbehera6557 5 years ago +1

    Wow, Shridhar, well done, you've made a good video. Do you still remember me, or have you forgotten?

    • @shubhammude9128
      @shubhammude9128 4 years ago +1

      Bro, Shridhar has become a big man now ;-) do you remember me, bro?

  • @parthprajapati3487
    @parthprajapati3487 4 years ago

    It was a nice and simple explanation

  • @AnuragSingh-vv3qv
    @AnuragSingh-vv3qv 4 years ago

    Awesome, I loved it...

  • @mohammadnafees9704
    @mohammadnafees9704 5 years ago +1

    Great explanation

  • @ManuGupta13392
    @ManuGupta13392 3 years ago

    Preventing overfitting is a bad use of PCA. The main reasons for PCA are better visualization and speeding up the process / reducing memory. Source: Andrew Ng's Machine Learning course

  • @dineshraturi
    @dineshraturi 4 years ago

    You should know the projection of a point onto a unit vector. Here you have to find a direction (an eigenvector) from the covariance of your given data points. The eigenvector v1 can then be your feature in 1-D space.
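[Editor's note: the comment above in code form. The covariance matrix's top eigenvector `v1` is a unit vector, and projecting each centred point onto it (a dot product) yields the 1-D feature the comment describes. The toy dataset is an illustrative assumption.]

```python
import numpy as np

rng = np.random.default_rng(3)
# Correlated 2-D points: most of the spread lies along the line y ~ x.
x = rng.normal(size=300)
X = np.column_stack([x, x + 0.2 * rng.normal(size=300)])

Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
v1 = eigvecs[:, np.argmax(eigvals)]      # unit eigenvector, top variance

# Projection of each point onto the unit vector v1 is a dot product;
# these scalars are the new 1-D feature.
feature_1d = Xc @ v1
print(feature_1d.shape)
```

The variance of `feature_1d` equals the largest eigenvalue, which is exactly why PC1 is the direction worth keeping.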

  • @chintandd
    @chintandd 5 years ago +1

    Great explanation :)

  • @possibleplus2859
    @possibleplus2859 2 years ago

    I would like to add: the reference views you asked us to look from are not really arbitrary. We don't look from any random side; we look either along the x1 dimension or along the x2 dimension (just as we look at the x component or the y component of a vector, i.e., the projection of a 2-dimensional object onto 1 dimension).

  • @GhanshyamAbrol
    @GhanshyamAbrol 4 years ago

    Thanks also from the agricultural side

  • @jyotipandey1664
    @jyotipandey1664 2 years ago

    The following are measurements of the test scores (X, Y) of 6 candidates in two subject examinations:
    (50, 55), (62, 92), (80, 97), (65, 83), (64, 95), (73, 93)
    Determine the first principal component for the test scores using Hotelling's iterative procedure....!! Sir, how to ???
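[Editor's note: Hotelling's iterative procedure is the power method: repeatedly multiply a trial vector by the covariance matrix and renormalize until it converges to the first principal direction. A sketch on the six score pairs from the question, cross-checked against a direct eigendecomposition:]

```python
import numpy as np

# The six (X, Y) test scores from the question above.
S = np.array([[50, 55], [62, 92], [80, 97], [65, 83], [64, 95], [73, 93]], float)
Sc = S - S.mean(axis=0)
C = np.cov(Sc, rowvar=False)             # 2x2 covariance matrix

# Hotelling's iteration: v <- C v / ||C v|| until convergence.
v = np.array([1.0, 1.0])
for _ in range(100):
    w = C @ v
    v = w / np.linalg.norm(w)
lam = v @ C @ v                          # dominant eigenvalue (PC1 variance)

# Cross-check against a direct eigendecomposition.
ev = np.linalg.eigvalsh(C)
print(lam, ev.max())
```

The converged `v` is the first principal component's direction (its sign is arbitrary), and `lam` matches the largest eigenvalue of `C`.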

  • @krunalsheth9920
    @krunalsheth9920 2 years ago

    I have gone through all your videos and you explain everything exceptionally well, but this time I didn't get what you were trying to say in this video. I have watched it multiple times, but it is still difficult to understand.