Multiple Linear Regression | Part 2 | Mathematical Formulation From Scratch

  • Published on 28 Jan 2025

Comments • 105

  • @saptarshisanyal6738
    @saptarshisanyal6738 2 years ago +110

    I don't think the mathematical explanation given in this video exists anywhere else on YouTube. I found it in the book "Mathematics for Machine Learning" by Marc Peter Deisenroth. This is simply brilliant. Although the matrix differentiation part is absent, this is still extraordinary stuff.

    • @SidIndian082
      @SidIndian082 2 years ago +2

      Yes, well said... brilliantly explained :-) 😍😍😍😇😇

    • @jamitkumar7251
      @jamitkumar7251 3 months ago +1

      @@SidIndian082 yup, the explanation was damn good...

  • @kashifhabib290
    @kashifhabib290 1 year ago +27

    Sir, I can tell you that literally no other video can compare to your teaching. I have seen videos from Coding Ninjas and other paid lectures, but nobody has gone into this depth. I can literally feel machine learning in front of my imagination. Thank you so much, Sir 🙏

  • @morhadi
    @morhadi 5 months ago +10

    40:00 It is worth mentioning that for any matrix X, the product X^T X is always symmetric.
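
    A quick check of that claim (a throwaway NumPy sketch, not from the video): (X^T X)^T = X^T (X^T)^T = X^T X, so it holds for any rectangular X.

        import numpy as np

        X = np.random.default_rng(0).normal(size=(6, 3))   # any rectangular matrix
        S = X.T @ X
        print(np.allclose(S, S.T))   # True: X^T X equals its own transpose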

  • @vibhasharma6097
    @vibhasharma6097 5 months ago +12

    Nitish sir and his love for the CGPA and LPA dataset is a never-ending love story 😂 Anyway... by far the best explanation... Thank you, sir.

  • @sinshr-kj1kn
    @sinshr-kj1kn 4 months ago +4

    Sir, you teach really well... I don't know why nobody suggested your channel to me earlier... not only the practical coding but also the theory part, you explain in detail.

  • @BratatiDattaGupta-y7s
    @BratatiDattaGupta-y7s 3 days ago

    This is the best video. It has once again made me sure that I want to become a successful data scientist.

  • @Ganeshkakade454
    @Ganeshkakade454 2 years ago +2

    Hi Sir.. You are truly a gem of a person; sharing such great knowledge for free is a blessing for the new generation. God bless you, sir. You are our guru from today onwards.

  • @sudhanshumishra3677
    @sudhanshumishra3677 1 year ago

    Literally, it was the greatest explanation that I have ever seen on YouTube. Hats off, Sir.

  • @hasanrants
    @hasanrants 4 months ago +1

    Thanks Nitesh for producing such high-quality content.
    Completed at 12:07 PM, 9th September 2024.

  • @SonuK7895
    @SonuK7895 2 years ago +2

    Makes complex things so easy, i.e., CampusX.

  • @karthikmanthitta6362
    @karthikmanthitta6362 3 years ago +3

    Such a wonderful explanation sir, really thanks a lot ♥️♥️♥️ You were able to explain something which hundreds of videos couldn't explain to me.

  • @a1x45h
    @a1x45h 3 years ago +2

    I jumped when I understood the e^T * e concept. Thank you so much!!!
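
    For anyone new to that notation: with the residual vector e = y - X*beta, the product e^T e is simply the sum of squared errors the video minimizes. A tiny NumPy illustration (not from the video):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5, 2))
        beta = np.array([1.0, -2.0])
        y = X @ beta + rng.normal(size=5)

        e = y - X @ beta                             # residual vector
        print(np.isclose(e.T @ e, np.sum(e ** 2)))   # True: e^T e is the sum of squared errors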

  • @SKILLCRYSTAL
    @SKILLCRYSTAL 2 years ago +6

    Most underrated channel on YouTube 🥲

  • @AnkushTonde863
    @AnkushTonde863 4 months ago

    Big fan of your teaching. Watching this video for the 2nd time to revise and brush up all my concepts, and to make sure I remember them and can derive the equation if asked anytime, any day 🙂

  • @todaystrending1992
    @todaystrending1992 3 years ago +11

    Eagerly waiting for the next video 😁😉. Thank you so much sir for this 🙏❤️

    • @dheerajrajput7058
      @dheerajrajput7058 1 year ago +1

      Have you achieved your dream now? Please say yes.

  • @messi0510
    @messi0510 1 year ago

    0:30 --> 2:45: intuition behind MLR
    43:20 --> 47:45: why gradient descent is more effective compared to OLS

  • @PrathamGupta-n2k
    @PrathamGupta-n2k 10 months ago

    Thank you very much for the explanation, sir. I searched the whole of YouTube to find this mathematical explanation!

  • @anshulsharma7080
    @anshulsharma7080 2 years ago

    How did you manage to study so much of this beforehand, bhaiya...
    It's a level above super, wow...

  • @nikhildonthula1395
    @nikhildonthula1395 8 months ago

    Man, top-class stuff. I'd been trying to find this mathematical derivation on YouTube for many days.

  • @sovansahoo27
    @sovansahoo27 1 year ago +2

    Superb content, easily my semester saviour at IIT Kanpur... Thanks, Sir.

  • @nirjalkumarmahato330
    @nirjalkumarmahato330 2 years ago +1

    Boss, you are great ❤️ I had been struggling for a straight month 🙃

  • @core4032
    @core4032 2 years ago +1

    A step-by-step series in great detail, superb.

  • @HayyaFatima-i4c
    @HayyaFatima-i4c 1 month ago

    Matchless way of teaching

  • @Oxino-h4d
    @Oxino-h4d 1 year ago +2

    34:18 Shouldn't its differentiation be equal to (X^T)y, which is the transpose of X times y, rather than the transpose of y times X, which is (y^T)X?
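
    On that doubt: the derivative of the scalar y^T X B with respect to B is X^T y if you write gradients as column vectors, and y^T X if you write them as row vectors; the two are transposes of each other, so they carry the same numbers. A small finite-difference check, assuming the column-vector convention (illustration only, not the video's code):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(8, 3))
        y = rng.normal(size=8)
        beta = rng.normal(size=3)

        f = lambda b: float(y @ (X @ b))      # the scalar term y^T X B

        eps = 1e-6
        num_grad = np.array([(f(beta + eps * np.eye(3)[i]) - f(beta - eps * np.eye(3)[i])) / (2 * eps)
                             for i in range(3)])
        print(np.allclose(num_grad, X.T @ y))   # True: the (column) gradient is X^T y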

  • @usmanriaz6241
    @usmanriaz6241 1 year ago

    You are an amazing teacher. I've never seen such good explanations on YouTube. Love from Pakistan.

  • @abuboimofo6605
    @abuboimofo6605 10 months ago +2

    A doubt, Sir ji 🙏
    36:00
    When you differentiate the matrix expression Y = A^T X A,
    your answer is 2XA^T, while the answer should be dY/dA = 2XA, not the transpose... correct me if I am wrong, please (see the sketch after this thread).

    • @SuramaDutta
      @SuramaDutta 4 months ago

      Yes, I have the same query. The Gatsby document also gives 2AX.
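
      A quick numerical check on this thread (a minimal NumPy sketch, not from the video): the general identity is d/dA (A^T X A) = (X + X^T) A for a column vector A, which collapses to 2XA only when X is symmetric; whether it appears as 2XA or as its transpose then depends only on the gradient layout convention. In the video's derivation the inner matrix is X^T X, which is symmetric, so the 2(X^T X)B form is what matters.

          import numpy as np

          rng = np.random.default_rng(0)
          n = 4
          X = rng.normal(size=(n, n))          # generic, non-symmetric matrix
          A = rng.normal(size=(n, 1))

          def f(a):
              # scalar quadratic form a^T X a
              return (a.T @ X @ a).item()

          eps = 1e-6
          num_grad = np.zeros_like(A)
          for i in range(n):
              e = np.zeros_like(A)
              e[i] = eps
              num_grad[i] = (f(A + e) - f(A - e)) / (2 * eps)

          print(np.allclose(num_grad, (X + X.T) @ A))   # True: general identity
          print(np.allclose(num_grad, 2 * X @ A))       # False for a non-symmetric X
          S = X.T @ X                                   # symmetric, like X^T X in the video
          print(np.allclose((S + S.T) @ A, 2 * S @ A))  # True: reduces to 2SA when S is symmetric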

  • @shubhankarsharma2221
    @shubhankarsharma2221 2 years ago +1

    (* -> transpose)
    y*(XB) = (XB)*y can be proved by taking XB = y(hat) and writing both sides out with their respective matrices:
    y = [y1 y2 ... yn]
    y(hat) = [y(h)1 y(h)2 ... y(h)n]
    Substituting into the equation shows that both sides are the same.
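
    The same point in one line of NumPy (illustration only): y^T (XB) is a 1x1 quantity, and a scalar equals its own transpose, so it equals (XB)^T y.

        import numpy as np

        rng = np.random.default_rng(0)
        X, y, B = rng.normal(size=(5, 2)), rng.normal(size=(5, 1)), rng.normal(size=(2, 1))
        print(np.allclose(y.T @ (X @ B), (X @ B).T @ y))   # True: both are the same 1x1 scalar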

  • @ashishmhatre2846
    @ashishmhatre2846 2 years ago +3

    My master's professor can't explain things better than you! Thank you for making such awesome videos!

  • @shanmukhan-by2ld
    @shanmukhan-by2ld 1 month ago

    Simple and elegant explanation.

  • @ArpanChandra-vv8cg
    @ArpanChandra-vv8cg 6 months ago

    GOD LEVEL TEACHING SKILL 💡💡

  • @sameergupta3067
    @sameergupta3067 2 years ago

    This ML series is making me interested in the maths behind machine learning algorithms.

  • @kiran__jangra
    @kiran__jangra 8 months ago +1

    No doubt, sir, your teaching is fantastic and I am following your videos.
    Sir, I have one doubt in the step where you go from
    X^T X B^T = y^T X
    to B^T = y^T X (X^T X)^-1.
    Shouldn't it be B^T = (X^T X)^-1 y^T X? Basically, the inverse term must be pre-multiplied,
    because we pre-multiply by the inverse to cancel it on the left-hand side, and matrix multiplication is not commutative, so we can't move it to the other side.
    Please clear my doubt (see the sketch after this thread).

    • @prashantmaurya9635
      @prashantmaurya9635 4 months ago

      That's what I thought; it looks wrong in the video.
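
      For anyone cross-checking this thread: in the usual column-vector convention the closed form is B = (X^T X)^-1 X^T y (inverse pre-multiplied), and taking the transpose of that gives B^T = y^T X (X^T X)^-1, so the two forms above are transposes of each other rather than contradictory. A minimal NumPy sketch (not the video's code) comparing the closed form against a library solver:

          import numpy as np

          rng = np.random.default_rng(42)
          n, d = 50, 3
          X = np.column_stack([np.ones(n), rng.normal(size=(n, d))])   # intercept column + features
          true_beta = np.array([2.0, 0.5, -1.0, 3.0])
          y = X @ true_beta + 0.1 * rng.normal(size=n)

          beta_closed = np.linalg.inv(X.T @ X) @ X.T @ y        # (X^T X)^-1 X^T y
          beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)    # library least-squares solution

          print(np.allclose(beta_closed, beta_lstsq))           # True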

  • @beethoven6185
    @beethoven6185 1 year ago

    Sir you are the best teacher

  • @BAMEADManiyar
    @BAMEADManiyar 1 year ago +1

    37:23 Sir, I have a doubt: if I multiply the LHS by (X^T X)^-1, I will be left with B^T on the LHS, so I need to multiply the RHS by the same term too, right? But if I do so I get a different answer. Why is this, sir?

  • @manandesai6404
    @manandesai6404 6 months ago

    You are a genius!

  • @ADESHKUMAR-yz2el
    @ADESHKUMAR-yz2el 1 year ago

    love you sir, with all respect.

  • @manujkumarjoshi9342
    @manujkumarjoshi9342 1 year ago

    Beautiful. Luckily I knew it already, but awesome teaching skills.

  • @ParthivShah
    @ParthivShah 10 months ago +1

    Thank You Sir.

  • @rockykumarverma980
    @rockykumarverma980 4 months ago

    Thank you so much sir 🙏🙏🙏

  • @SyedHuzafaAli
    @SyedHuzafaAli 5 months ago +1

    Mathematical explanation ☠☠ Code 😊😊

  • @Adarshhb767
    @Adarshhb767 2 months ago

    No one made the maths this interesting until today.

  • @user-oq1yk2fq2f
    @user-oq1yk2fq2f 4 months ago

    Thank you so much!

  • @MukeshAmareshThakur-ut4in
    @MukeshAmareshThakur-ut4in 2 months ago

    35:40 Where is the video on matrix differentiation? There is no link in the description.

  • @Sara-fp1zw
    @Sara-fp1zw 3 years ago +3

    36:00 Bhaiya, kindly upload a video on matrix differentiation.

  • @gauravpundir97
    @gauravpundir97 2 years ago

    Thank you for making such fantastic videos!

  • @lvilligosalvs2708
    @lvilligosalvs2708 1 year ago

    You are a Gem, Sir. Keep it up. Thank you!

  • @shovobhai
    @shovobhai 1 month ago

    Bhai, did you make the video about matrix differentiation that you mention in this video? If yes, kindly provide the link.

  • @mr.deep.
    @mr.deep. 2 years ago +2

    Best explanation.

  • @sachin2725
    @sachin2725 2 years ago +1

    Hello Sir, XGBoost is not included in the playlist; could you please make a video on XGBoost?

  • @maths_impact
    @maths_impact 2 years ago +1

    Wonderful sir

  • @rupeshramancreation5554
    @rupeshramancreation5554 1 year ago

    You said something really big today.

  • @abhishekkukreja6735
    @abhishekkukreja6735 2 years ago +1

    Hi Nitish sir, while calculating the error function we used differentiation to get the expression, but at the very beginning you said we don't use calculus for OLS while for gradient descent we do. We used it in both, so how is one closed form and the other non-closed form? I got the overall concept, but how do closed form and non-closed form differ if we're doing differentiation in both of them? (See the sketch after this thread.)
    Thanks for these videos.

    • @spynom3070
      @spynom3070 2 years ago +1

      He used calculus to show how the OLS equation is derived from scratch. In OLS the machine plugs the data into the final closed-form equation to get the best-fit line, but in gradient descent it uses calculus iteratively to reach the minimum.

    • @abhishekkukreja6735
      @abhishekkukreja6735 2 years ago

      @@spynom3070 thanks for this.
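
      To make the distinction in the doubt above concrete: both approaches differentiate the squared-error loss, but OLS uses the derivative once, symbolically, to obtain a formula that is evaluated directly, whereas gradient descent evaluates the derivative numerically inside a loop. A minimal NumPy sketch of the contrast (illustration only, not the video's code):

          import numpy as np

          rng = np.random.default_rng(0)
          n, d = 200, 3
          X = np.column_stack([np.ones(n), rng.normal(size=(n, d))])
          y = X @ np.array([1.0, 2.0, -0.5, 0.3]) + 0.1 * rng.normal(size=n)

          # Closed form (OLS): one direct formula, no iteration
          beta_ols = np.linalg.inv(X.T @ X) @ X.T @ y

          # Gradient descent: repeatedly step against the gradient of the mean squared error
          beta_gd = np.zeros(d + 1)
          lr = 0.01
          for _ in range(5000):
              grad = -2 * X.T @ (y - X @ beta_gd) / n
              beta_gd -= lr * grad

          print(np.allclose(beta_ols, beta_gd, atol=1e-3))   # True: GD converges to the OLS solution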

  • @anant1803
    @anant1803 2 years ago

    Really amazing video sir.

  • @priyadarshichatterjee7933
    @priyadarshichatterjee7933 1 year ago

    Sir... shouldn't we, after differentiation and reduction, be left with y^T = X^T B^T, which transposed again gives y = XB, and therefore B = X^-1 y?

  • @ronylpatil
    @ronylpatil 2 years ago

    Please make a detailed video on matrix differentiation.

  • @bruhat15
    @bruhat15 6 months ago +1

    13:09 The matrix multiplication is wrong.
    When X is multiplied by B it should give a matrix with a single column,
    while the matrix described earlier has m columns.

    • @bruhat15
      @bruhat15 6 months ago

      I guess the Y hat matrix (before decomposing) should be the sum of those columns collapsed into one column, so Y hat will be of order n x 1.
      Then it would be correct.

    • @titan_471
      @titan_471 2 months ago +2

      Yeah, he just forgot to put addition signs between the terms.

    • @talentgrowers
      @talentgrowers 17 days ago

      @@titan_471 Yes, you're right, he should have put + signs between the terms.

  • @hagosohani
    @hagosohani 2 months ago

    Sir, you haven't uploaded the matrix differentiation video.

  • @animatrix1631
    @animatrix1631 1 year ago

    Please upload the matrix differentiation video, sir @campusx

  • @rubalsingh4018
    @rubalsingh4018 1 year ago

    Thank you so much.

  • @balrajprajesh6473
    @balrajprajesh6473 2 years ago

    Thank you for this sir!

  • @ashishraut7526
    @ashishraut7526 2 years ago

    Boss, that was awesome.

  • @abdulmanan17529
    @abdulmanan17529 1 year ago

    A math guru as well as a machine learning guru.

  • @kidscreator2268
    @kidscreator2268 3 years ago +1

    Number one, sir.

  • @priyam8665
    @priyam8665 4 months ago

    done ✅

  • @rahulpathak8415
    @rahulpathak8415 8 months ago

    Sir, doesn't the loss function start with 1/2m?

  • @Jc12x06
    @Jc12x06 2 years ago +1

    @CampusX Can someone explain what would happen if the inverse doesn't exist for that matrix in the last step, (X^T X)^-1, i.e. if the determinant is 0? (See the sketch after this thread.)

    • @yashwanthyash1382
      @yashwanthyash1382 1 year ago

      Very nice question, but I don't know the answer either.

    • @rounaksarkar3084
      @rounaksarkar3084 1 year ago

      The reason is simple. See, X is the matrix consisting of the features. Now, there are 2 possibilities for the non-existence of the inverse of (X^T X): the first is that X is a null matrix, and hence X^T is also a null matrix; the second is that X^T X is a null matrix (but neither is individually null). You can skip the first possibility, because if the feature matrix is null nobody cares about the problem. Coming to the 2nd possibility, if X is (n x 1) then X^T is (1 x n), and X^T X will be a (1 x 1) matrix. Even if some elements of X^T are negative, each gets multiplied with the same element of X (notice: the ith element of the 1st row of X^T == the ith element of the 1st column of X), so while multiplying and adding the elements to form X^T X you never come across any negative term. Adding all these non-negative quantities gives you a positive (1 x 1) matrix. Hence, the inverse of X^T X will always exist.
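
      One caveat to add to this thread (not from the video): with several features, X^T X is singular whenever the columns of X are linearly dependent (perfect multicollinearity, or more columns than rows), so the plain inverse can genuinely fail even though X is not null. A common fallback is the Moore-Penrose pseudo-inverse (np.linalg.pinv) or ridge regularization. A minimal NumPy sketch of the singular case:

          import numpy as np

          rng = np.random.default_rng(1)
          n = 20
          x1 = rng.normal(size=n)
          x2 = 2 * x1                                    # perfectly collinear with x1
          X = np.column_stack([np.ones(n), x1, x2])
          y = 3 + 1.5 * x1 + 0.1 * rng.normal(size=n)

          print(np.linalg.matrix_rank(X.T @ X))          # 2 < 3, so X^T X is singular
          # np.linalg.inv(X.T @ X) is unreliable here: it may raise LinAlgError
          # or return a numerically meaningless result.

          beta_pinv = np.linalg.pinv(X) @ y              # pseudo-inverse least-squares solution
          beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
          print(np.allclose(X @ beta_pinv, X @ beta_lstsq))   # same fitted values, though beta itself is not unique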

  • @mr.deep.
    @mr.deep. 2 years ago +5

    campusX > MIT

  • @HirokiKudoGT
    @HirokiKudoGT 1 year ago +1

    Sir, at 36:26 I think you used d/dA(A^T X A) = 2XA^T, but it's 2XA... so I'm a little confused about the final result 🫤. Only this one thing; everything else is great, sir... love your videos.

    • @rounaksarkar3084
      @rounaksarkar3084 1 year ago

      Actually, the differentiation is (X + X^T).A^T. If X = X^T then it becomes 2XA^T.

  • @ayushbaranwal1094
    @ayushbaranwal1094 1 year ago

    Sir, actually I had a doubt: d/dA of A^T X A is 2XA, but you have written 2XA transpose. Can you explain it?

  • @ritujawale10
    @ritujawale10 2 years ago

    Thank you sir 👍

  • @AbdurRahman-lv9ec
    @AbdurRahman-lv9ec 2 years ago

    Awesome

  • @harshitmishra394
    @harshitmishra394 11 months ago

    @campusx Sir, it was Y(hat) = B0 + B1.X1 + B2.X2 + ... + Bn.Xn, so how did the elements in the matrix become different????

  • @Moiz_tennis
    @Moiz_tennis 2 years ago +2

    The video is awesome. I have a doubt though: at 37:35 you pre-multiply the inverse on the LHS but post-multiply it on the RHS. Isn't that wrong? Correct me if I am missing something.

    • @readbhagwatgeeta3810
      @readbhagwatgeeta3810 2 years ago +2

      Yes, correct... The final value of beta should be: ((X transpose)Y)((X transpose)X)^-1

    • @Moiz_tennis
      @Moiz_tennis 2 years ago

      @@readbhagwatgeeta3810 Thanks!

  • @YashGaneriwal-je6rh
    @YashGaneriwal-je6rh 4 months ago

    done

  • @souviknaskar631
    @souviknaskar631 1 year ago

    (A^T)^-1 = (A^-1)^T
    Using this formula you can prove the last part:
    [(X^T X)^-1]^T = (X^T X)^-1
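
    A quick numeric sanity check of those identities (a hypothetical NumPy snippet, assuming X has full column rank):

        import numpy as np

        rng = np.random.default_rng(3)
        A = rng.normal(size=(4, 4))
        print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))   # (A^T)^-1 == (A^-1)^T

        X = rng.normal(size=(10, 3))
        M = np.linalg.inv(X.T @ X)
        print(np.allclose(M, M.T))   # X^T X is symmetric, so its inverse is symmetric too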

  • @roktimjojo5573
    @roktimjojo5573 2 years ago

    When will the matrix differentiation video come out?

  • @abdulmanan17529
    @abdulmanan17529 1 year ago

  • @MAyyan-gb6hi
    @MAyyan-gb6hi 1 year ago

    ily!

  • @vanshshah6418
    @vanshshah6418 2 years ago

    (best best best best ......best)^best

  • @Khan-ho5yd
    @Khan-ho5yd 1 year ago

    Hello sir, do you have notes?

  • @abdulmanan17529
    @abdulmanan17529 1 year ago

    🎉🎉🎉❤

  • @Star-xk5jp
    @Star-xk5jp 1 year ago

    Day 4
    Date: 12/1/24

  • @RahulRaj-nw6rr
    @RahulRaj-nw6rr 4 months ago

    4:03 Universal problem 🤣