Linear Regression Part 4 | Ordinary Least Squares

  • Published on 29 Jan 2025
  • Hello Students,
    In this series, we are going to learn the different approaches to solving linear regression. The following topics are covered in the series:
    1. Introduction.
    2. Basic Intuition.
    3. General intuition of OLS.
    4. Ordinary Least Squares (see the code sketch after this list).
    5. Gradient Descent - Intuition.
    6. Gradient Descent - Finding the loss function and its derivative.
    7. Gradient Descent - Learning Rate.
    8. Gradient Descent - Multi-variable Gradient Descent.
    9. Code Example.
    10. Regression Metrics, Mean Absolute Error, Root Mean Squared Error.
    11. R2 and Adjusted R2, Coefficient of Determination.
    12. Polynomial Regression - Intuition and Code Example.
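    To make the OLS idea concrete before the lectures, here is a minimal sketch in Python (an assumed language choice; NumPy, the function name ols_fit, and the toy data are all illustrative, not the course's own code). It fits y = m*x + b using the closed-form OLS estimates m = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² and b = ȳ − m·x̄:

        import numpy as np

        def ols_fit(x, y):
            """Fit y = m*x + b by Ordinary Least Squares (closed form)."""
            x_mean, y_mean = x.mean(), y.mean()
            # Slope: sample covariance of x and y divided by the variance of x
            m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
            # Intercept: forces the fitted line through (x_mean, y_mean)
            b = y_mean - m * x_mean
            return m, b

        # Toy data lying roughly on the line y = 2x
        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([2.1, 4.2, 5.9, 8.1, 9.8])
        m, b = ols_fit(x, y)
        print(f"m = {m:.3f}, b = {b:.3f}")  # expect m near 2 and b near 0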
    About CampusX:
    CampusX is an online mentorship program for engineering students. We offer a 6-month-long mentorship to students in the latest cutting-edge technologies like Machine Learning, Python, Web Development, and Deep Learning & Neural Networks.
    At its core, CampusX aims to change the education system of India. We believe that high-quality education is not just for the privileged few; it is the right of everyone who seeks it. Through our mentorship program, we aim to bring quality education to every single student. A mentored student is guided on how to master a technology through 24x7 mentorship, live and recorded video lectures, daily skill-building activities, project assignments and evaluations, hackathons, interactions with industry experts, soft-skill training, personal counseling, and comprehensive reports. All we need from you is intent: a spark of passion to learn.
    Connect with us:
    Website: www.campusx.in
    Medium Blog: /campusx
    Facebook: /campusx.official
    LinkedIn: linkedin.com/company/campusx-official
    Instagram: /campusx.official
    Github: github.com/cam...
    Email: support@campusx.in

Comments • 21

  • @pritamsadhukhan2538
    @pritamsadhukhan2538 2 months ago +3

    The best teacher I have ever seen. Really worth it. Salute to you, sir, for making us understand in such an easy way.

  • @purushottammitra1258
    @purushottammitra1258 4 years ago +6

    Really "AWESOME"! 😊 I understood OLS so clearly for the first time. Thanks a lot! 👍

  • @Otaku-Chan01
    @Otaku-Chan01 1 year ago +1

    Very nice explanation, sir!
    I haven't seen any other video on Linear Regression with this much clarity.
    Thank you for such a quality video.

  • @jelinjose1533
    @jelinjose1533 3 years ago +4

    This is so well explained. I had been trying hard to understand the difference between OLS and Gradient Descent. Thank you for these videos; please make more of them.

    • @campusx-official
      @campusx-official  3 years ago +1

      100 Days of Machine Learning: th-cam.com/play/PLKnIA16_Rmvbr7zKYQuBfsVkjoLcJgxHH.html

  • @awaisaslam4406
    @awaisaslam4406 1 year ago +1

    It's awesome! ❤

  • @messedinsaan
    @messedinsaan 6 months ago

    I was searching everywhere for the right derivation... thanks!

  • @aryanlashkari7954
    @aryanlashkari7954 10 months ago +1

    The first time, I didn't understand anything.
    The second time, I understood it completely.

  • @tusharverma2433
    @tusharverma2433 4 years ago +1

    Just one word: awesome.

  • @JayPatel-bo1fu
    @JayPatel-bo1fu 3 years ago +1

    24-carat gold. Thank you, sir, for this content.

  • @rambaldotra2221
    @rambaldotra2221 3 years ago

    Excellent explanation, sir ✨

  • @indranilbiswas629
    @indranilbiswas629 1 year ago

    lovely man

  • @tharunnl7810
    @tharunnl7810 10 months ago

    Can you please let me know how this formula changes when we have multiple independent features?
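    With multiple independent features, the closed-form solution generalizes to the normal equation β = (XᵀX)⁻¹Xᵀy, where X holds one column per feature plus a column of ones for the intercept. A minimal sketch in Python (NumPy, the function name ols_multi, and the made-up data are assumptions, not from the video):

        import numpy as np

        def ols_multi(X, y):
            """Multiple linear regression via least squares (normal-equation solution)."""
            # Prepend a column of ones so beta[0] is learned as the intercept
            X = np.column_stack([np.ones(len(X)), X])
            # lstsq minimizes ||X @ beta - y||^2 more stably than inverting X^T X
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta  # [intercept, one coefficient per feature]

        X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])  # two features
        y = np.array([5.1, 4.9, 11.2, 10.8])
        print(ols_multi(X, y))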

  • @suchetabiswas9140
    @suchetabiswas9140 1 year ago

    Hey, can you provide a complete explanation of Support Vector Machines: soft and hard margins, how the hyperplane is chosen, and what the "machine" in Support Vector Machine is?

  • @ashishvinod2193
    @ashishvinod2193 1 year ago

    Sir, we have to use multiple linear regression if we have more than two variables.

  • @PARAGK09
    @PARAGK09 1 year ago

    Good

  • @arshad1781
    @arshad1781 3 years ago

    Thanks

  • @flower88able
    @flower88able 2 years ago +2

    Hi, there is a mistake. You replaced ŷ(i) with m·x(i) + b, but it should be replaced with m·x̂(i) + b.
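    For reference, the standard simple-regression objective behind this derivation (written here in the usual notation; the video's exact symbols may differ) is

        L(m, b) = \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
                = \sum_{i=1}^{n} \left( y_i - (m x_i + b) \right)^2

    where \hat{y}_i = m x_i + b is the fitted value; the observed input x_i conventionally carries no hat.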

  • @snehasneha3822
    @snehasneha3822 3 years ago

    I couldn't understand. Could you please make a math series?

  • @ANILKUMAR-mn7pk
    @ANILKUMAR-mn7pk 1 year ago

    A little doubtful, or possibly wrong.