So ordinary least squares is used to fit the linear regression. It defines the best line as the line that minimises the sum of the squared residuals. Would it be accurate to compare OLS with gradient descent, in the sense that when fitting a linear regression I am free to choose between OLS and gradient descent as two different optimisation algorithms?
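(Not the original poster, but here is one way to see it.) Roughly yes: both minimise the same sum-of-squared-residuals cost, so for plain linear regression they converge to the same coefficients. OLS has a closed-form solution (the normal equation), while gradient descent reaches it iteratively. A minimal sketch comparing the two, assuming a small synthetic dataset and a hand-picked learning rate:

```python
import numpy as np

# Synthetic data: y = 1 + 2x plus noise (assumed for illustration)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, 50)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# 1) OLS via the closed-form normal equation: solve (X'X) beta = X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# 2) Gradient descent on the same mean-squared-error cost
beta_gd = np.zeros(2)
lr = 0.001  # learning rate (assumed; needs tuning in general)
for _ in range(200_000):
    grad = 2 * X.T @ (X @ beta_gd - y) / len(y)
    beta_gd -= lr * grad

print(beta_ols)  # intercept and slope from the closed form
print(beta_gd)   # gradient descent converges to the same values
```

With enough iterations and a small enough learning rate, the two estimates agree to several decimal places; the practical difference is that the closed form becomes expensive for very many features, where iterative methods scale better.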
Thank you for subscribing and following our videos. You can find the course here: sds.courses/python-ml-level-1
The clearest explanation of OLS regression I have heard. Thanks a lot!
Really and truly... Thanks. I appreciate this so much
hi bro, have you found the answer?
@turkialajmi5592 I'd like to believe it's true 👍
I just started watching this video; I'll let you guys know when I've finished it.