This video and others are insanely useful for beginners and advanced learners alike, and they are under-viewed! Thank you so much.
thank you! exactly what I was looking for :)
Thank you so much for the video. I understood everything in 15 minutes.
Thank you for the comment! I am so happy to hear that you found this video to be helpful :-)
Also, would you please make a playlist on logistic regression as well?
At 8:27, how do we know that "all these guys are constant"? Please elaborate on this, I got a bit confused here :'(
And please explain why t has n - 2 degrees of freedom?
Hey Michelle, in your videos isn't Beta_0 the slope and Beta_1 the intercept?
Beta_0 is the intercept, and Beta_1 is the slope. Recall the basic equation of a line, y = m*x + b, where m is the slope and b is the intercept... in other words, the constant multiplying x (m) is the slope, and the constant by itself (b) is the intercept. Of course, y = m*x + b is the same thing as y = b + m*x. The model we are using for this linear regression is y = Beta_0 + Beta_1*x + error. Therefore, Beta_0 is the intercept, and Beta_1 is the slope. I have never seen a textbook reverse this, so your instructor should also be using similar notation, where Beta_0 is the intercept and Beta_1 is the slope.
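For anyone reading along, here is a minimal Python sketch (not from the video; the data are made up) that illustrates the point: when you fit y = Beta_0 + Beta_1*x + error, the coefficient on x is the slope (Beta_1) and the stand-alone constant is the intercept (Beta_0).

```python
# Minimal sketch: fit a line to simulated data and recover intercept (Beta_0) and slope (Beta_1).
# The true values used to generate the data are intercept = 2 and slope = 3.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=x.size)  # y = Beta_0 + Beta_1*x + error

# np.polyfit returns coefficients from highest degree to lowest,
# so a degree-1 fit gives [slope, intercept].
slope_hat, intercept_hat = np.polyfit(x, y, deg=1)
print(f"estimated intercept (Beta_0): {intercept_hat:.2f}")  # close to 2
print(f"estimated slope     (Beta_1): {slope_hat:.2f}")      # close to 3
```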
@@Stats4Everyone Sorry, you are correct, I realised it later... I just confused myself a bit when I went through your playlist on regression... Could you please add videos on multiple regression using the matrix approach?
@@takudzwamukura7172 Good! I am happy to hear that it makes sense now :-D
Also, yes, adding videos on multiple regression using matrices is definitely on my to-do list. I expect to have them up over the next month or so. I still want to add more to the simple linear regression playlist first... currently working on adding a few videos on checking model assumptions by examining the residuals.
I am happy to hear that you have been finding these videos on simple linear regression to be helpful!
@@Stats4Everyone Yes, they are so helpful, thanks a lot... I have my exams in February, and if you can't post in time, no problem, I would still appreciate your work... may God bless you.