A really good, clear and above all PRACTICAL explanation, better than tons of other videos that get into the maths without explaining simply what it is and how to interpret it. THANK YOU very much!!!
Thanks, useful for the current phase of my research project.
Dear how2stats,
Thank you so much for your very useful tutorial and all the information you provide on your website.
Thank you again!
Thank you very much, clear and copious!!!!
What does "add more stability" at 5:40 mean? Isn't a regression analysis a one-time calculation? Does this mean there is randomness in repeated runs, so each run will give very different coefficient and standard error results? Or is this about stability across adding/removing predictors, i.e., that adding or removing predictors is assumed not to affect the coefficient and standard error results at all if there is no multicollinearity?
Great. Very helpful. You're a born teacher...
How2stats Rocks!!!
Thank you very much, Sir, for such an enriching video :)
Very clearly explained. 👍👍👍
GOOD EXPLANATION....
Very helpful! Thanks for all your efforts
Thank you so much for your statistical lessons.
I have a question: if X1 and X2 are highly correlated with each other, which variable should be dropped, X1 or X2?
Please give an answer to my question.
You can drop either one of them, it will work 👍
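For anyone wondering how to check this in practice: a minimal sketch in Python with made-up data (the variable names and numbers below are purely illustrative, not from the video) is to refit the model after dropping each of the correlated predictors in turn and compare the fit. Theory about which variable you actually care about should usually settle the choice.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.4 * rng.normal(size=n)   # x2 is highly correlated with x1 (r around 0.9)
y = 2.0 * x1 + rng.normal(size=n)

# Refit after dropping each of the two correlated predictors and compare the fit.
for label, kept in [("dropping x2", x1), ("dropping x1", x2)]:
    fit = sm.OLS(y, sm.add_constant(kept)).fit()
    print(label, "-> R-squared:", round(fit.rsquared, 3))

If one of the reduced models loses noticeably more R-squared, the predictor it dropped was carrying most of the unique information; otherwise either choice is defensible, as the reply above says.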
@subratkumarsahoo4849 So would it be OK to produce a number of simple mixed effects regression models instead of lumping all the IVs into one model? Since my IVs are all measuring the same thing (e.g., response times), I would assume this would be OK after watching your video. Thank you so much!
Also, would you recommend trying to scale the numerical values so they are all similar in magnitude? You could, for example, use more decimal places to reduce the order-of-magnitude differences that sometimes occur between DVs and IVs. Keep up the good work!
What's the rule about averaging the VIFs, and that the average should be near one? What if the average of all the VIFs is about 1.7, but all the other rules are OK?
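For reference, the individual VIFs and their average are easy to compute directly; here is a minimal sketch in Python with statsmodels on simulated data (the particular values are illustrative only, not a statement about any cut-off):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # moderately correlated with x1
x3 = rng.normal(size=n)              # roughly independent of the others
X = sm.add_constant(np.column_stack([x1, x2, x3]))

# One VIF per predictor; column 0 is the constant, so skip it.
vifs = [variance_inflation_factor(X, i) for i in range(1, X.shape[1])]
print("VIFs:", np.round(vifs, 2))
print("mean VIF:", round(float(np.mean(vifs)), 2))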
What is the link or address of your website? I would like to view the references you spoke about in your video.
www.how2stats.net/2011/09/collinearity.html
really love all the mouth sounds..
I am having an issue with multicollinearity in a whole-plot factor of a split-plot design. N = 144, with 16 whole plots and 9 subplots in each whole plot. It is a 3x3x3x2 design with one hard-to-change factor (temperature). Under this design, I've reached a predicted VIF of 12.37 for my whole-plot factor, while other designs have been between 15 and 45. I've been unable to find literature that deals specifically with this issue. Do you have any recommendations? I can't increase the number of runs much more!
Very helpful overview. Thanks!
Thank You so much
If we consider three variables in a multiple regression and only one variable is significant, what does this mean?
That only one variable contributed to the regression equation in a statistically significant way.
Perfect explanation.
Really great videos - thanks!
Thank you for this, very clear explanation :)
Nicely done.
amazing! thank you!
thanks
thank you!
Hello again. I'm going to comment on your 2nd multicollinearity video as well just to point out that the VIF statistic--which you explain clearly and accurately--will not necessarily catch the kind of problem you illustrate with your v2 and v3: opposite directions, both statistically significant. I demonstrate in a peer reviewed, published research paper that VIFs can be as low as 1.1 and you still can get the problem. So low VIFs should never be used to dismiss multicollinearity concerns. Please watch my video, which touches on the VIF issue, and if interested you can download my 2018 research paper for the mathematical basis for my claim regarding the VIF. I would welcome comments from you or your many viewers. /watch?v=iV8BLOix5KI
The video is not available; the YouTube link does not work.
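For readers who can't reach that video: the pattern described above (opposite-signed, statistically significant slopes even though the VIFs are close to 1) is easy to reproduce in a simulation. Here is a minimal sketch in Python with simulated data; the specific correlations are my own choices for illustration and are not taken from the paper mentioned above. The two predictors correlate at only about 0.3, so each VIF is roughly 1.1, yet their regression slopes come out with opposite signs and both are clearly significant, even though each predictor correlates positively with the outcome on its own.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(42)
# Population correlations: corr(x1, x2) = 0.3 (VIF about 1.1),
# corr(y, x1) = 0.7, corr(y, x2) = 0.1 (both positive).
R = np.array([[1.0, 0.3, 0.7],
              [0.3, 1.0, 0.1],
              [0.7, 0.1, 1.0]])
x1, x2, y = rng.multivariate_normal(np.zeros(3), R, size=5000).T

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print("slopes:", np.round(fit.params[1:], 3))               # x1 positive, x2 negative
print("p-values:", fit.pvalues[1:])                         # both far below .05
print("corr(y, x2):", round(np.corrcoef(y, x2)[0, 1], 3))   # positive on its own
print("VIFs:", [round(variance_inflation_factor(X, i), 2) for i in (1, 2)])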
And I thought macroeconomics was hard to understand.