If anyone wants to learn ML, this is the best channel. The explanations are amazing for all the topics.
Great video bro, excellent, perfect, no words to express my gratitude. You covered all the doubts I had w.r.t. polynomial regression.
We might have to use print("Input", poly.n_features_in_) instead of print("Input", poly.n_input_features_) in newer scikit-learn versions.
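A quick sketch of the attribute rename this comment points out (assuming a recent scikit-learn, where `n_input_features_` is gone and `n_features_in_` is the replacement):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Fit on dummy data with 2 input features (x1, x2)
poly = PolynomialFeatures(degree=2).fit(np.ones((1, 2)))

# `n_input_features_` was deprecated and later removed; modern scikit-learn
# exposes the input count as `n_features_in_` instead
print("Input", poly.n_features_in_)       # Input 2
print("Output", poly.n_output_features_)  # Output 6 -> 1, x1, x2, x1^2, x1*x2, x2^2
```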
Watch this video with the subtitles provided by TH-cam. You will find another story in the subtitles 😅
Sir, you generated the data as X and y. What if we have a real-life dataset, how do we plot the data distribution then?
You should have written the code from scratch for polynomial feature generation.
Your way of teaching is very, very interesting.
Excellent. Very well explained. You should use real-world data instead of random numerical values, though.
6:03: the PolynomialFeatures class, for x1 and x2 with degree 3, also adds the columns (x1)(x2), (x1^2)(x2), and (x1)(x2^2). In general, with PolynomialFeatures set to degree = d and n original features, we get a total of (n+d)! / (d! · n!) features, i.e. C(n+d, d), including the bias column. Just thought this'd be useful to know.
thanks! I had the same doubt because at 5:52 sir didn't add the additional term x1*x2, so I got confused. Please correct me if I'm wrong.
@@ruchiagrawal6432 It generates the features' higher-order and interaction terms. For example, (x1, x2) becomes (1, x1, x2, x1^2, x1*x2, x2^2), and so on.
Where does the term (x1)(x2) come from? Kindly elaborate.
@@subhajitdey4483 It comes from the (a+b)^n expansion, bro, where n is the degree of the polynomial; cross terms like (x1)(x2) are the interaction terms in that expansion.
Hi Nitish bro, this audio is not clear. Please make a new video on the same topic if possible. Also, please cover feature selection with XGBoost.
You can use earphones; it will be understandable.
Please spend a little more time on writing or explaining the code 🙏
I understood the explanation well, thanks. But bhai, one doubt: with a real dataset, will I be able to tell that we need polynomial regression just by looking at the data, without plotting a graph? Also, if we add features like X^2 and X^3, won't we end up with multicollinearity, since there is now dependency between the input features?
Yes, introducing x^2 and x^3 features does add multicollinearity to your model. To overcome that, you can use orthogonal polynomial regression, which uses polynomials that are orthogonal to each other.
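A minimal numpy sketch of both halves of this exchange: raw powers of x are strongly correlated, while orthogonalized polynomial columns (built here via QR decomposition, similar in spirit to what R's poly() does) are not. All names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1000)

# Raw polynomial features x, x^2, x^3 are strongly correlated with each other
V = np.column_stack([x, x**2, x**3])
print(np.corrcoef(V, rowvar=False).round(2))   # off-diagonal entries near 1

# Orthogonalize via QR; including a constant column first makes the
# remaining orthogonal columns zero-mean, so orthogonality = zero correlation
Q, _ = np.linalg.qr(np.column_stack([np.ones_like(x), V]))
ortho = Q[:, 1:]                               # drop the constant column
print(np.corrcoef(ortho, rowvar=False).round(2))  # ~ identity matrix
```

The regression fit on `ortho` spans the same function space as the raw powers, but the coefficient estimates no longer suffer from the near-collinear design.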
Perfect explanation as always.
Your explanation is wonderful. I request you to kindly prepare a video to explain the code.
Can we apply the polynomial transformation on the dataframe (say df) first and then split X into X_train and X_test? If yes, why does everyone split first and then transform X_train and X_test separately?
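A sketch of the split-first convention this question is about (variable names are illustrative). For PolynomialFeatures specifically the numbers come out the same either way, since it learns nothing from the data values; the split-first habit exists because stateful transformers such as StandardScaler would otherwise leak test-set statistics:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures

X = np.random.rand(100, 1)
y = 2 * X[:, 0] ** 2 + np.random.randn(100) * 0.1

# Convention: split first, then fit the transformer on the training split only
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
poly = PolynomialFeatures(degree=2)
X_train_trans = poly.fit_transform(X_train)  # fit on train only
X_test_trans = poly.transform(X_test)        # reuse the fitted transformer
print(X_train_trans.shape, X_test_trans.shape)  # (80, 3) (20, 3)
```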
1:35–7:25: What is polynomial linear regression?
Thanks for the Video Sirjiiii
Why is polynomial regression with degree 2 not able to capture the bowl in the training dataset? I think the curve shown was for a higher-degree polynomial.
Yes, even I was confused, because the polynomial features should have fit the bowl better at degree 2.
Awesome Explanation.
Why do we create X_new and y_new when we already have X_test_trans and y_pred?
Same doubt
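One common reason for an X_new/y_new pair in tutorials like this (an assumption, since the video's code isn't reproduced here): X_test contains only a few scattered points, so plotting y_pred against it gives a jagged line, while a dense, sorted grid yields a smooth curve. A sketch with illustrative names:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

X = np.sort(np.random.uniform(-3, 3, 40)).reshape(-1, 1)
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + np.random.randn(40)

poly = PolynomialFeatures(degree=2)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Dense grid used only for drawing a smooth prediction curve,
# separate from the test points used for scoring
X_new = np.linspace(-3, 3, 200).reshape(-1, 1)
y_new = model.predict(poly.transform(X_new))
print(X_new.shape, y_new.shape)  # (200, 1) (200,)
```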
Best of the best!
One question...
10:37: Why are we not using fit_transform() for X_test_trans like we did for X_train_trans?
fit() calculates the mean and variance of the data, while transform() applies the scaling using those statistics. We call fit_transform() on X_train, so the mean and variance are learned from 80% of the data, which is more reliable than learning them from the 20% in X_test. Reusing those same statistics keeps the train and test sets on the same scale and avoids leaking test-set information into the model. That's why we only use transform() for X_test_trans.
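The fit-on-train, transform-on-test pattern described above can be sketched with StandardScaler (illustrative data; any stateful transformer works the same way):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X_train = rng.normal(10, 2, size=(80, 1))
X_test = rng.normal(10, 2, size=(20, 1))

scaler = StandardScaler()
X_train_trans = scaler.fit_transform(X_train)  # fit: learn mean/var from train
X_test_trans = scaler.transform(X_test)        # transform only: reuse train stats

print(scaler.mean_, scaler.var_)  # statistics come from X_train alone
# X_test is scaled with the train mean/var, so its transformed mean is close
# to zero but not exactly zero
print(X_test_trans.mean())
```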
Thank you @@manishsinghmehra7555
Infinite thank-yous, sir.
Thank you sir
The audio isn't clear.
finished watching
Best, simple & easy explanation
❤
Thank u nitish 🎉
Wow
nice
❤️