Want to learn Data Science effectively and show confidence during interviews?
Download the 6-Step Strategy to master Data Science through non-linear methods of learning
Download Link - kunaalnaik.com/learn-data-science-youtube/
Coolll !!!! Very simple and effective sir
Thank you so much for the video, very well explained
Glad it was helpful!
Thanks. Crisp & clear explanation.
Glad it was helpful!
Thank you so much for the video, very well explained with simple words.
Very Helpful Video!
Thanks for posting this video! Helped a lot!!
I am glad it helped!
Simple and neat explanation, brother... sounds great. Kindly suggest parameters for a Voting Classifier.
Hi Ganesh, I am opening a tight group of learners on whatsapp to actively engage and mentor. You can join if you are interested - bit.ly/DataScience_Kunaal_Whatsapp
Will try another video :)
you saved my life tonight
I appreciate your video, sir... but may I ask how you chose what you wanted in the parameter grid?
Hello Kunaal, nice informative video. Can I use RandomizedSearchCV for decision tree/SVM/logistic regression/NLP for checking hyperparameters? If yes, does the same parameter code need to be used for all, or should different parameters be put in that hyperparameter selection bracket?
same
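To the question above: yes, RandomizedSearchCV works with any scikit-learn estimator, but each model needs its own search space, because the valid hyperparameter names differ per estimator. A rough sketch of the idea, assuming a generic numeric X, y (the grids here are illustrative, not tuned recommendations):

from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=42)  # stand-in data

# Each estimator gets its own parameter space (names are model-specific)
searches = {
    "tree": (DecisionTreeClassifier(random_state=42),
             {"max_depth": [3, 5, 10, None], "min_samples_split": [2, 5, 10]}),
    "svm": (SVC(),
            {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}),
    "logreg": (LogisticRegression(max_iter=1000),
               {"C": [0.01, 0.1, 1, 10, 100]}),
}

for name, (model, params) in searches.items():
    rs = RandomizedSearchCV(model, params, n_iter=5, cv=3, random_state=42)
    rs.fit(X, y)
    print(name, rs.best_params_, round(rs.best_score_, 3))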
Thanks, one question: after tuning the model and finding the best hyperparameters, it is not necessary to run the model again with the best parameters for model training, right? I mean, after using GridSearchCV, the model is already configured with the best parameters? Can you please elaborate?
Yes, you should rerun the model with the best parameters. In this case, it just uses the best version of the model for doing the calculations. (me being lazy :p)
@KunaalNaik Thanks!
So I choose a model, then find the best parameters, then train the model with the best parameters, and finally model.predict... right?
@amirhosseinrahimi3964 You are right :) What model are you building?
@KunaalNaik Thanks! In a classification problem, I am using Random Forest, Logistic Regression, and KNN to compare the F1 score...
@amirhosseinrahimi3964 Which one is best?
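Following up on this thread: by default GridSearchCV refits the best combination on the full training data (refit=True), so you can predict straight from the fitted search object, or pull out best_params_ if you prefer an explicit retrain. A minimal sketch, assuming a generic X, y (the grid is illustrative):

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=42)  # stand-in data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    {"C": [0.01, 0.1, 1, 10]}, cv=5)
grid.fit(X_train, y_train)

# Option 1: the search object already holds the refitted best model
preds = grid.predict(X_test)

# Option 2: retrain explicitly with the best parameters (equivalent result)
best_model = LogisticRegression(max_iter=1000, **grid.best_params_)
best_model.fit(X_train, y_train)

print(grid.best_params_, round(grid.score(X_test, y_test), 3))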
Thank you so much... very much appreciated.
Glad it helped
Thank you so much!
For a project, I removed multicollinearity and then did hyperparameter tuning. Yet accuracy doesn't increase; instead it decreased. Can that happen?
I got an error like this: "Input contains NaN, infinity or a value too large for dtype('float64')".
You might want to do missing value treatment before you run this code. Check this code out - www.kaggle.com/funxexcel/titanic-solution-random-forest
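On that NaN error: scikit-learn estimators like LogisticRegression and RandomForestClassifier cannot handle missing values, so impute (or drop) them before running the search. A minimal sketch using SimpleImputer, assuming a numeric feature matrix with NaNs (the toy data is illustrative):

import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LogisticRegression

# Toy data with missing values (infinities would need replacing too)
X = np.array([[1.0, 2.0], [np.nan, 3.0], [4.0, np.nan], [5.0, 6.0]] * 25)
y = np.array([0, 1, 0, 1] * 25)

# Fill missing values with the column median before tuning
imputer = SimpleImputer(strategy="median")
X_clean = imputer.fit_transform(X)

grid = GridSearchCV(LogisticRegression(max_iter=1000), {"C": [0.1, 1, 10]}, cv=5)
grid.fit(X_clean, y)
print(grid.best_params_)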
Why did you not pass scoring in GridSearchCV?
Most learners get confused about it when using it for the first time :)
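To add some detail: when scoring is not passed, GridSearchCV falls back to the estimator's default score method (accuracy for classifiers). You can pass it explicitly if another metric matters more. A minimal sketch, assuming a generic X, y with some class imbalance:

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, weights=[0.8, 0.2], random_state=42)

# With explicit scoring, combinations are ranked by F1 instead of accuracy
grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    {"C": [0.01, 0.1, 1, 10]},
                    scoring="f1", cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))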
Question, I ran into the 'l1' penalty issue... so how do I fix that?
What happens if one of the X variables is nominal, for example sex (male/female)?
Convert it to a binary feature and then build the model.
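For a two-level variable like sex, a simple 0/1 mapping or one-hot encoding works. A minimal pandas sketch, assuming a hypothetical DataFrame with a 'sex' column (names and values are illustrative):

import pandas as pd

df = pd.DataFrame({"sex": ["male", "female", "female", "male"],
                   "age": [25, 32, 47, 51]})  # toy example

# Option 1: two-level categorical -> a single 0/1 column
df_binary = df.assign(sex_male=(df["sex"] == "male").astype(int)).drop(columns="sex")

# Option 2: one-hot encode (drop_first avoids a redundant column for binary features)
df_onehot = pd.get_dummies(df, columns=["sex"], drop_first=True)

print(df_binary.head())
print(df_onehot.head())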
Hello sir, I used almost the same param_grid which you showed, working in Google Colab; GridSearchCV was fitting different combinations for about 3h 40m and it crashed at the end... what can or should I do to get the results? (My training dataset contained 17,145 examples.) Thanks
Hi Shahid, try reducing the number of combinations. Take only two variations of each parameter and make it run.
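Building on that: either shrink the grid, or switch to RandomizedSearchCV, which caps the number of fitted combinations via n_iter. A rough sketch, assuming a generic X, y (n_iter and the ranges are illustrative):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=1000, random_state=42)  # stand-in data

param_distributions = {
    "n_estimators": [100, 200, 400],
    "max_depth": [5, 10, None],
    "min_samples_leaf": [1, 2, 4],
}

# Only n_iter combinations are fitted instead of the full 27; n_jobs=-1 uses all cores
search = RandomizedSearchCV(RandomForestClassifier(random_state=42),
                            param_distributions,
                            n_iter=8, cv=3, n_jobs=-1, random_state=42)
search.fit(X, y)
print(search.best_params_)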
Hi there, I don't seem to get better performance using this. Do you have any idea why?
You need to apply these strategies:
1/ Outlier Detection and imputation
2/ Missing Value Imputation Strategy
3/ Transformation Strategy
4/ Cross-Validation Framework
Iterate by changing the above first and check your performance (a rough pipeline sketch follows below).
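One way to wire those steps together is a scikit-learn Pipeline, so imputation and scaling happen inside each cross-validation fold. A rough sketch of the idea, assuming numeric features only (the steps and grid are illustrative, not a prescription):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, random_state=42)
X[::20, 0] = np.nan  # simulate some missing values

pipe = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # missing value strategy
    ("scale", StandardScaler()),                    # transformation strategy
    ("model", LogisticRegression(max_iter=1000)),
])

# Tune inside a cross-validation framework; pipeline params use step__param names
grid = GridSearchCV(pipe, {"model__C": [0.01, 0.1, 1, 10]}, cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))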
@KunaalNaik thank you 😇
I got an error: "Solver lbfgs supports only 'l2' or 'none' penalties, got l1 penalty".
lbfgs cannot be used with the L1 penalty; that's why the error. When you use lbfgs, remove L1 from the grid. For other solvers (such as liblinear or saga) you can use L1.
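One way to avoid that error is to pass param_grid as a list of dicts, so only compatible solver/penalty pairs get tried. A minimal sketch, assuming a generic X, y:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, random_state=42)

# A list of grids: lbfgs only sees l2, while liblinear/saga may also try l1
param_grid = [
    {"solver": ["lbfgs"], "penalty": ["l2"], "C": [0.1, 1, 10]},
    {"solver": ["liblinear", "saga"], "penalty": ["l1", "l2"], "C": [0.1, 1, 10]},
]

grid = GridSearchCV(LogisticRegression(max_iter=5000), param_grid, cv=5)
grid.fit(X, y)
print(grid.best_params_)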
Thanks
Could hyperparameter tuning cause overfitting on the training data? Shouldn't you use a subset of the data to tune your hyperparameters (a validation set)?
Agree, you should use a validation set. That works for some models like XGBoost, LightGBM, etc.; for basic models we just tune with cross-validated parameters.
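A common setup: hold out a final split, run GridSearchCV (which already cross-validates internally) only on the training portion, then use the held-out split for an unbiased check. A minimal sketch, assuming a generic X, y:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=1000, random_state=42)

# The hold-out set never touches the tuning loop
X_train, X_holdout, y_train, y_holdout = train_test_split(X, y, test_size=0.2, random_state=42)

grid = GridSearchCV(LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1, 10]}, cv=5)
grid.fit(X_train, y_train)

print("CV score on train:", round(grid.best_score_, 3))
print("Hold-out score:", round(grid.score(X_holdout, y_holdout), 3))  # sanity check against overfitting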
What am I doing wrong if, after GridSearchCV, my model accuracy decreases?
Tweak the range of the parameters. It takes a while to get to the sweet spot :)
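In practice that usually means a coarse search first, then a finer grid centred on the best value it found. A rough sketch of the idea, assuming a generic X, y (the ranges are illustrative):

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, random_state=42)

# Pass 1: coarse grid over orders of magnitude
coarse = GridSearchCV(LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1, 10, 100]}, cv=5)
coarse.fit(X, y)
best_c = coarse.best_params_["C"]

# Pass 2: finer grid around the coarse winner
fine = GridSearchCV(LogisticRegression(max_iter=1000),
                    {"C": [best_c * f for f in (0.25, 0.5, 1, 2, 4)]}, cv=5)
fine.fit(X, y)
print(coarse.best_params_, fine.best_params_, round(fine.best_score_, 3))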
It failed to perform for categorical data - a movie review classifier.
Does it have more than one category? If yes, then try some other algorithm.
Why can't I see the hyperparameters after .fit()?
I could not get you, Chetan?
I thought you'd explain the meaning of the hyperparams.
This is much better than that. Amazing trick.
reached haha
use hashtags