The BEST explanation and code EVER! Thank you so much, Sebastian!
Glad it was helpful!
Thank you very much for making these tutorials. Your visual presentation and general descriptions are great. I'll be watching out for future content!
Glad you like them!
Great blog post, Sebastian. I'm glad I figured this out.
Thanks for the tutorial! I finally understand pre-pruning and post-pruning!
May God reward you with goodness, brother!
Great tutorial and so structured! Amazing!
Thanks so much!
Very informative video. Thank you for sharing; it helped me solve my machine learning assignment. Waiting for more conceptual videos.
Glad it was helpful!
SUPERB EXPLANATION! THANK YOU!
Phenomenal job ❤❤❤❤❤
Great explanation!
Thank you so much! Great tutorial, it really helped me out for an exam
Great to hear!
May I know the difference between testing data and validation data?
Great explanation! Earned a sub
Thanks! I appreciate it!
Could you please explain what type of pruning this is, i.e. is it cost complexity pruning like in CART or something else? And why did you decide to use this method?
I am assuming you are referring to post-pruning: As I mention at 14:44, the process is called “Reduced Error Pruning”. And I used it simply because that’s the process that was described in the book I was using, namely “Fundamentals of Machine Learning for Predictive Data Analytics”.
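For anyone curious what Reduced Error Pruning looks like in code, here is a minimal sketch. This is not the code from the video; the nested-dict tree format, the function names, and the tie-breaking rule (prune when the leaf is at least as accurate) are my own illustrative assumptions:

```python
# Sketch of Reduced Error Pruning on a tiny hand-rolled tree.
# An internal node is {"feature": i, "threshold": t, "left": ..., "right": ...};
# a leaf is just a class label. (Hypothetical format, not the video's code.)
from collections import Counter

def predict(tree, row):
    # Walk down the tree until we hit a leaf (a plain class label).
    while isinstance(tree, dict):
        branch = "left" if row[tree["feature"]] <= tree["threshold"] else "right"
        tree = tree[branch]
    return tree

def accuracy(tree, rows, labels):
    return sum(predict(tree, r) == y for r, y in zip(rows, labels)) / len(rows)

def majority_label(labels):
    return Counter(labels).most_common(1)[0][0]

def prune(tree, rows, labels):
    """Bottom-up: replace a subtree with a majority-class leaf whenever
    that does not reduce accuracy on the validation rows reaching it."""
    if not isinstance(tree, dict) or not rows:
        return tree  # leaf, or no validation data reaches this node
    # Route the validation rows down each branch, prune children first.
    left_set = {i for i, r in enumerate(rows) if r[tree["feature"]] <= tree["threshold"]}
    right_set = [i for i in range(len(rows)) if i not in left_set]
    tree["left"] = prune(tree["left"],
                         [rows[i] for i in left_set], [labels[i] for i in left_set])
    tree["right"] = prune(tree["right"],
                          [rows[i] for i in right_set], [labels[i] for i in right_set])
    # Candidate leaf: majority class among validation labels at this node.
    leaf = majority_label(labels)
    subtree_acc = accuracy(tree, rows, labels)
    leaf_acc = sum(y == leaf for y in labels) / len(labels)
    # Keep the simpler leaf if it is at least as accurate on validation data.
    return leaf if leaf_acc >= subtree_acc else tree
```

For example, a subtree whose split only helps on training data gets collapsed into a leaf, while a split that genuinely separates the validation labels is kept.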
@SebastianMantey Oh, thanks. Now I understand everything.
Do you do it with cross-validation?
How? What happens if each fold gives you a different model?
I’m assuming that you are referring to post-pruning:
In this video, I just focus on the most basic use case of post-pruning where you build the tree with the training data, prune it with the validation data and then test it with the testing data.
K-fold cross-validation is another technique on its own. It doesn’t really have anything specifically to do with post-pruning. However, I think you could also use it with post-pruning if you wanted to.
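The three-way workflow from the reply above (build on training data, prune on validation data, evaluate once on test data) could be sketched like this; the 60/20/20 proportions and the function name are arbitrary choices of mine, not from the video:

```python
# Hypothetical helper: shuffle a dataset and split it into
# train / validation / test portions for the post-pruning workflow.
import random

def three_way_split(rows, labels, val_frac=0.2, test_frac=0.2, seed=0):
    """Train builds the tree, validation drives the pruning,
    and test measures the final, pruned tree exactly once."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)          # reproducible shuffle
    n_test = int(len(idx) * test_frac)
    n_val = int(len(idx) * val_frac)
    test_idx = idx[:n_test]
    val_idx = idx[n_test:n_test + n_val]
    train_idx = idx[n_test + n_val:]
    pick = lambda ids: ([rows[i] for i in ids], [labels[i] for i in ids])
    return pick(train_idx), pick(val_idx), pick(test_idx)
```

The key point the reply makes is that the test portion is touched only once, after pruning is finished, so the reported accuracy is not biased by the pruning decisions.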
Pessimistic vs. optimistic pruning?
thank you for this video
Nice explanation, thanks!!!
Thank you!!
brilliant!