Thank you very much. I'm a professor working on a scientific paper using machine learning. I've already used MLR and GAM models, and I will use a random forest model next. If you are interested, we could work together. I look forward to your response.
Thank you for subscribing and following our videos. You can find the course HERE: sds.courses/machine-learning-az
That was an amazing and clear explanation, especially for non-technical learners. Thank you so much
Information entropy is only for classification trees, not regression. "Instead of using entropy, regression trees rely on variance reduction or mean squared error (MSE) to measure the 'purity' or homogeneity of the data points within each node. The goal is to split the data into groups where the target values are as close to the group's mean as possible, minimizing the spread (variance) within each group."
Thanks for pointing that out! I appreciate you taking the time to share that.
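A minimal sketch of the criterion described above (my own illustration, not the course's code): a greedy search over candidate thresholds on one feature, picking the split that minimizes the weighted within-group variance, which is equivalent to maximizing variance reduction.

```python
import numpy as np

def best_split(x, y):
    """Find the threshold on 1-D feature `x` that minimizes the
    weighted sum of within-group variances of the target `y` --
    the regression-tree splitting criterion (variance reduction)."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_t, best_score = None, np.inf
    # Candidate thresholds: midpoints between consecutive distinct x values
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        left, right = y[:i], y[i:]
        # Spread of each group around its own mean, weighted by group size
        score = len(left) * left.var() + len(right) * right.var()
        if score < best_score:
            best_score = score
            best_t = (x[i] + x[i - 1]) / 2
    return best_t

# Example: the target jumps at x = 5, so the best split lands there
x = np.array([1., 2., 3., 4., 6., 7., 8., 9.])
y = np.array([1., 1., 1., 1., 9., 9., 9., 9.])
print(best_split(x, y))  # -> 5.0
```

A real tree builder repeats this search over every feature at every node and recurses on the two resulting groups until a stopping rule (e.g., minimum samples per leaf) is hit.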
It would be much better if you explained the exact steps, including the iterative calculations done in the background to determine each split.
Thanks a lot! Nice video!
Thanks - interesting. I'm thinking about taking up the course.
Are the splits always at right angles to the axes, i.e., of the form x_n > c?
This is the worst explanation of DTR; you don't know how to explain it clearly.
Your explanation was so much clearer. Thank you.
Not helpful