If you start calculating the problem with wrong data and then change the data at the end to match your solution, you are great. All the attribute values for Fever were calculated wrong due to a counting error of yes and no. You should check the videos before posting.
No, I did it; the answer is the same. At the end the decision tree is the same.
@@birendrabirbikramshah6409 what is the entropy value for each attribute?
Yea I am confused. I am getting a different gain value for fever...
At 5:55 you've calculated the initial information = entropy of the parent, not the information gain. The information gain for each attribute is calculated later, when you subtract the total information (taking the weighted entropies of the children) from the initial information; that difference is the information gain (which you mentioned only as "gain").
You have to replace "info gain" with entropy(class), and then that "gain" is the information gain. Also, it's "sore throat", not "sore thraaat".
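The correction in this reply is easy to check in code. Below is a minimal sketch of the standard ID3 computation (parent entropy minus the weighted entropy of the children); it is not the video's exact worked example, and the dataset/column names are made up for illustration.

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / total
        ent -= p * math.log2(p)
    return ent

def information_gain(rows, attr, target):
    """Gain(attr) = entropy(parent) - weighted entropy of the children."""
    parent = entropy([r[target] for r in rows])
    weighted = 0.0
    for v in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == v]
        weighted += len(subset) / len(rows) * entropy(subset)
    return parent - weighted
```

For a perfectly separating attribute the children are pure (entropy 0), so the gain equals the parent entropy, which is the distinction the reply is making.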
Thanks sir .. very well EXPLAINED 😊
The gain value for Fever should be 0.39.
Nice explanation, sir, very well explained. Sir, which company's pens do you use? Please reply.
you have calculated entropy instead of information gain
Gain(fever) is not matching, I think it should be around 0.37-0.39
You haven't modified the table as sir did.
fan of you
Entropy(class) value = 1.570
It's sore throat, not threat.
is it id3?
Thanks a lot sir
BTW, the explanation is always nice, as before.
Wow
Gain(Fever) = 0.3709
I got the same, bro. Is it correct?
@@tumtum6910 yes
same here as well
I got exact like sir
The decision tree is wrong, as for one row Fever is yes and the diagnosis is strep throat.
This is an awesome lesson..
Thank you sir for explaining it way better than my college professor. P.S. Sir, which pen do you use? It actually looks pretty dope.
At 10:38 there's a blunder of an error. You write 1.038 + 0.5272, but on the calculator it is 1.035 + 0.5272, and the answer is 1.562. But the exact answer comes out to 1.5652.
😹😹😹
What a blunder of a mistake.......
There is an example of such a classifier in which 3 samples are given and we have to find its 4th sample, but I can't figure out how to do it. Please help.
It's supposed to be sore throat.
The formula for information gain:
-(p/(p+n)) log2(p/(p+n)) - (n/(p+n)) log2(n/(p+n)) ?
Why is your formula different from the one on Google?
In the formula, put s in place of p+n and it will come out the same.
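The point in this reply, that s is just the total sample count so writing s in place of p + n changes nothing, can be checked with a small sketch. The 7/3 split in the test is only an illustrative count, not necessarily the video's data.

```python
import math

def entropy_pn(p, n):
    """Entropy from a positive count p and a negative count n.

    s is simply the total count, so p/(p+n) and p/s are the same
    number; the two ways of writing the formula agree.
    """
    s = p + n
    ent = 0.0
    for count in (p, n):
        if count:  # a zero count contributes nothing (its term vanishes)
            q = count / s
            ent -= q * math.log2(q)
    return ent
```

A 5/5 split gives entropy 1 bit, the maximum for two classes; any uneven split gives less.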
now these all are corona symptoms 😂
Sir plz check your formula
So what would have happened if the first value had come out and you hadn't changed it? This kind of data can also come in our exams; then what will we say, that the EEC guy had changed the value?
When you change the value, won't it affect the information gain value??
It most definitely will.
it will
Sir, how do we know where to stop when drawing the tree? We have not considered Congestion, so please explain it.
at 12:45 deep throat Ahahahahhahahaha....
LOOOOL
XD
You guys are making fun; do you even know how much effort it takes not to say "deep throat" instead of "strep throat" 😂😂
@@user-gq1ij they had us in the first half ngl
Doing this would take a whole lifetime.
Please make a video on pincer search algorithm. There is no video on the Internet for Pincer Search Algorithm.
O uncle, copy the values down correctly! Don't turn a yes into a no next time!
Brother, that's fine, but is this any way to say it?
Please add videos on the theory part of data mining and data warehousing.
He is saying: see the theory part first and then go for this numerical part... but I didn't get the theory part of data mining 😖... really inappropriate.
This is an MI phone, right?
Is it Iterative Dichotomizer (ID3) algorithm method?
yes
With no iteration 😂, he calculated gains only once.
Please tell, is it ID3?
If a no became a yes, then won't everyone's gain change too?
Yes
Came here for the question... later understood that you changed the question itself... 😂😂
Because of you I failed today... you taught all the formulas wrong... I trusted you so much, brother... you broke my trust.......
Oooo brother, beat him, beat himmm......
When we have no samples, how do we form a decision tree? And if the attribute list is empty, what can we do?
Sir, please also make videos on machine learning topics.
MACHINE LEARNING
Why is Sore Throat taken as the splitting attribute?
How did the minus become a plus at the end?
Headache and Congestion are not there.
You explained it wrong... wasted my time.
Sir, where can we find your notes?
Hey, are you also a B.Tech student?
WHEN YOU DON'T HAVE CLARITY BETWEEN ENTROPY AND INFORMATION GAIN, WHY MAKE SUCH VIDEOS AND MISLEAD THE LEARNERS.. KINDLY AVOID SUCH BLUNDERS
Sir, you wrote one formula for information gain and later used something else.
The formula is used correctly; log base 2 of 1 is 0.
If the formula had been used correctly, then the log term should have been of n/s, i.e. 7/10, which is never 1. So you'd better explain correctly why he has not used it.
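For what it's worth, the log2(1) = 0 claim in the reply above is easy to verify: a pure node has probability 1 for its single class, so its entropy term vanishes. A tiny check:

```python
import math

# log2(1) is exactly 0, so a pure node (class probability 1)
# contributes zero entropy to the weighted sum.
print(math.log2(1))                 # 0.0
pure_entropy = -1.0 * math.log2(1.0)  # the term -p*log2(p) with p = 1
print(pure_entropy == 0.0)            # True
```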
It's a WASTE VIDEO, DON'T WATCH IT.
And your entropy is coming out to 1.52.
Could you please suggest a good book for learning data mining algorithm problems and solutions (theory based)?
Data Mining Concepts and Techniques by Jiawei Han
he has the same pattern lock as me
Sir, will we get notes with the full solution to your problem? Please send them in the description.
Lockdown
Hey, you're teaching it wrong; I failed.
Sore Throat*
Wrong formulas
Brother, first learn it yourself, then teach.
E(sore throat)= 1.49