Very true, those are underused tools, and the systems to use them are way too un-user-friendly. Make a tool that is extremely easy to use, meaning almost no buttons (with all the buttons and functionality tucked away in the settings), very Google-like, very ChatGPT-like in the sense of minimalist design.
:) As a data scientist I was unaware. It's about the 'why', not the 'how', really a powerful tool :). Agreed, if we genuinely want to learn in depth, or want to become more specific or expert in research and analysis, then studying with a questioning 'why' approach can make a unique and huge impact on learning. Nice presentation 👌 :), but in real life, in some matters, I still prefer to avoid it.
That 10,000 is enough to distinguish. You just have more data per person and less filtering than if you had a billion people, but the total number of datapoints ends up about the same. Example: with only 10,000 people you might collect 200,000 datapoints per person, whereas with 2,000,000,000 people you might only add a couple of datapoints each; both feed the neural network roughly the same amount of material. The large dataset is just less precise and probably more generalizable, but it should certainly be possible. I am not convinced that as many datapoints as are used today are actually needed! If I get rich I'll advance that field / prove it (if it hasn't already been solved, etc.).
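A minimal sketch of the arithmetic behind this claim, using the commenter's own illustrative figures (the per-person counts are assumptions from the comment, not from the talk): a small but deep cohort and a huge but shallow one can carry roughly the same total number of datapoints.

```python
# Back-of-the-envelope check: total datapoints in two hypothetical datasets.
# Figures are the commenter's illustrative numbers, not real datasets.
small_deep_cohort = 10_000 * 200_000        # 10,000 people, 200,000 datapoints each
large_shallow_cohort = 2_000_000_000 * 1    # 2 billion people, ~1 datapoint each

print(f"small, deep dataset:    {small_deep_cohort:,} datapoints")     # 2,000,000,000
print(f"large, shallow dataset: {large_shallow_cohort:,} datapoints")  # 2,000,000,000
```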
Underrated talk! Thank you so much.
Excellent talk! Thank you very much.
Patterns, patterns, patterns!!
Awesome!
Is "the mean" white?