Traditional Statistics vs Machine Learning

  • Published 20 Jul 2024
  • #datascience #statistics #machinelearning #ai
    What is the difference between machine learning (ML) and traditional statistics? How did they historically develop? And is artificial intelligence (AI) something else? When can I use which? And how should I go about learning these methods, for example if I want to become a data scientist? These are some of the questions that Ben Geisler poses to Professors Ulrich Mansmann and Ludwig Christian Hinske of LMU’s IBE.
    0:25 Differences between ML, traditional statistics and AI
    4:24 Differences between deep learning and ML?
    5:20 Reinforcement learning
    6:34 Are there instances where ML/AI/deep learning should not be used?
    7:44 Explainable AI/ML
    11:06 When is ML/AI/deep learning most useful?
    13:21 I have very little data - can I use ML?
    16:33 Degree program vs MOOC
    References:
    - L. Breiman: Statistical Modeling: The Two Cultures. Statistical Science 2001. bit.ly/35bnBcT
    - E. Christodoulou et al.: A systematic review shows no performance benefit of machine learning over logistic regression for clinical prediction models. J Clin Epidemiol 2019. bit.ly/3HGJyhb
  • Science & Technology

Comments • 23

  • @ibe-munich  1 year ago +4

    Prof. Dr. Mansmann interviews Karl Lauterbach, *Germany's Federal Minister of Health*:
    th-cam.com/video/A_JVmczH2t4/w-d-xo.html

  • @vlemvlemvlem3659  5 hours ago

    Super interesting. Loved it. In my own work I tend to prefer a statistical approach for reasons of privacy (personal data) and explainability. I would have loved the discussion to drill down on that second part: what's the value of outcomes without understanding? In my field I must be able to explain how I derive results. If I were to use ML I'd be able to scale, but without understanding how I generate my results.
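
As a concrete illustration of the by-design explainability described above, here is a minimal sketch assuming synthetic data (the features `age` and `dose`, their coefficients, and the sample size are invented for the example, not taken from the discussion): a logistic regression whose fitted coefficients can be read off directly as odds ratios.

```python
# Minimal sketch: an interpretable-by-design model on assumed synthetic data.
# Feature names, coefficients, and sample size are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1_000
age = rng.normal(50, 10, n)    # hypothetical predictor 1
dose = rng.normal(5, 2, n)     # hypothetical predictor 2

# Outcome generated from a known logistic relationship.
logit = -0.1 + 0.04 * (age - 50) + 0.3 * (dose - 5)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, dose])
model = LogisticRegression().fit(X, y)

# exp(coefficient) is the multiplicative change in the odds of the outcome
# per one-unit increase in that feature -- a result one can explain and defend.
for name, coef in zip(["age", "dose"], model.coef_[0]):
    print(f"{name}: coefficient = {coef:+.3f}, odds ratio = {np.exp(coef):.3f}")
```

Reading effect sizes straight off the fitted model is the kind of explanation a black-box learner does not give for free.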

  • @arturoone77  1 year ago +13

    Thank you very much for uploading this discussion. I greatly appreciated it as someone with a background in econometrics who couldn't really grasp how machine learning is not just "spicy statistics". It can be confusing for me, as classical statistical concepts are often redefined in machine learning under different names, but the power of computers is undeniable in dealing with complex, large datasets.

  • @Maceta444  1 year ago +23

    Machine Learning ~ Statistics + Code with a flowery name to make it sound like some deep shit

  • @daveking-sandbox9263  1 year ago +8

    "Schach Türke" is a "Chess Turk" in English, not a chess turkey. Turkey is the name of the country, but a Turkish person is a Turk in English. Danke!

    • @ibe-munich  1 year ago +3

      Thanks for the clarification, Dave

  • @PrinceKumar-hh6yn  1 year ago +4

    Beautifully discussed

  • @liambaldwin6823  1 year ago +1

    This is a really great video. Very well done.

    • @ibe-munich  1 year ago +1

      Thank you very much!

  • @emmanuelameyaw9735  8 months ago

    Regarding the legal aspect: the same is true for statistical models. If you use a logistic model to make a decision and it turns out to be wrong and costly, who should be blamed?

  • @joefish6546  1 year ago +2

    I'm struggling to understand how Occam's razor applies to neural networks. For example, I thought the process of model selection should favor the simplest combination of variables, e.g. via AIC. But neural nets seem to assume the opposite: that large numbers of parameters model complexity well and so are a good thing. (See the AIC sketch after this thread.)

    • @macx7760  10 months ago +1

      Well, for complex problems, not having enough parameters will induce a strong bias in the model, so that is something you don't want either.

    • @jamesdavis3851  7 months ago +1

      There's a sweet spot for the number of parameters in any model, NNs included. "Overfitting" is basically industry jargon for overly complex modeling. I've frequently seen over-parameterized or overly complicated models beaten by simpler ones. In any case, Occam's razor doesn't *have* to apply to anything... it's just an aphorism, not a law of physics.

    • @joefish6546  2 months ago

      @@jamesdavis3851 and @macx7760 thank you both for responding. My perception is that these neural network modelling methods really just create over-parametrized models with high predictive power at the expense of explanatory power. However, I now think that AI engineers are primarily (and maybe even exclusively) directed to find tools with predictive power and don't care about explanation, while classical statistics was developed by scientists trying to ask and answer 'why/how' questions. In other words, neural network algorithms are developed by engineers wanting to 'do' and the explanation doesn't really matter if you are trying to better sell advertising, or build a walking robot, or land a rocket, etc.

    • @jamesdavis3851  2 months ago

      @@joefish6546 That's kinda what "engineer" implies - applications/solutions, not research or explanations.
      But also, getting solutions by any means still potentially gives insights to the underlying system. Sometimes the goal is to understand, and insight can come from anywhere... but I don't disagree that the primary goal of any engineer is to solve.
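
As a footnote to this thread, here is a minimal sketch of the AIC point (the synthetic data and the `gaussian_aic` helper are assumptions made for illustration): every extra polynomial term shrinks the residual sum of squares, but AIC's 2k penalty means the simplest adequate fit typically still scores best.

```python
# Minimal sketch: AIC trades goodness of fit against parameter count.
# The data-generating process and candidate models are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.linspace(-3, 3, n)
y = 1.5 * x + rng.normal(scale=1.0, size=n)   # the "true" relationship is linear (degree 1)

def gaussian_aic(y_true, y_pred, n_params):
    """AIC for a least-squares fit with Gaussian errors (up to an additive constant)."""
    rss = np.sum((y_true - y_pred) ** 2)
    n_obs = len(y_true)
    return n_obs * np.log(rss / n_obs) + 2 * n_params

for degree in (1, 3, 6, 9):
    coeffs = np.polyfit(x, y, degree)          # more terms always reduce the residuals...
    y_hat = np.polyval(coeffs, x)
    aic = gaussian_aic(y, y_hat, n_params=degree + 1)
    print(f"degree {degree}: AIC = {aic:.1f}")  # ...but the 2k penalty favors the simpler adequate fit
```

On a run like this the low-degree model usually ends up with the lowest AIC even though higher degrees fit the sample slightly better, which is exactly the tension the question raises.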

  • @pichirisu  7 months ago

    The only real discussion that's (not) being had here is that the true "problem" is the time required to solve problems. Anything else is just jargon and conjecture.

  • @nottheone582  1 year ago +1

    At 5:20 your "Reinforcement learning" title has a typo.

    • @BenGeisler  1 year ago

      Oops... I guess it's too late to change it now 😮

    • @tim40gabby25  1 year ago +2

      So fingers crossed you don't lose 100B in the next week :)

    • @BenGeisler  1 year ago +1

      Consider it a coding error, @@tim40gabby25