Yoshua Bengio: Deep Learning Cognition | Full Keynote - AI in 2020 & Beyond

  • Published on 17 Dec 2024

Comments • 12

  • @somjitamrit9821
    @somjitamrit9821 2 years ago

    A wonderful presentation. Biological intelligence, mathematical intelligence, and mindfulness have all been brought together in a way that makes a great deal of sense, showing how far artificial intelligence still is from human intelligence and what needs to be done to bridge the gap.

  • @DrSirmacek
    @DrSirmacek 4 years ago +9

    I am amazed to hear how he defined attention. That alone has the potential to open up a new stage in AI.

    • @abcdxx1059
      @abcdxx1059 4 years ago +3

      He's just on another dimension. He thinks only in terms of general intelligence, not just getting a few extra points on ImageNet.

    • @MrAlextorex
      @MrAlextorex 4 years ago +2

      If you go to neuroscience conferences, you'll see where he got the ideas from.

    • @psychastheneia7
      @psychastheneia7 3 years ago

      @@MrAlextorex I'm on the job market with a computational background and a PhD in systems neuroscience. I am applying to industry jobs, and the number of jobs that specify neuroscience as a relevant degree is infinitesimal. People still don't realize the extent of inspiration that machine learning draws from neuroscience, and they don't realize that every AI revolution had psychologists and neuroscientists closely attached.

    • @wstein389
      @wstein389 1 year ago

      It worked

  • @franktfrisby
    @franktfrisby 4 years ago +1

    I agree with a lot of what Yoshua is explaining, even the move away from vectors toward sets, which implies attention. Great talk!

  • @lawofuniverserealityanalyt3199
    @lawofuniverserealityanalyt3199 4 years ago +4

    Thank you Yoshua for an interesting and very informative presentation. I was amazed to find how much your comprehensive ideas for making progress in deep learning resonate with an intelligent network simulation model for transportation, where the network is the world model representing the actual (abstract) space; much like the neo-cortex, where thoughts and reasoning are considered to happen (Kurzweil). William James talks about streams of thought (flows of electrons/moving agents); the strength of flow is the capacity of synapses in the embedded network of neurons. A neuron can be as complex as a transportation network (David Eagleman). Learning from the integral of flow (path) heading to an end equals the probability of the end point given a choice (Stephen Hawking); this relates to synapse plasticity, composed of a combination of prior probabilities and causal parameters as factors (Bayesian), minimizing errors. Note that a neuron directed towards an end is a construction which creates all indifferent paths simultaneously, given the learned parameters, by use of the implicate order; pure backprop, which in a way illustrates what intelligence is. These ideas lead to a global transport mind which may create the future in a rational way for the benefit of all travellers. Best wishes, Niels Hoffmann, PaxAssign Limited, Scotland.

  • @Chess_Intelligence
    @Chess_Intelligence 4 years ago +2

    Very informative talk. Nice job.

  • @riagayo4174
    @riagayo4174 4 years ago +1

    I need a causation-based one!

  • @444haluk
    @444haluk 3 years ago

    There are fast and slow unconscious things. Slow unconscious things: a word on "the tip of my tongue", who recalls it if you couldn't? Creativity: aside from playing with things, what do you actually do to come up with an idea? These all happen unconsciously, and slowly. So the "renaming" step has serious issues.

  • @aditya169
    @aditya169 4 years ago +1

    😮