Lecture 1 | The Perceptron - History, Discovery, and Theory
- Published 27 Nov 2024
- Carnegie Mellon University
Course: 11-785, Intro to Deep Learning
Offering: Fall 2019
Slides: deeplearning.cs...
For more information, please visit: deeplearning.cs...
Contents:
• Course Logistics
This was absolutely brilliant. A masterclass in lecture content design.
Very well pieced together -> great flow -> Wow moment towards the end -> evokes a lot of curiosity
This series is the best lecture series on deep learning. I've gone through lots of lectures, but nothing like this. So comprehensive, so insightful, and it provides an in-depth perspective. Thanks a lot, Carnegie Mellon University, for making such great content publicly available.
A piece of advice: go through all the slides of a lecture before watching it. You will be amazed by how much more you are able to grasp.
The professor is great! I really enjoyed his lectures. I really appreciate his ability to convey the information and materials for the class.
There were some minor inaccuracies around neuroscience, which may have been correct a few decades ago, mind you, but overall the lecture was quite good.
Very good, but a comment on 37:29: you would have been correct at the time of this statement, but a year ago a bombshell neuroscience paper was published in the Harvard journal of medicine, which found that the axon also receives information. That somewhat damages the analogy, but may itself be insightful.
Loving this series! Such a talented lecturer.
Best introduction to deep learning I've seen so far.
Around 38:40 it is claimed that the human brain does not grow any new brain cells. The phenomenon 'neurogenesis' disproves that claim. In fact, as I understand it, we constantly throughout our lives create new brain cells, especially through rigorous exercise followed by mental stimulation. Please correct me if I am mistaken.
I think you may be mistaken. He is a very experienced professor at CMU.
I doubt him. Maybe you are right; I don't know for sure.
I hope I can attend a lecture someday as a student at CMU...
Great!
The start was boring, but the last 30 minutes were awesome.
Why do neural networks have hidden layers? If the layers were not hidden, we would know all the parameters, which factors our model is learning from, and how it is learning. We would then have more control over it and more understanding of its decisions; we could even help networks perform better by manually setting weights for hidden-layer neurons we are confident about.
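On the question above: "hidden" only means the layer's outputs are not directly supervised; the weights themselves are fully inspectable and can even be set by hand, as the commenter suggests. A minimal sketch (a hypothetical example, not from the lecture) of a two-layer threshold perceptron whose hidden-layer weights are hand-chosen so the network computes XOR:

```python
import numpy as np

def step(x):
    # Classic perceptron threshold activation: 1 if input > 0, else 0
    return (x > 0).astype(float)

# Tiny 2-2-1 network with hand-set weights:
# hidden unit 0 fires for OR, hidden unit 1 fires for AND,
# and the output fires for (OR and not AND), i.e. XOR.
W1 = np.array([[1.0, 1.0],    # weights from input 0 into the two hidden units
               [1.0, 1.0]])   # weights from input 1 into the two hidden units
b1 = np.array([-0.5, -1.5])   # thresholds: OR fires if sum > 0.5, AND if sum > 1.5
W2 = np.array([1.0, -2.0])    # output weights: OR minus 2 * AND
b2 = -0.5                     # output threshold

def forward(x):
    h = step(x @ W1 + b1)     # hidden activations: nothing stops us from inspecting them
    return step(h @ W2 + b2)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, forward(np.array(x, dtype=float)))
```

This prints 0, 1, 1, 0 for the four inputs. The catch, and the reason we don't normally do this, is that for real problems nobody knows what weights to set by hand; the layers are learned precisely because the useful intermediate features are not obvious in advance.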
Why are pages 66 and 67 skipped? I would like to know about memory in loops.
Done
Great lecture!
Great Lecture...
He says that before 2016, Siri and other systems "pretended" to do speech recognition and their results were a joke. Not really. Speech recognition was already pretty good; I had been using it and was happy. Perhaps he looked at speech recognition prior to around 2010 and mistakenly assumed it remained at the same level until 2016.
Awesome!
"more fat means more smart"
eats while watching this :)
Nice lecture.
SHUT UP FAT BOY
Interesting talk. Now, for the sake of sanity, stop moving the camera!