One thing you should definitely do is try your Perceptron with new data: you can create another 100 Points, exclude them from training, and test the Perceptron on them... Also a very cool way to test your Perceptron is to add a Point at the click of the mouse and let the Perceptron label it (that way, if it's wrong, you're going to see a black point inside the white blob). Anyway, really good work. I love your way of teaching, I love programming, I love machine learning, and out of all the videos, blog posts, and slides that I've read, you are the one who can really make something easy to understand. Never stop being like this! ;)
P.S. If you need it, I made a port to p5.js with these suggestions implemented. :)
Thanks, I'll do this on tomorrow's live stream! (Hope I remember). If you like, you can pull request the p5 version here: github.com/CodingTrain/Rainbow-Code/tree/master/CodingChallenges
The Coding Train Thanks for your great way of teaching! ;)
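Here is a minimal sketch of that mouse-click test in p5.js, assuming a trained perceptron object with a guess() method like the one built in the video (the point size is arbitrary):

function mousePressed() {
  // ask the trained perceptron to label the clicked point
  const inputs = [mouseX, mouseY];
  const guess = perceptron.guess(inputs);
  // fill by the guessed label: a wrong label shows up immediately
  // as a dark point inside the light blob (or vice versa)
  stroke(0);
  fill(guess === 1 ? 255 : 0);
  ellipse(mouseX, mouseY, 16, 16);
}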
Paolo Ricciuti, can you help me out? I am trying to make a game using simple code, but I just can't seem to finish. I have a ball controlled by the keyboard, and I am trying to make it so the ball can't travel any distance greater than the window, so it can't leave the window. Also, I am trying to detect lines (so if the ball hits a line, it loops back to specific coordinates). Thank you.
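A rough p5.js sketch of one way to do both things; the ball object, the speed, the x = 300 line, and the reset position x = 30 are all made up for illustration:

let ball = { x: 200, y: 200, r: 15 };

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(0);
  // steer the ball with the arrow keys
  if (keyIsDown(LEFT_ARROW)) ball.x -= 3;
  if (keyIsDown(RIGHT_ARROW)) ball.x += 3;
  if (keyIsDown(UP_ARROW)) ball.y -= 3;
  if (keyIsDown(DOWN_ARROW)) ball.y += 3;
  // constrain() clamps the position so the ball can never leave the window
  ball.x = constrain(ball.x, ball.r, width - ball.r);
  ball.y = constrain(ball.y, ball.r, height - ball.r);
  // "line detection": if the ball reaches the vertical line, send it back
  stroke(255);
  line(300, 0, 300, height);
  if (ball.x + ball.r > 300) {
    ball.x = 30;
  }
  noStroke();
  fill(255);
  ellipse(ball.x, ball.y, ball.r * 2);
}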
Paolo Ricciuti How do you detect a color in an if statement? I am trying to make it so that if ellipse/x is greater than (color), the ball loops back to 30.
Cool... can I get a hold of that?
your speaking skills, your interactiveness, your teaching ability...best online teacher I've ever met
*mated
18:05 that pen flip was so smooth lmfao
First impression: 44 minutes? Are you crazy? I'm not going to waste my time on a single video!
After watching: (quietly clicks subscribe button) ....
Haha that was me.
I didn't even realise that his video was 44 min .. I just checked after reading your comment
True 😂
"How to train your Perceptron" (2017)
IMDb 8.7/10
I'd watch.
Efferto93 only an 8.7? You better change that to a 10/10
haters gonna hate man
@Toochilledtocare-_- it will be in a bit
Has a little something for everyone
Love that you focus more on the conceptual side of programming than the nitty-gritty details. It's really annoying to see all these other videos that do something like "we have A, then we have B, then voila! life!" I've honestly been waiting for a video like this for a long time.
Appreciate the feedback, thank you!
There are a ton of videos that pick out some complex topic and spend 90% of the time explaining something you should have learned in school, or something that's rather obvious.
Whoever you are, thank you very much... I love watching all your videos and learning new stuff.
hmnnn....who r u?
who
Don't tell this secret to anyone: he is Batman.
I'd recommend this to everyone over any movie
MukulNegi Awesome comment
I agree w jahmahent. This would
This is the clearest explanation of machine learning that I have ever watched.
I can't believe this video was released today! I just started working on a problem at work (internship) that needs an (albeit more complicated) neural network to solve! This was a perfect primer to help me really understand the basics! The fit(), train(), and activate() functions in scikit-learn seem far less magical and way more accessible to me now!
THANK YOU SO MUCH!
Keyboard Bandit - Did you solve your problem? 😁
Thank you so much, Daniel. I have never studied coding formally. Started with watching your coding challenges and I'm happy to say that this amazing and simple explanation is exactly what I needed to start my journey into machine learning. You've inspired me and taught me. Thank you.
The suspense built by the code and then watching it all work so well was amazing. It was also a very good idea to show the progress with each click.
hah, glad to hear this feedback!
I love this series so much! Most machine learning / neural network explanations and tutorials are either designed for 5th graders or people with a college degree. The mathematical parts and coding are perfect for a high school CS student. Thanks so much for finally making me understand backpropagation!
This is how powerful Processing can be in understanding a concept. Great video Dan, as always.
This is exactly what I was looking for! Explaining what is actually happening behind the scenes in ml5 and TensorFlow, and how it all works.
Thank you so much for this!
Dude, I cannot thank you enough for diving into this concept in such an engaging way. It beats trying to break it down from a completely algebraic standpoint
OMG - I am going through the Coursera ML course and your video is simply amazing. I cannot thank you enough for offering these videos for free. Love your fun-filled way of going through NN concepts one at a time.
You are the best teacher I've ever seen. Thanks for sharing your knowledge
You are the only person on YouTube who actually, practically shows how this all works. Actually coding it.
I have become accustomed to listening long lecture videos at ~2x speed. And watching your videos at 1.75x hits the sweet spot.
That was a brilliant intro to neural networks, #The Coding Train.... For those who are confused about how the multiple iterations happen during the learning process, the key is the draw() function, which runs in an infinite loop. It took me a while to figure this out. So if you are using something other than Processing, you need to run a loop in your training algorithm.
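To make that concrete, here is a rough JavaScript sketch of such a loop, assuming a perceptron with guess()/train() methods and a points array like the ones built in the video. It also answers "when does the looping stop": a natural choice is when one full pass over the data produces no errors.

function trainUntilPerfect(perceptron, points, maxEpochs = 1000) {
  for (let epoch = 0; epoch < maxEpochs; epoch++) {
    let errors = 0;
    for (const pt of points) {
      if (perceptron.guess([pt.x, pt.y]) !== pt.label) errors++;
      perceptron.train([pt.x, pt.y], pt.label);
    }
    // a full pass with zero errors means every point is classified correctly
    if (errors === 0) return epoch;
  }
  return maxEpochs; // gave up; the data may not be linearly separable
}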
I am 17 and only started self-teaching Java 2 weeks ago. This helped a lot to get a better understanding of how the language works and also how ML operates (which is something I am very interested in). The fact that you were somehow 100% understandable to me, considering the amount of experience I have, is actually phenomenal. Also, I was surprised to see you are still interacting with comments.
Subbed and please make more of these
Glad to hear!
Thank you Daniel, that really helped me to get into the topic. I like the idea "if you can code it, you kind of understand it".
Hey! A big thanks for these lectures. I was always struggling to understand what a neural network really is. You made my life simple. Although I code in Python, I love watching your p5 videos.
Thank you, whoever you are!!!
I am highly motivated by your videos... the way you speak, the humour you have, the knowledge you've earned is really... appreciable 👌👏👏
I try giving PPTs in front of the mirror like you...😊
Also followed along, but doing it in JavaScript. Cool little project. I trained it on 500 circles and then did a test set of 100. I used frameRate instead of mouse clicks to slow the animation. Occasionally I got some weird oscillating behaviour at around 470-480 correct on the training data, but usually it got there quickly. Also, once trained, it normally got all 100 test data points in one epoch, but sometimes got stuck at around 98/99 correct (that only happened in around 10% of attempts). I watched the self-driving car video and that sent me back to Flappy Bird, and now neural nets, before doing Flappy Bird neuroevolution and then finally back to self-driving cars. Once I've got a handle on this in JavaScript, I want to do it in something like Unity using C#, which will need me to learn Blender to make my circuit. Much more fun than watching endless hours of Netflix whilst in lockdown! Happy coding, all.
Starting to really dig your videos. This is the first perceptron tutorial to point out that X_0 and X_1 represent x and y, a data point. I've been looking at neural networks for 2 weeks now, and finally I know this distinction! Really helps! Thank you.
If [X_0, X_1] is [x, y], we are inputting a data point from a two-dimensional space, correct? If we input [X_0, X_1, X_2], we can say this is equivalent to [x, y, z], which is a data point from a three-dimensional space?
Why stop there? We can input many data points at a time, right? For example, we can input two vectors, i.e. a two-dimensional array: [X_0, X_1, X_2] and [X_3, X_4, X_5]. So now we are inspecting two data points from a three-dimensional space. Is this right? Am I using the correct vocabulary here? Thanks!
Yes, you can send any vector of any length into a perceptron, and have that vector represent any data you like. Whether it will do what you want it to do in terms of the intended outputs that's another story and why you might need a "multi-layered perceptron"! (Coming in the next videos if you keep watching.)
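To make the "vector of any length" point concrete, here is a minimal plain-JavaScript version of the video's Perceptron where the input dimension is just a constructor argument (the names and the 0.01 learning rate are illustrative choices, not from the video):

class Perceptron {
  constructor(n, learningRate = 0.01) {
    // one weight per input dimension, initialized randomly in [-1, 1]
    this.weights = Array.from({ length: n }, () => Math.random() * 2 - 1);
    this.learningRate = learningRate;
  }
  guess(inputs) {
    // weighted sum of the inputs, then the sign activation function
    let sum = 0;
    for (let i = 0; i < this.weights.length; i++) {
      sum += inputs[i] * this.weights[i];
    }
    return sum >= 0 ? 1 : -1;
  }
  train(inputs, target) {
    const error = target - this.guess(inputs); // 0 when right, +/-2 when wrong
    // delta rule: each weight moves by error * its own input
    for (let i = 0; i < this.weights.length; i++) {
      this.weights[i] += error * inputs[i] * this.learningRate;
    }
  }
}

// works the same for 2D points [x, y] or 3D points [x, y, z]
const p3 = new Perceptron(3);
console.log(p3.guess([0.2, -0.5, 0.9]));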
Thanks! I'm taking the Udacity Machine Learning Engineer course, but I find supplementing it with your videos clears up a lot of questions.
This is the first video of yours that I've seen and I must say, I am enthralled by the way you made me understand the Perceptron. Never have I seen anyone explain it in such an intuitive, easy, and clear manner. I subscribed immediately and I am going to go through all your machine learning videos. Thank you very, very much. I can't possibly explain how much I'm spellbound and impressed by this video. I wish you were my college professor.
Such nice feedback, thank you!
Congratulations!! You have explained very well the basic of perceptron model.
Really nice to see someone writing out the necessary code on the spot, with minimal bells and whistles. Too often it's presented alongside formalism that's much more intimidating than it needs to be.
Good day - I noticed your Nick Cave t-shirt. I have loved Nick Cave ever since I was a young guy - early 1980s.
Thanks for your videos - very helpful.
I watched some videos way back when the channel started, must say you've definitely improved the format and pacing. Great work!
I don't understand Java, but I still got 100% of the content from your video.
That is the power of a skillful teacher.
I can't stop watching your videossss, you're a great teacher!!
I don't understand Java, but this is still the best video, and clearer than the ones for languages I've used before.
Great video! Neural networks can be confusing but I like that you started with a simple example.
Mark Jay spotted
Love your videos
Amazing explanation. Saved me time searching for all this stuff. Really a great masterpiece. I need to make a project on deep learning in just 3 weeks. Your lectures are helping me. God bless you.
This is the most diverse course, man. He starts off with JS, now Java, and I believe he'll show Python, all on the same concept. This is actually great.
I'm having +fun!
As always, great video. Your energy and enthusiasm is contagious! Please never stop being so amazing.
Thank you, sir. The best place to learn NNs explained in the simplest way. Keep up the good work. Looking forward to more videos, sir.
All I had was positive fun the whole video.
It's hard to watch this man's videos without smiling.
Wow, that is an amazing video that just fired me up to get into the machine learning flow. I am definitely going to try this. I am more interested in watching how you can make its path based on 360° rotation, with some pros and cons implemented, like obstacles, or food to eat, or stuff like that. I need more insights before I jump into writing bots for my own game that are hard to beat.
💯 Someone needs to offer this man his own show to reach more people.
watching "The Coding Train" videos and "Andrew Ng" Courses really helps.
whenever i get lost i come here to understand the concept better
even if he is using Js and Andrew using python , the idea is the same
You have a very engaging instruction style and explained this concept very well. Great video.
Thank you!
Really cooooooooooool, I love it. I understood many things for my project in January. Later I'll look up the sigmoid neuron model. You're a great explainer!
Amazing video! You are a genius at kick-starting the idle minds of those of us with average minds who wish to fight for the most challenging things. KUDOS!!!
@23:20 +Fun increases your fun potential, -Fun spends it. It's like climbing a slide, you have fun climbing, you got even more sliding it down!
If anybody still reads the comments, I have a question:
27:12 Why exactly is deltaWeight = error * input? The error part is obvious, of course, because it's the difference between the correct answer and the guess, but why does it need to be multiplied by the input?
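A sketch of the textbook answer (the standard delta-rule derivation, not something stated in the video): treat the pre-activation output as linear and do gradient descent on the squared error of a single training point, and the input factor falls straight out of the derivative:

y = \sum_i w_i x_i, \qquad E = \tfrac{1}{2}(t - y)^2
\frac{\partial E}{\partial w_i} = -(t - y)\,x_i, \qquad \Delta w_i = -\eta\,\frac{\partial E}{\partial w_i} = \eta\,(t - y)\,x_i = \eta \cdot \text{error} \cdot x_i

Intuitively, x_i measures how much weight w_i contributed to the wrong answer, so weights attached to large inputs get corrected more, and weights attached to near-zero inputs (which could barely have caused the error) are hardly moved.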
I am taking a ML course now with the most horrible lectures imaginable. This is a godsend!
I'm watching this a year later and I'm still having plus fun. Not negative fun.
4:28 Saying the point is at coordinate (x0, x1) is valid and solves the issue.
39:00 I think a cool way to visualize the learning process in this particular example would have been to color every pixel of the canvas according to whether the perceptron got it right.
Nice video again though, I'm really looking forward to the next video on this topic =)
Cheers
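That visualization is only a few lines in p5.js — a rough sketch, assuming the video's perceptron object is in scope (the stride and the two grey shades are arbitrary):

function drawDecisionRegions() {
  // paint each region by the perceptron's current label for that point
  noStroke();
  for (let x = 0; x < width; x += 4) {   // stride of 4 pixels keeps it fast
    for (let y = 0; y < height; y += 4) {
      const label = perceptron.guess([x, y]);
      fill(label === 1 ? 230 : 60);
      rect(x, y, 4, 4);
    }
  }
}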
Finally somebody who explained it the right way
Great work as always Dan, can't wait for full Neural Networks!
Thoroughly enjoyed it, thank you! I used to learn Processing from you and now neural networks 👍
Another great video 🤗👌
Thank you!
^^^
The Coding Train How do you detect a color in an if statement? I am trying to make it so that if ellipse/x is greater than (color), the ball loops back to 30.
You could make the ellipse an object and make its colour a this. property, and then compare one of its channels, e.g.
if (red(ellipse.colour) > someThreshold) {
  // do something, e.g. send the ball back to x = 30
}
You deserve a gold medal...
You have the best tutorial for neural networks!!
Finally some processing, much appreciated
I love you so much for your tutorials!
Very interesting use of random in your code examples. Thank you.
Excellent explanation and demonstration. Keep it up 🙏👌
This is great. Thanks for breaking it down into simple form!
If I had 15ft of pure white snow I’d watch every single video you have published
Soooooo cool: Professor Shiffman is wearing a Nick Cave and the Bad Seeds t-shirt! Yeah!!!
18:08 *successfully throws and catches marker* 10/10
Best explanation I found on NNs. Thank you!
YASS! I finally understood what all of this meanssss.... Thank you Mr. Shiffman. Subbed right away
Thank you so much for explaining what is going on with a neural network. So many people dive straight into code or go off into their own math world. I'm a little confused about when the looping stops.
You are really good at the art of teaching :)
18:22 That is not the line y = x. Rather, it corresponds to the other diagonal, because in the computer's axial orientation the y-axis points down.
Great! I just started this topic in my AI course at university.
Thank you!
You are the best, end of the discussion.
Yes, awesome, finally onto a perceptron!
Keep Going!
Is it just me, or does everyone else LOL while watching this guy but still learn a ton from listening? Coolest teaching style.
Damn it! I watched it till the end. Very good explanation 👍
Finally found the YouTuber I was looking for!!! 😄
Awesome as always Dan.....btw I LOVE Nick Cave and the Bad Seeds !!!
Thanks!
Thank you for the support!
How can I be so proud of a cell that is doing the job OF AN IF STATEMENT?
Neural Netw... Perceptron Power !
Long time no see, sign function! I didn't remember that it tells whether a number is negative or positive (as those are signs, so yeah, sign()). I really like this; it's easy for me to convert to JavaScript. Thank you so much!
27:21 Why is the input multiplied by the error value? I didn't get that part. Can anyone please help? 🙏
Why do we multiply the error by the input in the "dweight" calculation? I understand that the error is the difference between the guess and the target, but I don't see how that value becomes relevant to updating the weights when you multiply it by the inputs.
Can anyone help me make the logic jump? 27:22
43:06 you can do this particular example in just 15 lines of MATLAB code with the linprog function:
A1 = rand(100,2);                       % candidate "inner" points in [0,1]^2
A2 = 2*rand(120,2);                     % candidate "outer" points in [0,2]^2
A1 = A1(sqrt(sum((A1').^2)) < 1.4,:);   % keep points with norm below 1.4
A2 = A2(sqrt(sum((A2').^2)) > 1.4,:);   % keep points with norm above 1.4
N1 = size(A1,1);
N2 = size(A2,1);
scatter(A1(:,1), A1(:,2), 'b')
hold on
scatter(A2(:,1), A2(:,2), 'r')
axis equal
c = zeros(1,3);                         % pure feasibility: nothing to minimize
A = [-ones(N1,1) -A1; ones(N2,1) A2];   % encodes x(1)+A1*x(2:3) >= 1 and x(1)+A2*x(2:3) <= -1
b = -[ones(N1,1); ones(N2,1)];
[x,fval,exitflag] = linprog(c,A,b);     % any feasible x is a separating line
ezplot(@(a1,a2) x(1)+a1*x(2)+a2*x(3),[0 2]);  % plot the line x(1)+a1*x(2)+a2*x(3) = 0
I think nobody noticed, but you were wearing a Nick Cave and the Bad Seeds t-shirt, hahah. Great artist.
Both you and Siraj are great in so many ways. Both charismatic, both very intelligent, both very funny... But Siraj doesn't have your talent for teaching and building a solid bridge of information between you as the teacher and us as students, with YouTube as the medium.
Thank you so much for step by step simple explanation. Was very descriptive.
Thank you for the nice feedback!
I love the horror on his face at 35:00 when he realizes his code works :D We've all been there before, buddy, believe me.
Hey guys, I have a small question here...
At line 29, time 33:02, he writes a for loop that is supposed to train on each point, and it seems to work. BUT if I understand correctly, every point shares the same weights, so I wonder: when he trains on one point, could that make some other points that were already classified correctly become wrong?
Shouldn't he write a second for loop inside this one to verify that the points before the one he is training on are still good?
You are an excellent teacher!
I love how you naturally write p5 instead of pt
While I was doing this with you, almost at the end of the video, Processing crashed and I lost the code. Good thing it's the first video and easy to recreate.
I'm having trouble understanding, in your delta weight formula, why you are multiplying the error by the input. On some level it makes sense that it isn't just the error, but why the input?
It is explained here: en.wikipedia.org/wiki/Delta_rule
I have the same problem; it just doesn't explain to me why the input is included too...
I want a simple explanation as well. The wiki link is too engrossed in math, just like this link: stackoverflow.com/questions/50435809/perceptron-training-rule-why-multiply-by-x
If you have found a simple explanation, please do let us know. I remember from somewhere (it might be Andrew Ng's course) that multiplying by x automatically causes the algorithm to descend towards the global optimum.
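For a less mathy take than the links above, you can just watch the update work on one made-up point (plain JavaScript, sign activation as in the video; the starting weights, the input, and the 0.1 learning rate are all invented):

let w = [-0.3, -0.8];   // deliberately wrong for this point at first
const x = [2.0, 0.5];   // made-up input
const target = 1;
for (let step = 0; step < 5; step++) {
  const sum = w[0] * x[0] + w[1] * x[1];
  const guess = sum >= 0 ? 1 : -1;
  const error = target - guess; // 0 when right, +/-2 when wrong
  // each weight moves in proportion to its own input, so w[0] (input 2.0)
  // is corrected four times as fast as w[1] (input 0.5)
  w = w.map((wi, i) => wi + error * x[i] * 0.1);
  console.log(step, sum.toFixed(2), guess, error);
}

The weighted sum climbs from -1.00 to -0.15 to 0.70, and the guess flips to the right answer after two updates. An input of zero would leave its weight untouched, which is exactly what you want, since that input contributed nothing to the error.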
That's incredible. Took a couple of tries, lol. "Guesses a -1 or 1 as labels", "learns to label the two more accurately for each point, each loop". Trippy. PlayStation steers us to a better solution; I agree on that one.
I learned so much, so thank you a lot!
I L-O-V-E your teaching! Thank you so much, I finally get it!
Actually the first AI tutorial that I understood.
Why are the new weights multiplied by the value of the input in the train function?
If you pass it a point very close to (0,0), it will be harder to train, right?
Just a question: when you are continuously tuning the weights whenever the perceptron's guess does not match the expected output, can this delta change of weights negatively impact the previous inputs for which the error was zero and the guess matched the expected output? In other words, is it impossible to have an ideal set of weights where the error becomes zero for all the provided training data?
The formula is y = Σ(x_i · w_i) + b. I think in your video you didn't mention the bias? Is the bias compulsory to add or not?
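On the bias (and the earlier question about points near (0,0)): without a bias term the decision line is forced through the origin, so yes, it matters for data like this. A common trick, sketched here against the minimal Perceptron posted earlier in this thread, is to fold b in as one extra input pinned to 1, so its weight plays the role of the bias:

// three weights: one for x, one for y, one for the constant bias input
const p = new Perceptron(3);
const point = [0.5, -0.2, 1]; // made-up point (x, y) plus the fixed 1
const label = p.guess(point);
p.train(point, 1);            // the third weight learns b automatically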