@@PythonSimplified Really, really, thanks for your explanation. It's a huge help for my self-study of Python. Combining math with code is amazing... it would be great if, at the end of this video series, we could build some real application with Python in machine learning. My best wishes to you!
Hi, thank you for a great explanation ... why do we need log functions here? Thank you and greetings from Germany ... I am a Belarusian source by the way :) Only one un-armored SUV at my doorstep ... oh no ... it's my wife ... but she demands clarifications too :))))
For some reason when I run your version I get a domain error, but this works:

def cross_entropy_loss(w_sum, target):
    return -(target * math.log(w_sum) + (1 - target) * math.log(1 - w_sum))

loss = 0
for entry in input_data:
    print('cost={}'.format(cross_entropy_loss(entry[0], entry[1])))
    loss += cross_entropy_loss(entry[0], entry[1])

print(loss / len(input_data))
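The domain error most likely comes from math.log receiving a value outside its domain: it raises for arguments <= 0, which happens whenever a prediction lands at exactly 0 or 1 or outside (0, 1). One common workaround, sketched here with natural log and an arbitrary eps value (not from the video), is to clamp the argument first:

```python
import math

def safe_log(x, eps=1e-12):
    # math.log raises "math domain error" for x <= 0,
    # so clamp the argument strictly inside (0, 1)
    return math.log(min(max(x, eps), 1 - eps))

def cross_entropy_loss(prediction, target):
    return -(target * safe_log(prediction) + (1 - target) * safe_log(1 - prediction))

# a fully confident wrong prediction would normally crash on math.log(0)
print(cross_entropy_loss(1.0, 0))  # large but finite loss instead of an exception
```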
@@TimonNetherlands Hahaha I wish my Russian was good enough to teach other people!! I left Crimea when I was 6 years old, and even though I've been speaking Russian with my family ever since - my vocabulary is stuck in kindergarten 🤣🤣🤣 Plus my accent is horrific, I need some lessons myself! hahaha 😅
Hi Aaron! 😀 Yes, absolutely! Django is on my near-future "to do" list (including SQLite stuff), I'm just hoping to cover Flask first as it's a bit more beginner-friendly. I'll post an update vlog in a few days, where I'll go over the upcoming tutorials and when to expect them 😉
..and it's _binary_ cross entropy loss, because there are only two options True/False? thanks for the video! thank you for not explaining cross entropy loss talking about the weather forecast! :O too bad people tend to talk more about your nail polish than about the quality of your video :)
I'm not surprised! 😊 One of the definitions of entropy is disorder, or a lack of predictability, and it's a common term across many scientific branches. In the case of Cross Entropy it refers to the target not matching the prediction, so our outcome remains unpredictable 🤓
No worries Siam! you can always watch it later! 😉 There's actually gonna be a new update + frequently asked question video coming next week, where there's a nice surprise waiting for you! 🎁
I don't get it at all. If we already have data that makes it 1 or 0, why do we need to calculate it yet again? This should be included to get 1 or 0 like always, LOL. And chicken=0.9: confirmed it's chicken; chicken=0.5: is it chicken?; chicken=0.2: still might be chicken. Can you do lottery prediction from the last 10 draws, or all of them? lol. It's just numbers, it's easy xD. Well, you can't - you're still doing youtube videos. No success prediction
@NASA, here's the info you need to land your rockets on Mars, don't let anyone know you got the info from a Russian source.
hahahahahaha that's my favourite comment so far, Tobs!!!! 🤣🤣🤣🤣🤣
Before you know it - 10 armored SUVs with blackened windows park beside my building demanding clarification 😂
@@PythonSimplified That's ok, modify your model to calculate the likelihood of that happening and you're good :D
Also, you should ask for drivers too, otherwise you'll have to tow them yourself for your convoy, hahaha. Be careful what you demand 😎😎
How much time do you take to reply? Because every reply of yours is humorous and nice. Good.
This is awesome, Ma. Before watching this I had spent several hours trying to understand the loss function, but in less than 12 minutes I grasped the whole concept.
Next should be cross validation
Thanks a million
From Nigeria
Your simplicity demonstrates your great intelligence. This is the best video I've seen on this topic.
This series of videos is just amazing. Can't thank you enough for the time spent on them.
That this has any downvotes baffles me. The material and topic are really well presented.
A video on back propagating the cross entropy loss via code would be super helpful for how the weights and bias get updated. This video was def helpful in understanding the forward propagation
I was very lucky to discover your channel
You're fantastic... your explanation is great
Thank you so much, dexteuse freeman! 😀
I was worried this one may have too much math involved, so I'm super glad you liked the explanation!!! 😊
a nice girl teaching me cross entropy was all i needed rn
I was getting different values in Google Calculator vs. Python. Then I realised I had used log() instead of log base 10.
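For reference, Python's math.log() is the natural log (base e), while math.log10() is base 10 - a quick check:

```python
import math

x = 0.5
print(math.log(x))    # natural log (base e): about -0.6931
print(math.log10(x))  # base-10 log: about -0.3010

# the two differ only by a constant factor: log10(x) = ln(x) / ln(10)
print(math.log(x) / math.log(10))  # matches math.log10(x)
```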
Awesome explanation!!
I'm a complete beginner and I thought I had fully understood the concept until I watched your video, hehehe... Thank you very much! The quality of your content is invaluable - from the editing, the colors, and the whiteboarding to the whole presentation and examples! My favourite YT channel 😭 thanks again!
PS. I also used the navbar on the right on my computer 😅👏🏽
Thank you so much for the incredible comment, Gaby!! I'm super happy you enjoy my tutorials!!! 😀😀😀
I find that the best way to understand ML algorithms is to take a pen and a piece of paper and do it manually without any computers involved 😉
Take the most basic example input such as: [0.2, 0.4, 0.1] and trace it layer by layer until all the formulas settle in and become intuitive.
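The same hand-trace can be double-checked in a few lines of Python - the weights and bias below are made up purely for the exercise:

```python
# hand-tracing the input [0.2, 0.4, 0.1] through a single neuron;
# the weights and bias are hypothetical - any values work for practice
inputs = [0.2, 0.4, 0.1]
weights = [0.5, -0.3, 0.8]
bias = 0.1

w_sum = bias
for x, w in zip(inputs, weights):
    w_sum += x * w  # accumulate 0.2*0.5 + 0.4*(-0.3) + 0.1*0.8 on top of the bias

print(round(w_sum, 2))  # 0.16 - the same number you would get on paper
```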
Simple, clear. Gracias señorita
Thank you Daniel, glad you liked it! 😊
Love from Zimbabwe 🇿🇼
Hi Takesure! Wow, it's the first time I've seen a comment from Zimbabwe here! 😀😀😀
Cheers from up North in Canada!!! 😊
Great video! I have never actually imported the math module in my Python career! Will try tonight.
I must mention that NumPy beats the math module in a heartbeat!! 😉 So you may have made a very wise decision not familiarizing yourself with it hahaha 🤣
It's definitely nice if you're doing simple tasks such as .log10() - but Numpy rules when it comes to complex calculations!
@@PythonSimplified Yes, I use Numpy a lot. I try to avoid Pandas if I can.
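A minimal sketch of the difference: the math module works on one scalar at a time, while NumPy applies the same operation to a whole array in a single vectorized call:

```python
import math
import numpy as np

values = [0.2, 0.4, 0.1]

# math module: one scalar at a time, via an explicit loop
logs_math = [math.log10(v) for v in values]

# NumPy: the whole array in a single vectorized call
logs_np = np.log10(np.array(values))

print(np.allclose(logs_math, logs_np))  # True - same numbers, no Python loop
```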
You introduce Loss Function @1:35 with an example of distance from Squamish to Vancouver. That depends on how many routes are involved. If there's only one route, it's like a number line. If there are multiple routes, shortest, longest, and average can be calculated. But you introduce a Loss Function formula @3:39 that does not reflect that first example. What kind of math background is needed for beginners to unpack this formula that (I think) many find as not intuitive?
This girl should have a lot of subscribers. Here is my like, good job.
Thank you so much Alfredo, I couldn't agree more!!! 😁
(even though I never imagined I'd have 40,000 lovely people watching my tutorials! 😉 It's still shocking to me!)
Yep, incredibly clear and well paced. Incredible teacher. Especially considering English isn't her native tongue... but I suspect she has a greater grasp of English than most English speakers!
You really have a gift for teaching thank you so much I am usually pretty lost when watching videos like these
Amazing explanation, you are awesome teacher
Congrats on 40k!!!
just saw your web scraping tutorials and immediately subscribed :))))
one request- Please make tutorials on Machine Learning
Thank you so much Abhinav! Welcome aboard! 😁
I actually already have a bunch of videos about Machine Learning and I'll definitely be filming many more - check out my playlist 😉:
th-cam.com/play/PLqXS1b2lRpYTpUIEu3oxfhhTuBXmMPppA.html
I like the way you explain these concepts. It isn't easy, but you're a very good teacher. Again, congrats on your goal and congrats on never surrendering. 🧡
Thank you so much for the incredible feedback!! 😁 I'm trying to do my best to make Machine Learning more approachable to as many lovely people as possible! 😀
Might be the best channel I'm familiar with that has such simplified and great content. I like all those visual effects and accessories; it looks like you put some serious effort into those vids.
Hope you will be rewarded as you deserve, and I will try to spread the word :)
This video was helpful. I actually understand this loss function now. Thank you.
Girl coder! HOLY SMOKES Girl coder! Love your content!
hahahaha thank you! 😀 there are actually more of us out there than you may think 😉
Only in my close family circle there are 3 (well, including myself 😊)
😇 I feel lucky that I found your channel - I'm very impressed by the way it is explained. Thanks a lot for the video; I hope that all the machine learning concepts, and the concepts related to data science, will be coming in upcoming videos. I have gratitude for your dedication.
When steam is in the task bar, you know what's up :)
Thanks for the video.
hahahaha using my computer for anything other than gaming is like driving an AMG S65 at 20 km an hour!! 🤣🤣🤣
Thank you! 😊
Love Your Information and Presentation, Sweetie ~
The Nail-Polish IS YOU To Boot ~
Thank you so much, it was definitely one of the most complex videos I've filmed so far, so I'm super glad you liked it! 😀
Apparently the nail polish ended up matching the colour of the background graphics hahaha
Thank you so much for making it fun and easy to understand. Looking forward to more machine learning :D
Thank you so much Aamir, I'm super glad you found this fun and easy! 😁
Next on the menu - Gradient Descent and Backpropagation algorithm 😉
Engaging and enlightening.. Thanks a ton
You are such an incredible teacher! 🏅
Thank you so much Matt!! 😁 I'm glad you like my explanations! 😊
Nice conceptualization :D
In real ML problems though, with bigger datasets, you will want to vectorize the for loop to compute the loss, either using tensors (pytorch, tensorflow...) or numpy. It will be much faster.
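A sketch of what that vectorization could look like in NumPy (using the natural log here, which differs from the video's log10 only by a constant factor; the clip bound is an arbitrary choice to keep the logs defined):

```python
import numpy as np

def cross_entropy(predictions, targets):
    # vectorized binary cross entropy: NumPy applies the logs
    # and arithmetic element-wise, with no Python for-loop
    p = np.clip(predictions, 1e-12, 1 - 1e-12)  # keep log() arguments valid
    return -np.mean(targets * np.log(p) + (1 - targets) * np.log(1 - p))

predictions = np.array([0.9, 0.2, 0.8])
targets = np.array([1, 0, 1])
print(cross_entropy(predictions, targets))  # one number for the whole batch
```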
Have you ever used C++ for machine learning? I only use Python with Jupyter notebooks
very good video;
very good series;
very beautiful smile!
...You're taking an arguably complex subject
and, I must humbly opine, made it
much more approachable for most
Yeeeeey!!! I'm so happy to hear that!!! 😁😁😁
I've struggled so long with finding simple information about AI & Machine Learning a few years back and it was extremely discouraging! When you're passionate about something - nothing should prevent you from exploring and expanding your knowledge, but it sure can take much longer when the information is intentionally complex.
You totally got it, my friend! my goal is to convey everything that took me 2 years to learn in much less time! I'm really hoping to make other people's AI journey friendlier than mine 😅
@@PythonSimplified God Bless.
my journey into ML (& Py) has been 'forced' by virtue of my primary project of crafting a 'better' (alternative) script [glyph set] for writing English [& other languages] more 'efficiently' - though somewhat subjective, I will soon publish a book (or books) respecting that project; I'd be delighted to then [later this year] gift a copy of my book set to you.
This was explained perfectly! Thank you so much!! :)
Thank you Mariya very much. But shouldn't we replace w_sum with y hat in the equation of log loss?
Thanks again for another awesome helpful tutorial!
Although I'm late 🙇♂️, I'm present, Ma'am 🙋♂️.
hahahaha ok, then you won't lose any score on attendance! 🤣🤣🤣
Amazing!!! Congratulations, thanks for explain in details 👏👏👏
Thank you so much for the lovely feedback Michael! I'm really glad you liked it! 😀
Great work, thanks for your effort. The topic was clear and well explained. =)
Thank you for fantastic explanation 🔥🔥🔥
Excellent tutorial.
I think I'm in love... with Python! 💘🐍
Very informative, thanks
very well explained... I have one confusion... based on what fact do we need to select the threshold value?
Thank you so much Sagar! 😃
I don't believe there's a strict rule to this - it usually depends on the data you're working with and what outcome you're trying to achieve.
A good solution is probably to test a few different threshold values on a small fraction of your data and select the best one depending on the outcome.
Good luck and I hope it helps 😊
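That threshold search could be sketched like this - the scores and labels below are hypothetical validation data, and accuracy is just one possible selection metric:

```python
import numpy as np

def accuracy(scores, labels, threshold):
    # classify as 1 whenever the model's score clears the threshold
    predicted = (scores >= threshold).astype(int)
    return (predicted == labels).mean()

# hypothetical validation scores and true labels
scores = np.array([0.2, 0.6, 0.8, 0.4, 0.9])
labels = np.array([0, 1, 1, 0, 1])

# try a handful of candidate thresholds and keep the best-performing one
candidates = [0.3, 0.4, 0.5, 0.6, 0.7]
best = max(candidates, key=lambda t: accuracy(scores, labels, t))
print(best, accuracy(scores, labels, best))
```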
hi, how do you deal with a weighted sum greater than 1 which would mean np.log10(1 - weighted_sum) would return nan and lead to exceptions?
I have a question. Why does she directly insert the weighted sum into the cross entropy loss function and not the activation function or the output of that sum? For me it works with both the weighted sum and the sum after passing the sigmoid activation function. I don't know which is better or correct...
Update: For me using "y hat" instead of the weighted sum works better
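Using ŷ indeed avoids a whole class of errors: a raw weighted sum can fall outside (0, 1), where the logs are undefined, while the sigmoid output is always a valid probability. A minimal sketch (the weighted-sum value is made up; natural log is used here rather than the video's log10):

```python
import math

def sigmoid(x):
    # squashes any real number into the open interval (0, 1)
    return 1 / (1 + math.exp(-x))

def cross_entropy(y_hat, target):
    return -(target * math.log(y_hat) + (1 - target) * math.log(1 - y_hat))

w_sum = 2.7                # a raw weighted sum - may fall outside (0, 1)
y_hat = sigmoid(w_sum)     # about 0.937, always a valid probability
print(cross_entropy(y_hat, 1))  # safe: both logs receive values in (0, 1)
```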
You explained this concept very well! Subscribed! :)
thank you ma'am for making this video
Hey mate, can you do a video on making a Python script which listens to audio input from a Bluetooth speaker's mic and streams it to our PC?
Hi Abhishek! 😀
There should be a bunch of existing software that can do it... or did you mean voice recognition?
Like analyzing the sound and classifying it or converting it into strings?
@@PythonSimplified not voice recognition, just grabbing the voice from the speaker and sending it back to our pc.
Thanks for recreating it.
Absolutely, Alireza! Thanks for watching! 😁
Could you take us through the log formula used - what's the story around it, how was it formulated? Everything else is well explained.
Please 🙏 suggest the book you followed.
There are some really nice books out there, but I can't say that I followed one of them... I accumulated lots of information over the years and the more different angles you are exposed to - the better you understand!
With that said, my favorite work regarding the Perceptron is:
hagan.okstate.edu/4_Perceptron.pdf
It's by Martin Hagan of the Oklahoma State University and it's absolutely fantastic!!!
I really wish I found it earlier, it would have saved me lots of research 😉
Thanks a lot.
OK, my brain went into shutdown halfway through this ;) I think you lost me at logarithms. But you're right: things I left behind in highschool are scary - but not unfathomable. I mean, I did pick up trigonometry after 20+ years - and now I'm the happiest little monkey, programming 3D spheres in CSS and Archimedean spirals in SVG. Both of which are vain, futile, totally unnecessary and utterly impractical - but I got over my video game addiction because this is just more fun :D
How are the weights determined?
You usually start with random weights and then you adjust them with Gradient Descent 😃 So you would usually create a NumPy array in the shape of your weights and then fill it with random values.
I demonstrate the entire training process, including Gradient Descent, in another tutorial of mine:
⭐️ Train a basic neural network with NumPy and Pandas:
th-cam.com/video/xpPX3fBM9dU/w-d-xo.html
I hope it helps! 😁
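A minimal sketch of that random initialization - the seed and the number of inputs are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # the seed is only for reproducibility

n_inputs = 3                     # match the number of input features
weights = rng.random(n_inputs)   # uniform random values in [0, 1)
bias = rng.random()

print(weights.shape)  # (3,)
```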
Hey! Just wanted to ask a quick question. Do you think you could make a video about making spectrograms from data sets? I'm going into SETI and Astrobiology work and I will be looking at radio signals and other frequencies and turning the data into a spectrogram so you can see the data more clearly. If not its alright, thanks love the videos!
This definitely sounds super interesting! I've been fascinated with audio processing for a while, but I must admit - I don't believe I have enough knowledge at this point in time to properly explain amplitude and frequencies 😅
I am definitely planning to dive in there in the future, but I must learn it first before I can move on to teaching... So far, I've briefly worked with sound in P5.js, where I did create spectrograms for a University thing, but I don't believe it's advanced enough to be helpful in your situation - definitely let me know if you want to have a peek, though, I can always upload it to my Github 😉
@@PythonSimplified oh I see hahaha, that's alright!! It's definitely a very specific topic. I appreciate the feedback and response, and I would love to check it out on Github :)
The video is very good and also well explained. Congratulations.
I just came from the swimming pool and realized that I missed the premiere :/. However, congrats on 40k, Queen!!! You deserve much more. Keep making cool videos like this, and thank you for everything
Btw it's 35°C here, so hot I'm melting xD.
No worries, I'm actually glad you were having fun instead of learning math and statistics!! 🤣🤣🤣
And thank you so much, I'm still shocked it's not just me alone watching this hahahaha
Hey Mariya, great video but I was wondering if the formula is applicable for multi-class classification also.
So awesome 👌 thank you
could you make a chrome extension with python?
Hi Carlos, I'm not sure... I've never made any Chrome extensions before so I'm not the best person to ask 🙃
I would imagine this is usually done with HTML/CSS/Javascript instead of Python... 🤔 but I'm not really sure...
Awww! Why didn't you add this video over a year ago?! :D I was writing my master's thesis on image recognition (looking for a part number on a sticker). Neural networks were one of the tested solutions (I also tested classic match-template and SIFT descriptors)! Maybe future videos about Convolutional Neural Networks, Intersection over Union? btw. awesome videos!!!
This is awesome, Adi!!! What was your conclusion? which is better?😀
Unfortunately I couldn't do it over a year ago as I was struggling with understanding it myself! hahaha 😂
It took me 2 years of online courses, dozens of articles, and 2 huge notebooks full of summarized information to get it right! It seems that people are intentionally trying to make Machine Learning sound more complex than it actually is, and it's pure evil!
We will definitely get to object detection eventually, but CNNs will probably come first as PyTorch has some really nice tools for that! 😉 I actually think you're gonna like my "What the Flower?" project that classifies images of flowers (it was my very first project with VGG, so hopefully it's not too bad hahaha):
github.com/MariyaSha/FlowerImageClassifier_GUI
@@PythonSimplified Match Template works only in strictly defined cases: no resistance to scale effects, rotation, or perspective. Additionally, low computing efficiency.
SIFT descriptors - faster to implement and to operate, and resilient to scale effects, rotation, and changes of perspective. Unfortunately, in this case there was a problem with the stickers - they were gray with black letters and of different sizes, so not every sticker had the right amount of characteristic points (key points).
Finally - CNN. Convolution is immune to all the negative effects - the problem was the training and test set of photos - I had to prepare everything myself. I made 1000 photos, created all labels using PixelAnnotationTool ( github.com/abreheret/PixelAnnotationTool ), and I dynamically changed the background, orientation, and more during training to create tons of photos (I used backgrounds from this project: haptics.seas.upenn.edu/index.php/Research/ThePennHapticTextureToolkit ). In the end everything worked out, and the algorithm worked after many hours of training/testing and adjusting the parameters :P
I used Tensorflow - very nice tutorials (for beginners and advanced) on the official site - take a look. I've read tons of articles and books :D Your project is very nice! I don't know about flowers, but my GF loves them - maybe now I'll be able to shine for once! :P
Great lesson. I have some questions: is this kind of "machine learning" used for applications with data entries? Are you thinking of doing some more basic uses of Loss/Cross Entropy?
And the last one: what IDE are you using to code? Thanks!
Hi Matias! 😀
I'm using Wayscript for my AI videos, it's a cloud IDE built by really nice and creative folks, you can check it out over here:
wayscript.com/
I'm using their premium features, even though you can always create a free account. (I've filmed some videos for their channel not too long ago so they've upgraded me in the process 😉)
As to Cross Entropy Loss - it is commonly used in statistics and data science. It belongs to the Probability branch of mathematics (for example: if I have 2 red apples and 2 green apples and I put all these apples in a dark box and randomly select an apple - what is the probability/chance that this apple is green?)
In terms of our perceptron example: when we initially chose a set of weights, we didn't really know whether those weights would be good for our model or not. We then looked at all our data entries, one at a time, and asked - "when we combine this data entry with our set of weights, do we get the results we wanted? or do we get inaccurate results?"
So if we know that a certain input represents a "chicken", but the weights of our model adjust this input and result in an output of "goat" - this means we have selected the wrong weights. If we selected the wrong weights, we need to find a better set of weights that will properly identify our chicken.
The Cross Entropy loss function helps us understand whether we need to modify the weights by a lot (when the error term is closer to 1) or by a little (when the error term is closer to 0).
Then, we take this error term and use it to update the weights in such a way that our targets (the output we want to get) are a 100% match with all our predictions (the output we actually get).
In the next episode I'll show you how to do it with gradient descent, but hopefully this explanation helped a bit in the meanwhile😁 Let me know! 😉
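Here's a tiny sketch of the idea above in plain Python (the prediction/target pairs are made up for illustration, not taken from the video):

```python
import math

def binary_cross_entropy(prediction, target):
    # target is 1 or 0; prediction is a probability between 0 and 1
    return -(target * math.log(prediction)
             + (1 - target) * math.log(1 - prediction))

# hypothetical (prediction, target) pairs
data = [(0.9, 1), (0.2, 0), (0.6, 1)]

# confident and correct -> loss near 0; hesitant or wrong -> bigger loss
for prediction, target in data:
    print(round(binary_cross_entropy(prediction, target), 4))

# the average loss over the whole dataset
avg_loss = sum(binary_cross_entropy(p, t) for p, t in data) / len(data)
print(round(avg_loss, 4))
```

Notice how the third pair (a hesitant 0.6 for a true chicken) contributes the biggest loss, so it's the one pushing hardest for the weights to change.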
@@PythonSimplified Really, really, thanks for your explanation. It's a huge help for my self-study of Python. Combining math with code is amazing... it would be great if, at the end of this video series, we could build some real Machine Learning application with Python. My best wishes to you!
Very helpful!
Hi, thank you for a great explanation ... why do we need log functions here? Thank you and greetings from Germany ... I am a belorussian source by the way :) Only one un-armored SUV at my doorstep ... oh no ... its my wife ... but she demands clarifications too :))))
for some reason when I run your version I get a domain error? but for some reason this works:
import math

# input_data is assumed to be a list of (weighted_sum, target) pairs
def cross_entropy_loss(w_sum, target):
    return -(target * math.log(w_sum) + (1 - target) * math.log(1 - w_sum))

loss = 0
for w_sum, target in input_data:
    print('cost={}'.format(cross_entropy_loss(w_sum, target)))
    loss += cross_entropy_loss(w_sum, target)
print(loss / len(input_data))
Same for me too!
😁👍 congrats on 40k subs, beautiful!
Thank you so much Yogesh!! 😁
I'm still processing that 40,000 lovely people clicked on that "subscribe" button!! holy smokes!!! 😍😍😍
@@PythonSimplified do you have Instagram or twitter account, i would love to follow you.
Please make more videos on Kivy!
thanks a lot!! nice video!!
Hey, I have a doubt: how did we get the threshold 0.5?
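The thread doesn't answer this, but one common intuition (my sketch, not from the video): with a sigmoid activation the output always lands between 0 and 1, and 0.5 is the exact midpoint, reached when the weighted sum is 0. Anything above the midpoint leans toward one class, anything below leans toward the other:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# sigmoid squeezes any weighted sum into the range (0, 1),
# and a weighted sum of exactly 0 lands right on the midpoint
print(sigmoid(0))  # 0.5

# so 0.5 splits the output range evenly between the two classes
w_sum = 2.3  # a made-up weighted sum
label = 1 if sigmoid(w_sum) >= 0.5 else 0
print(label)  # 1
```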
TBH you should also start modelling channel 😂😁
Enough less-intelligent women are already doing that. I'd like to learn some Russian though...
hahaha thank you! you can add me on Instagram for that 🤣🤣
instagram.com/mariyasha888
@@TimonNetherlands Hahaha I wish my Russian was good enough to teach other people!!
I've left Crimea when I was 6 years old, and even though I've been speaking Russian with my family ever since - my vocabulary is stuck in kindergarten 🤣🤣🤣
Plus my accent is horrific, I need some lessons myself! hahaha 😅
i told her to do bikini python and she said even her bf suggested the same thing 😂
@@johnames6430 haha for some reason I think my wife won't believe me if I say I'm watching a Python video when the teacher is wearing a bikini
yours is cheongsam?
Good job...
Thank you so much^^
Hey! have you considered doing a tutorial in django? Its pretty simple and a really powerful framework.
Hi Aaron! 😀
Yes, Absolutely! Django is on my near future "to do" list (including SQLite stuff), I'm just hoping to cover Flask first as it's a bit more beginner-friendly.
I'll post an update vlog in a few days, where I'll go over the upcoming tutorial and when to expect them 😉
@@PythonSimplified That makes a lot of sense...😀
Just in time
Why doesn't the cross entropy loss function use the y-hat value?
But if w_sum is 0 or 1, won't there be a problem computing the log?
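Good catch - exactly right: math.log(0) raises a "math domain error", and log(1 - w_sum) blows up when w_sum is 1. A common workaround (this clipping trick is my addition, it's not shown in the video) is to clamp the prediction slightly away from 0 and 1 before taking the log:

```python
import math

EPSILON = 1e-7  # an arbitrary small constant; any tiny positive value works

def safe_cross_entropy(prediction, target):
    # clamp the prediction into [EPSILON, 1 - EPSILON] so that
    # math.log never receives 0 and never raises a domain error
    prediction = min(max(prediction, EPSILON), 1 - EPSILON)
    return -(target * math.log(prediction)
             + (1 - target) * math.log(1 - prediction))

# both of these would crash with a plain math.log - now they just
# return a large (but finite) loss
print(safe_cross_entropy(0.0, 1))
print(safe_cross_entropy(1.0, 0))
```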
..and it's _binary_ cross entropy loss, because there are only two options True/False?
thanks for the video! thank you for not explaining cross entropy loss talking about the weather forecast! :O
too bad people tend to talk more about your nail polish than about the quality of your video :)
I heard the entropy word in Thermodynamics in Chemistry and Physics
I'm not surprised! 😊
One of the definitions of entropy is disorder, or a lack of predictability, and it's a common term across many scientific branches.
In the case of Cross Entropy, it refers to the target not matching the prediction, so our outcome remains unpredictable 🤓
@@PythonSimplified Entropy is basically randomness of a system
Nice and succinct.
It was perfect
Oh noooo.. I missed today.
No worries Siam! you can always watch it later! 😉
There's actually gonna be a new update + frequently asked question video coming next week, where there's a nice surprise waiting for you! 🎁
@@PythonSimplified ah for me? Thank you so much. That's so nice of you.
The best, simple and easy ❤, but only one problem: you're too beautiful, and sometimes it's hard to focus on tutorial 😂
Like the Avengers line from Thor about Captain Marvel - "the girl's got power" - she explained everything in a nice and easy way
Nice.
This code is as simple as it is useless
I thought beauty and coding were mutually exclusive
But Teacher Sha, what kind of hat is the chicken wearing?
I'm Arabic 🥰🥰 , You are beautiful 😘
I love you, I am from India
Binary Cross Entropy*
what? 😅
hahaha anything specific I can help with? 😂
@@PythonSimplified can I do that on "Scratch" ? 😅
Nothing special about this comment, I've just gotten used to writing you a comment every day.. you can skip 😜
hahahah I'm always happy to read your comments M Ibrahim! Whether they're "special" or not!!! 😃
@@PythonSimplified wow, this implies there are some "special" comments 🙃
Can't be happier 🤣🤣
I don't get it at all. If we already have data that makes it 1 or 0, why do we need to calculate it yet again? This should be included to get 1 or 0 like always LOL, and
chicken=0.9 confirmed it's a chicken
chicken=0.5 is it a chicken?
chicken=0.2 still might be a chicken.
Can you do lottery prediction from the last 10 draws, or all of them? lol. It's just numbers, it's easy xD. Well, you can't - you're still doing youtube videos. No successful prediction
❤️
lovely coding with lovely girl
Whats your Instagram ID ?
Hi Suraj! 😁
You can find me at: instagram.com/mariyasha888
(I don't post much about Machine Learning or AI there though 😅 hahaha)
i wanna marry you.
good job