Power H
United Kingdom
Joined May 4, 2020
This channel has three main focuses:
1 - Teaching you the fundamentals of machine learning and deep neural networks.
2 - Helping you understand the latest cutting edge research in the field of computer vision.
3 - Implementing code and building production ready applications using the latest open source frameworks and libraries in the field of machine learning.
The Sigmoid Function Clearly Explained
In this video we discuss the sigmoid function.
The sigmoid function plays an important role in machine learning and is one of the most widely used activation functions.
More specifically, in the context of logistic regression, the sigmoid is used to predict the outcome of binary classification problems.
#sigmoid #machinelearning #logisticregression #binaryclassification
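The behavior described above can be sketched in a few lines of Python (a minimal illustration of the standard formula, not the video's own code; the function name and test values are mine):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps any real number into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, large positive inputs approach 1,
# and sigmoid(0) is exactly 0.5 -- the usual decision threshold.
print(sigmoid(-10.0))  # ~ 0.0000454
print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # ~ 0.99995
```

Because the output always lies strictly between 0 and 1, it can be read as the predicted probability of the positive class in binary classification.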
Views: 109,436
Videos
Multilayer Perceptrons - Ep.6 (Deep Learning Fundamentals)
13K views · 4 years ago
In this sixth episode of the Deep Learning Fundamentals series, we will build on top of the previous part to showcase how Deep Neural Networks are constructed from a single basic entity called the Perceptron. The more complex our classification problem, the more Perceptrons we need to add to our Deep Neural Networks. We then introduce naming conventions used in the literature of Deep Learning. ...
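The construction described in this episode can be sketched as follows (my own minimal example, not the video's code; the weights and layer sizes are arbitrary illustrative choices): a single perceptron computes a weighted sum plus a bias followed by an activation, and a multilayer network feeds the outputs of one layer of perceptrons into the next.

```python
import math

def perceptron(inputs, weights, bias):
    """One perceptron: weighted sum of the inputs plus a bias,
    passed through a sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def tiny_mlp(x1, x2):
    """A tiny two-layer network: two hidden perceptrons feeding
    one output perceptron."""
    h1 = perceptron([x1, x2], [1.0, 1.0], -0.5)
    h2 = perceptron([x1, x2], [-1.0, -1.0], 1.5)
    return perceptron([h1, h2], [1.0, 1.0], -1.0)

print(tiny_mlp(0.0, 1.0))
```

Adding more perceptrons per layer, and more layers, is what turns this sketch into the deep architectures the series builds toward.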
Outline - Ep.0 (Deep Learning Fundamentals)
2.3K views · 4 years ago
In this series of videos, we will explore the fundamental concepts underlying Deep Learning and Deep Neural Networks. #deeplearning #deepneuralnetworks #neuralnetworks #classification #machinelearning #datascience
Deep Neural Networks - Ep.5 (Deep Learning Fundamentals)
919 views · 4 years ago
In this fifth episode of the Deep Learning Fundamentals series, we will construct, step by step, a Deep Neural Network using four important principles. These principles revolve around a process we refer to as embedded linearity. This process consists of multiple linear hypotheses (i.e. predictions made with a linear summation and an activation function) embedded into one another. The contributions f...
Non Linear Classification - Ep.4 (Deep Learning Fundamentals)
1.5K views · 4 years ago
In this fourth episode of the Deep Learning Fundamentals series, we discuss non-linear classification problems. Real-world problems that can be solved with simple linear binary classification are very limited. Non-linear classification problems can be either binary (i.e. the prediction is either 0 or 1) or multi-class (i.e. the prediction has more than two possible outcomes). We will show how regression analys...
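The classic example of a binary problem that no linear classifier can solve is XOR (my own sketch, not from the video): no straight line separates the two classes, which a small brute-force search over candidate lines can illustrate.

```python
import itertools

# XOR: points (0,0) and (1,1) are class 0; (0,1) and (1,0) are class 1.
points = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def linearly_separates(w1, w2, b):
    """True if the line w1*x + w2*y + b = 0 puts exactly the
    class-1 points on its positive side."""
    return all((w1 * x + w2 * y + b > 0) == (label == 1)
               for (x, y), label in points)

# Search a coarse grid of candidate lines: none separates XOR.
grid = [i / 2 for i in range(-8, 9)]
found = any(linearly_separates(w1, w2, b)
            for w1, w2, b in itertools.product(grid, repeat=3))
print(found)  # False: XOR is not linearly separable
```

This failure of linear models on XOR is the standard motivation for the non-linear, multi-layer networks the series goes on to build.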
Linear Binary Classification - Ep.3 (Deep Learning Fundamentals)
1.7K views · 4 years ago
In this third episode of the Deep Learning Fundamentals series, we discuss the simple case of linear binary classification. In linear binary classification, the decision boundary (i.e. the border separating the data classes) can be expressed using the equation of a straight line, a 2D plane, or a hyperplane, depending on the number of input features. Episode 1 link: th-cam.com/video/7FtQ-Ski0gE...
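In the two-feature case described here, the decision boundary is the straight line w1*x1 + w2*x2 + b = 0, and classification reduces to checking which side of that line a point falls on (a minimal sketch of the standard setup, not the video's code; the weight values are arbitrary illustrative choices):

```python
def classify(x1, x2, w1=1.0, w2=-1.0, b=0.0):
    """Linear binary classifier: class 1 if the point lies on the
    positive side of the line w1*x1 + w2*x2 + b = 0, else class 0."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# With w = (1, -1) and b = 0, the boundary is the line x1 = x2.
print(classify(2.0, 1.0))  # 1: below the line x2 = x1
print(classify(1.0, 2.0))  # 0: above the line x2 = x1
```

With three features the same expression describes a plane, and with more features a hyperplane, exactly as the description notes.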
Naming Conventions In Deep Learning - Ep. 2 (Deep Learning Fundamentals)
851 views · 4 years ago
In this second video of the Deep Learning Fundamentals series, we discuss some relevant nomenclature and talk broadly about the two main routes to solving classification problems. Namely: regression analysis and deep neural networks. Episode 1 link: th-cam.com/video/7FtQ-Ski0gE/w-d-xo.html #classification #deeplearning #neuralnetworks #regression #machinelearning #datascience
Classification in Deep Learning - Ep.1 (Deep Learning Fundamentals)
1.5K views · 4 years ago
In this first video in the Deep Learning Fundamentals series, we will introduce the most important concept underlying Deep Learning: the problem of classification. #deeplearning #classification #neuralnetworks #machinelearning #datascience
Good explanation!
cheers govna
well explained👏
this is great 🔥
thank you, great explanation.
Thank you so much, now the intuition behind the logistic regression formula for classification problems became much clearer 😀
Thank you for the clear explanation.
Thank you 😊
Best visual explanation!
Thank you
Finally someone who understands it. I can't believe how people lack ground knowledge in plotting functions, but are already studying Machine Learning. Thank you very much.
This is fantastically visualized and explained! :D
Very good
Thank you so much for this detailed and informative video!!
Very helpful!! Keep up the good work. Thanks
very useful thank you!
Thank you. Good Explanation
Unfortunately, mathematicians cannot explain mathematics to non-mathematicians 😅. If I were to explain the sigmoid function clearly, I would start with the problem itself and what needs to be solved before starting with any numbers and formulas.
Thanks!
so well presented in digestible chunks of lessons
Thank You
Clear explanation
when i coded a sigmoid function and inputted -10, the value was greater than 1
Then your code is wrong
WOW just WOW. what a perfect and clear explanation.💯
bro's pronunciation of "negative" seems kinda racist lmao😆😆😆😆😆
😂bruh u mentioned that, and now i cannot ignore it
Clearly explained indeed.
The inverse of e^x is not e^-x. That would be the reciprocal. The inverse is ln(x).
Both notations are the same, right? 🤔
Well it is the multiplicative inverse
Simply yet effectively explained. Please consider making more videos and continue. I visited your channel, that was a great initiative.
You should create more videos. Beautifully explained.
Excellent video. I am using your videos for my deep learning examination.
Wow nice explained
diminishing marginal returns indeed - let's change a few benchmarks 2023!!
hmm, the inverse of y=e^x is y=ln(x), yeah? you were just putting a negative over the x, not the inverse. But super nice video on the sigmoid func!
I think he meant reciprocal.
👏
What a wonderful video!
Waiting for new videos bro. Excellent explanation 😁🤠
so in other words, the sigmoid function should be used only WHILE training, then testing should just use a comparison operator to see if the value is more or less than 0.5?
Great explanation. Thanks
EXCELLENT explanation. Thank you!
So interesting
Keep it up guys we really appreciate your effort
wow, amazing !!
very enlightening, thanks for it
Nice porn music intro
thank you, clear, concise, and I liked the graphics
The best explanation ever
Great info Thank you!!! where is the rest?
Wow, this is a very great explanation
This is the clearest explanation ever made. The world needs more people like you, my friend. Thank you so much, you are a game changer.
This is a masterpiece