An epoch is normally a measure of time; in deep learning, it's a measure of the number of times the entire dataset is passed through the model. You can (and will) overfit the data if you loop through the dataset too many times. The goal is to stop training when your model has learned to generalize without memorizing. How will you recognize that point? Generally, you graph the loss on the training data and on test/unseen data. While a model is "learning", these two curves will be roughly parallel. When the training loss curve continues to decline but the test/unseen loss curve no longer follows it, you have reached the point of overtraining. I hope this makes sense.
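To make this concrete, here is a minimal sketch of plotting both curves with Keras and matplotlib. The dataset, layer sizes, and epoch count are invented purely for illustration; with random data like this the curves won't be as clean as with a real dataset.

```python
import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras

# Toy stand-in for a real dataset: 1000 samples, 20 features, binary labels
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Hold out 20% of the data as the "unseen" set and train for many epochs
history = model.fit(X, y, epochs=100, validation_split=0.2, verbose=0)

# The point where val_loss stops following the training loss downward
# is where overtraining begins
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation (unseen) loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```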
You lost me at "If I give lesser epochs, the line will split the curve"... :P
I think what he's trying to say is:
one epoch = one forward pass and one backward pass of ALL the training examples.
Now that we know what an epoch is, how many epochs do you want to run through the neural network so it can train its weights properly?
ANSWER: If you run ALL the training data through several times, you might end up with overtrained weights. You want to find a balance between "fewer epochs" and "a lot of epochs".
Example:
if "fewer epochs" is five runs (epochs), and "a lot of epochs" is 100 runs, then maybe the perfect balance for categorizing new unseen images is 40 runs. More than 40 runs (epochs) would have given you overfitting, and fewer would have given you underfitting.
How do you find the optimal number of epochs?
Answer: I think it is just trial and error. Try different numbers of epochs, then evaluate how well the model does on new unseen data.
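A rough sketch of that trial-and-error loop, using scikit-learn's MLPClassifier just for illustration (for its stochastic solvers, max_iter is the number of epochs; the data here is random, so real results will differ):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for a real dataset
X = np.random.rand(500, 10)
y = np.random.randint(0, 2, size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# Try a few epoch counts and see which generalizes best to unseen data
for n_epochs in [5, 20, 40, 100]:
    clf = MLPClassifier(max_iter=n_epochs)
    clf.fit(X_train, y_train)  # may warn it hasn't converged; that's expected
    print(n_epochs, "epochs -> accuracy on unseen data:",
          clf.score(X_test, y_test))
```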
Instead of just trial and error, it is possible to plot the validation loss and training loss. Training loss will almost always go down with more epochs, so we have to keep a lookout for the validation loss. As long as it is still decreasing with epochs, we are good to keep training.
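If you don't want to eyeball the plot yourself, Keras can watch the validation loss for you with an EarlyStopping callback. A sketch, assuming a compiled model and X, y arrays like in the earlier example:

```python
from tensorflow import keras

# Stop once validation loss has not improved for 5 consecutive epochs,
# then roll the model back to its best weights
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=5,
    restore_best_weights=True,
)

# "epochs" is now just an upper bound; training halts when val_loss stalls
model.fit(X, y, epochs=200, validation_split=0.2,
          callbacks=[early_stop], verbose=0)
```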
So an epoch is the number of times you train your neural network with the same data?
In BGD, by "same data" do you mean the examples in one *BATCH SIZE*, not the whole dataset?
Most practical & sensible suggestion on neural nets!
Thank you brother... I am studying this during the pandemic because my uni has decided to take offline exams :'( NEVER TAKE ADMISSION IN GTU, NEVER
Same here.. 😂 GTU, GTU..
same here :(
Sure Nidhi 😊
Soo basic and simple thanks bro
Still doesn't answer what an epoch is!
An epoch relates to how a whole lot of data gets fed to the neural network in portions. Example..
while eating one full plate of rice, each spoonful is a batch, and finishing the whole plate once is one epoch. You can only take a little at a time, else you will vomit. If you take too little at a time, it will take a while to finish and you will be hungry for long.
Did I answer you well? Let me know, I will try a different example if you still didn't understand.
@@RoboticsExplained Hi, I liked your broad-view approach but I was looking for a mathematical explanation.
@@RoboticsExplained thanks sir..best example
@@ateeqrehman7873 You are welcome ! :-)
wasted 1.26 minutes
If we divide a dataset of 2000 examples into batches of 500, it will take 4 iterations to complete 1 epoch.
(A bit of thinking and you'll understand everything.)
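In code form, that arithmetic looks something like this (plain Python; the training step itself is left as a comment since it depends on your framework):

```python
dataset_size = 2000
batch_size = 500
num_epochs = 3

iterations_per_epoch = dataset_size // batch_size  # 2000 / 500 = 4
data = list(range(dataset_size))  # stand-in for real training examples

for epoch in range(num_epochs):
    for i in range(iterations_per_epoch):
        # one iteration = one forward + backward pass on this batch
        batch = data[i * batch_size:(i + 1) * batch_size]
    print(f"epoch {epoch + 1} complete after {iterations_per_epoch} iterations")
```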
@@ahaanabrar6684 Thanks for the example
Thanks
Thanks bro💞
please provide them a big cage
Thank you for making it simple.
Thank you, man
thank you so much, but please do something about your handwriting
What a start that was.. the birds in the cage..😒
what explanation is this?
Like sigmoid
What was that, bro................ you're too much, man.............
C L E A N
OK, but why are you enslaving birds in a cage, and such a small one at that? Evil human.
You are judging too quickly. :-) They were not well, so I brought them to my place. Since they got better, I set them free again. The good news is they sometimes come back, chirp a little, and go back.
@@RoboticsExplained OK Good human then. sorry.
@@Acumen928 🤣🤣🤣
Still doesn't answer what an epoch is!
haha true
An epoch is one pass (forward and backward) of the entire dataset through the neural network architecture.
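And for anyone who wanted something more concrete than analogies, here is what that single pass looks like in PyTorch, as a full-batch sketch (the data and model are toy placeholders):

```python
import torch
from torch import nn

# Toy data and model; shapes are made up for illustration
X = torch.randn(100, 4)
y = torch.randn(100, 1)
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# One epoch: every example goes through forward and backward exactly once
predictions = model(X)          # forward pass over the entire dataset
loss = loss_fn(predictions, y)
optimizer.zero_grad()
loss.backward()                 # backward pass computes the gradients
optimizer.step()                # weights are updated once per epoch here
```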