Forward Propagation and Backward Propagation | Neural Networks | How to train Neural Networks

  • Published on Sep 27, 2024

Comments • 98

  • @CodingLane
    @CodingLane  3 years ago +10

    If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.

    • @arpit743
      @arpit743 2 years ago

      Excellent video! Bro, why do we have multiple neurons in every hidden layer? Is it from the point of view of introducing non-linearity?

    • @CodingLane
      @CodingLane  2 years ago

      @@arpit743 Yes, but not entirely. Multiple neurons allow us to capture complicated patterns. A single neuron won’t be able to capture complicated patterns from the dataset.

    • @arpit743
      @arpit743 2 years ago

      @@CodingLane Thanks a lot! But why is it that they allow for complicated boundaries?

    • @Sigma_Hub_01
      @Sigma_Hub_01 2 years ago +1

      @@arpit743 More refined outputs let you see the limitations of your network's boundaries, so you can pinpoint the exact location and correct it as per your needs. It doesn't so much allow for complicated boundaries as let you SEE your complicated boundaries, and hence work through them.

    • @debanjannanda2081
      @debanjannanda2081 9 days ago

      Sir,
      there is a mistake at timestamp 0:41: a2[1] is wrong; the activation you wrote second should be A2[2], because the values of a you multiply with the weights in the weighted sum come from the first hidden layer, and they are used to find the second hidden layer's value A2[2], not a2[1].
      But you really teach great.
      Thank you...
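
A quick illustration of the "multiple neurons" question discussed above: a single neuron can only draw one linear boundary, so it cannot fit XOR, but two hidden neurons feeding one output can. A minimal NumPy sketch with hand-picked (not trained) weights, for illustration only:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

# XOR inputs and labels: not separable by a single linear boundary (one neuron).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# Hand-picked weights for a 2-neuron hidden layer (illustrative, not learned):
# h1 roughly computes "x1 OR x2", h2 computes "x1 AND x2"; output = h1 - 2*h2.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
W2 = np.array([1.0, -2.0])

H = relu(X @ W1 + b1)            # hidden activations, shape (4, 2)
out = H @ W2                     # network output for each input
pred = (out > 0.5).astype(int)
print(pred.tolist())             # → [0, 1, 1, 0], matching XOR
```

With only one neuron (one row of weights plus a threshold) no choice of values reproduces XOR; the extra neurons are what let the network combine several simple boundaries into a complicated one.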

  • @PrithaMajumder
    @PrithaMajumder 2 months ago +2

    Thanks a lot for This Amazing Introductory Lecture 😊
    Lecture - 2 Completed from This Neural Network Playlist

  • @saumyaagrawal7781
    @saumyaagrawal7781 27 days ago

    This was more helpful than my lectures!

  • @sushantregmi2126
    @sushantregmi2126 a year ago +1

    so glad I found this channel!!

    • @CodingLane
      @CodingLane  a year ago

      Thank you! I appreciate your support 😇

  • @nishigandhasatav3559
    @nishigandhasatav3559 a year ago +1

    Absolutely loved the way you explain. So easy to understand. Thank you

  • @social.2184
    @social.2184 5 months ago

    Very informative video. Explained all the terms in a simple manner. Thanks a lot

  • @ahmeterdonmez9195
    @ahmeterdonmez9195 14 days ago +1

    At 0:58, in a1[1] = activation(....), the last sum should be W13[1]*a3[0], not W13[1]*a3[1]

  • @kunalbahirat7795
    @kunalbahirat7795 2 years ago +1

    best video on youtube for this topic

    • @CodingLane
      @CodingLane  a year ago

      Thank you so much. Much appreciate your comment! 🙂

  • @AryanSingh-eq2jv
    @AryanSingh-eq2jv a year ago

    best explanation, best playlists
    I don't usually interact with the algorithm by giving likes and dropping comments, but you beat me into submission with this. Hopefully I understand the rest of it too lol.

  • @petchiammala1430
    @petchiammala1430 2 years ago +1

    Super, sir. I have learned a lot from this, and also the way of calculation. It's very useful for our studies. Thank you, sir.

  • @rawanmohammed5552
    @rawanmohammed5552 3 years ago +1

    You are great. It will be very good if you continue.

    • @CodingLane
      @CodingLane  3 years ago

      Thank you for your support! I will surely continue making more videos.

  • @nooreldali7432
    @nooreldali7432 a year ago

    Best explanation I've seen so far

  • @agrimgupta3221
    @agrimgupta3221 3 years ago +2

    Your videos on neural networks are really good. Can you please also upload videos on generalized neural networks? That would really be helpful. P.S. Keep up the good work!!!

    • @CodingLane
      @CodingLane  3 years ago +1

      Thank you so much for your feedback. I will surely consider making videos on generalized neural networks.

  • @bincybincy1
    @bincybincy1 4 months ago

    This is so well explained.. thank you

  • @johnalvinm
    @johnalvinm a year ago

    Very helpful and to the point and correct!

  • @venompubgmobile7218
    @venompubgmobile7218 3 years ago +6

    I'm a bit confused by the exponent notations, since some of them don't correspond to the others

  • @chandanpramanik4399
    @chandanpramanik4399 10 months ago

    Nicely explained. Keep up the good job!

  • @kewtomrao
    @kewtomrao 3 years ago +2

    Isn't the equation Z = W.X + B supposed to be Z = transpose(W)*X + B? Hence the weight matrix you have given is wrong, right?

    • @CodingLane
      @CodingLane  3 years ago +2

      Hi... I have taken the shape of W as (n_h, n_x), so the equation is Z = W.X + B. But if you take W as (n_x, n_h), then the equation is Z = transpose(W).X + B.
      Both represent the same thing. Hope it helps you.

    • @kewtomrao
      @kewtomrao 3 years ago

      @@CodingLane Thanks for the quick clarification. Makes sense now. Keep up the great work!!
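
The shape convention discussed above can be checked numerically. A small NumPy sketch (sizes made up for illustration) showing that W of shape (n_h, n_x) with Z = W.X + B, and the transposed storage with Z = transpose(W).X + B, give the same result:

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h, m = 3, 4, 5                  # input size, hidden size, batch size
X = rng.standard_normal((n_x, m))
B = rng.standard_normal((n_h, 1))

# Convention used in the video: W has shape (n_h, n_x), so Z = W @ X + B.
W = rng.standard_normal((n_h, n_x))
Z1 = W @ X + B

# Alternate convention: store the weights as (n_x, n_h) and transpose on use.
W_alt = W.T                            # shape (n_x, n_h)
Z2 = W_alt.T @ X + B

assert Z1.shape == (n_h, m)
assert np.allclose(Z1, Z2)             # both conventions produce the same Z
```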

  • @omarsheetan4417
    @omarsheetan4417 3 years ago +1

    Great video, and great explanation thanks dude!

  • @Swarnajit_Saha
    @Swarnajit_Saha a year ago

    Your videos are very helpful. It would be great if you sorted the videos. Thank you😇😇😇

  • @waleedrafi1509
    @waleedrafi1509 3 years ago +1

    great video,
    Please also make a video on SVM as soon as possible

    • @CodingLane
      @CodingLane  3 years ago

      Okay, sure! Thank you so much for your suggestion. I have been asked a lot to make a video on SVM, so I will try to make it just after finishing this Neural Network playlist.

  • @michaelzheng951
    @michaelzheng951 a year ago

    Fantastic explanation. Thank you

  • @muhammadrabbanizainalabidi2409
    @muhammadrabbanizainalabidi2409 2 years ago +1

    Good Explanation !!

  • @ishayatfardin7
    @ishayatfardin7 a year ago

    Brother, your explanation was great, but there are some mistakes I have pointed out.

  • @iZeyad95
    @iZeyad95 2 years ago +1

    Amazing work, keep it going :)

  • @premkumarsr4021
    @premkumarsr4021 6 months ago

    Super Bro❤❤❤❤

  • @testyourluck3914
    @testyourluck3914 a year ago

    Are B1 and B2 initialized randomly too?
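
On the question above: what matters is that the weights are random, to break symmetry between neurons; B1 and B2 are commonly initialized to zero, though random biases also work. A sketch of one common initialization scheme (the layer sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h, n_y = 3, 4, 1          # made-up layer sizes

# Weights: small random values, so hidden neurons start out different
# from each other (symmetry breaking).
W1 = rng.standard_normal((n_h, n_x)) * 0.01
W2 = rng.standard_normal((n_y, n_h)) * 0.01

# Biases: zeros are fine, because the random weights already break symmetry.
B1 = np.zeros((n_h, 1))
B2 = np.zeros((n_y, 1))
```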

  • @vipingautam9501
    @vipingautam9501 2 years ago +1

    Small doubt: what is f(z1)? I am assuming these are just different types of activation functions, where the input is just the weights of the current layer times the input from the previous layer... is that correct?

    • @CodingLane
      @CodingLane  2 years ago

      Yes, correct… but do check the equations carefully. They include a bias term as well.

    • @vipingautam9501
      @vipingautam9501 2 years ago

      @@CodingLane Thanks for your prompt response.
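
To make the exchange above concrete: f(z1) is an activation function applied elementwise to the weighted sum plus bias. A minimal forward-pass sketch (sigmoid chosen for illustration; the layer sizes are made up):

```python
import numpy as np

def sigmoid(z):
    # one common choice of activation f; tanh or ReLU would work the same way
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_x, n_h, n_y = 3, 4, 1
a0 = rng.standard_normal((n_x, 1))       # input column vector

W1 = rng.standard_normal((n_h, n_x))
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h))
b2 = np.zeros((n_y, 1))

z1 = W1 @ a0 + b1    # weighted sum of previous-layer outputs PLUS the bias
a1 = sigmoid(z1)     # a1 = f(z1)
z2 = W2 @ a1 + b2
a2 = sigmoid(z2)     # final output
```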

  • @gautamthulasiraman18
    @gautamthulasiraman18 2 years ago +1

    Sir, it's W¹¹[¹] * a⁰[1], right? You've done it as W¹¹[¹] * a¹[1] in the matrix multiplication. Can you just verify whether I'm wrong?

    • @CodingLane
      @CodingLane  2 years ago

      Yes… there is a typo error

  • @taranerafati9730
    @taranerafati9730 a year ago

    great video

  • @xinli3642
    @xinli3642 2 years ago +1

    Can A* actually be Z*, e.g. A1 = Z1?

    • @CodingLane
      @CodingLane  a year ago

      No, we need to apply a non-linear activation function. So A1 must be = some_non_linear_function(Z1)
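
The point in the reply above can be demonstrated: if A1 = Z1 (no nonlinearity), the two layers collapse into a single linear layer, so the extra layer adds nothing. A small NumPy sketch (shapes made up):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 10))                 # 10 examples, 3 features each
W1 = rng.standard_normal((4, 3)); b1 = rng.standard_normal((4, 1))
W2 = rng.standard_normal((2, 4)); b2 = rng.standard_normal((2, 1))

# Without an activation (A1 = Z1), two layers equal ONE linear layer:
A2_linear = W2 @ (W1 @ X + b1) + b2
W_single = W2 @ W1
b_single = W2 @ b1 + b2
assert np.allclose(A2_linear, W_single @ X + b_single)

# With a nonlinearity the collapse no longer happens, which is what gives
# depth its extra expressive power.
A2_nonlin = W2 @ np.tanh(W1 @ X + b1) + b2
assert not np.allclose(A2_nonlin, W_single @ X + b_single)
```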

  • @fadhliana
    @fadhliana 3 years ago +1

    Hi, how do you calculate the cost?

    • @CodingLane
      @CodingLane  3 years ago

      You will get all the information in the upcoming videos that I have already uploaded in this series.
      If you still have questions, you can email me at: codeboosterjp@gmail.com

  • @gitasaheru2386
    @gitasaheru2386 a year ago

    Please share the code for the backpropagation algorithm

  • @bezelyesevenordek
    @bezelyesevenordek 10 months ago

    let bro cook

  • @mythillian
    @mythillian 6 months ago

    5:04

  • @UmerMehmood-n3f
    @UmerMehmood-n3f 27 days ago

    Extremely confusing tutorial, and there's a mistake.
    This should be:
    A[3]⁰ not A[3]¹

  • @kheireddine7889
    @kheireddine7889 2 years ago +7

    This video should be titled "Explain Forward and Backward Propagation to Me Like I'm Five". Thanks man, you saved me a lot of time.

    • @CodingLane
      @CodingLane  2 years ago +1

      One of the Best Comments I have seen. Thank you so much! And thanks for the title idea 😂😄

  • @coverquick490
    @coverquick490 2 years ago +7

    I've always felt as if I was on the cusp of understanding neural nets but this video brought me past the hump and explained it perfectly! Thank you so much!

    • @CodingLane
      @CodingLane  a year ago

      I am really elated hearing this. Glad it helped you out. Thank you so much for your appreciation. 🙂

  • @alpstech
    @alpstech 2 months ago +1

    You drop something ... 👑

    • @CodingLane
      @CodingLane  2 months ago

      haha.. what is it? Thanks btw

  • @whoooare20
    @whoooare20 2 years ago +3

    You explained it in a very clear and easy way. Thank you, this is so helpful!

  • @priyanshshankhdhar1910
    @priyanshshankhdhar1910 10 months ago

    wait you haven't explained backpropagation at all

  • @ayushipatel1315
    @ayushipatel1315 a year ago +1

    Literally best. Crisp and clear!! Thank you

  • @nothing5987
    @nothing5987 3 years ago +1

    Hi, can you add a caption option?

    • @CodingLane
      @CodingLane  3 years ago

      Hi.. somehow captions were not generated for this video. All my other videos do have captions. I will change the settings to enable captions on this video as well. Thanks for bringing this to my attention.

  • @gauravshinde8767
    @gauravshinde8767 5 months ago

    Lord Jay Patel

  • @abdallahlakkis449
    @abdallahlakkis449 a year ago

    Why no subtitles?

  • @faisaljan3884
    @faisaljan3884 2 years ago +1

    What is this B1?

  • @ibrahimahmethan586
    @ibrahimahmethan586 2 years ago +1

    Good job. But in gradient descent, W2 and W1 must be updated simultaneously.

    • @CodingLane
      @CodingLane  2 years ago +1

      Thank you! Yes they should be updated simultaneously.
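
"Updated simultaneously" above just means: compute every gradient from the current parameters first, then apply all the updates. A schematic sketch (the grads function is a stand-in, not real backpropagation):

```python
import numpy as np

def grads(W1, W2):
    # stand-in for backpropagation; returns dJ/dW1 and dJ/dW2
    return 2 * W1, 2 * W2

W1 = np.ones((2, 2))
W2 = np.ones((1, 2))
lr = 0.1

# Simultaneous update: evaluate BOTH gradients at the current parameters...
dW1, dW2 = grads(W1, W2)
# ...then update. Overwriting W1 before computing dW2 would mix old and new values.
W1 = W1 - lr * dW1
W2 = W2 - lr * dW2
```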

  • @marcoss2ful
    @marcoss2ful a year ago

    Where does the algorithm that calculates the next W at 5:30 come from? I know it is intuitive, but does it have something to do with Euler's method? Or another one?
    Thank you so much for these incredible videos
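
On the question above: the update at 5:30 is plain gradient descent, W := W - learning_rate * dJ/dW. It is not derived from Euler's method, although it can be read as Euler's method applied to the gradient-flow ODE dW/dt = -dJ/dW. A one-dimensional sketch on J(w) = (w - 3)^2:

```python
# Gradient descent on J(w) = (w - 3)^2, whose minimum is at w = 3.
w = 0.0
lr = 0.1
for _ in range(100):
    grad = 2 * (w - 3)   # dJ/dw
    w = w - lr * grad    # same update rule used for the network's weights

print(round(w, 4))       # → 3.0 (converges to the minimizer)
```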

  • @harshwardhankurale310
    @harshwardhankurale310 2 months ago +1

    Top Class Explanation!

    • @CodingLane
      @CodingLane  2 months ago

      Glad it was helpful!

  • @kenjopac4247
    @kenjopac4247 2 years ago +1

    This was actually pretty straightforward

    • @CodingLane
      @CodingLane  2 years ago

      Glad if it helped you!

  • @farabiislam2418
    @farabiislam2418 a year ago +1

    You explain better than popular deep learning course instructors

    • @CodingLane
      @CodingLane  a year ago

      Thanks for the compliment 😇

    • @sajan2980
      @sajan2980 a year ago

      I am sure he is talking about Andrew Ng lol. His explanation in that video is too detailed and the notations are too confusing lol. But the same explanation in his Machine Learning Specialization course is much better.

  • @PrinceKumar-el7ob
    @PrinceKumar-el7ob 3 years ago +1

    Thank you sir, it was really helpful

  • @maximillian7310
    @maximillian7310 2 years ago +1

    Thanks man. The slides were amazingly put up.

    • @CodingLane
      @CodingLane  2 years ago

      Thank you so much!

  • @sabeehamehtab6954
    @sabeehamehtab6954 3 years ago +1

    Awesome, really helpful! Thank you

  • @VC-dm7jp
    @VC-dm7jp 2 years ago +1

    Such a simple and neat explanation.

  • @DAYYAN294
    @DAYYAN294 a year ago

    Excellent explanation jazakallah bro

  • @babaabba9348
    @babaabba9348 3 years ago +1

    great video as always

    • @CodingLane
      @CodingLane  3 years ago

      Thank You soo much !!!

  • @mdtufajjalhossain1246
    @mdtufajjalhossain1246 3 years ago +1

    you are really awesome. love your teaching ability

    • @CodingLane
      @CodingLane  3 years ago +1

      Thank you so much !

    • @mdtufajjalhossain1246
      @mdtufajjalhossain1246 3 years ago +1

      @@CodingLane, you are most welcome, bro. Please make an implementation of Multiclass Logistic Regression using the One-vs-All/One-vs-One method

    • @CodingLane
      @CodingLane  3 years ago

      @@mdtufajjalhossain1246 Okay! Thanks for suggesting!