Misra, thank you so much for putting together these videos. They helped me out a lot. Seeing you walk through a simple example made a lot of things click for me vs. just seeing random notation in a textbook that I never fully understood.
Glad it was helpful!
This is awesome. Right level of detail balanced with simplification. Great work!
Thanks for the time and effort you put into these lessons and for making them free. Really appreciate it. 💐
You're very welcome Jameel!
Great detail for new learners, great job!
You have presented things in a very simple and comprehensive manner. Thanks!
You're very welcome :)
Thank you so much, ma'am. Really appreciate your efforts. You explain everything in an easy way.
Simple and Understandable, Thank you 🙏
You're welcome :)
Thanks, I understand better now. But are the slides correct at 9:11 and 12:13?
Yep
Thanks a lot! It's very easy to understand.
I'm a bit confused at 12:34: how does `(w3x1 + w4x2) + b2` turn into `(w1x1 + w2x2) + b1`?
So I first saw you on Assemble AI and loved your content. Thank you so much!
Glad you enjoyed!
For a beginner like me, learning this does feel a bit difficult. To understand it faster, I keep rewatching the video. Thank you, and stay healthy always.
Malay?
Thanks, ma'am, very useful!
Hey thanks for the great video!
What would the formula be if there were a second hidden layer?
Assuming B is the output of the second layer, W2^T the weight matrix of the second layer, b2 its bias, and so on:
B = beta(W2^T * [alpha(W1^T * X + b1)] + b2)
Is this correct? Thanks for any help!
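That formula looks right to me. For what it's worth, here's how I'd sketch it in numpy; the layer sizes and the tanh/sigmoid stand-ins for alpha and beta are just my assumptions, not anything from the video:

```python
import numpy as np

# Hypothetical shapes: 2 inputs, 3 units in layer 1, 2 units in layer 2
X = np.array([[0.5], [1.0]])           # input column vector, shape (2, 1)
W1 = np.random.randn(2, 3)             # layer-1 weights
b1 = np.zeros((3, 1))                  # layer-1 bias
W2 = np.random.randn(3, 2)             # layer-2 weights
b2 = np.zeros((2, 1))                  # layer-2 bias

alpha = np.tanh                        # stand-in for the layer-1 activation
beta = lambda z: 1 / (1 + np.exp(-z))  # stand-in for the layer-2 activation (sigmoid)

A = alpha(W1.T @ X + b1)               # first hidden layer: alpha(W1^T X + b1)
B = beta(W2.T @ A + b2)                # second hidden layer: beta(W2^T A + b2)
print(B.shape)                         # one output per layer-2 unit: (2, 1)
```

The second layer just takes the first layer's output A in place of X, exactly as your nesting suggests.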
Great ... keep it up!
:)
clear and precise thank you
Glad it was helpful!
I think you need two columns in the W-transpose matrix. They are currently presented as a single column, with w1 and w2 written in product form.
For matrix multiplication, the number of columns in the W-transpose matrix should equal the number of rows in the X matrix (2).
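For anyone wanting to verify the shape rule being discussed, here's a quick numpy sanity check; the 2x2 weight matrix is just a hypothetical example, not the one from the slide:

```python
import numpy as np

W = np.random.randn(2, 2)   # hypothetical weights: one column per hidden unit
X = np.random.randn(2, 1)   # input: 2 rows (two features), one sample

# The product W^T X is only defined when the number of columns of W^T
# equals the number of rows of X.
assert W.T.shape[1] == X.shape[0]
out = W.T @ X
print(out.shape)            # (2, 1): one value per hidden unit
```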
Disagree
Please share the slides and course notes for Lesson 2, Module 1.
Thanks for the heads up. They're there now.
ReLU: if the input is less than zero, ReLU returns 0; otherwise it returns the input unchanged.
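In code that's a one-liner; this numpy version (the function name is mine) does exactly that:

```python
import numpy as np

def relu(x):
    # Elementwise: 0 where x < 0, x unchanged elsewhere
    return np.maximum(0, x)

values = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(values))  # negatives become 0, non-negatives pass through
```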
Well?
@@bay-bicerdover well what?
This is all fine, but I have no idea what all of this means in practice. I need some sort of practical example to work through.