I Built a Neural Network from Scratch

  • Published on Jun 9, 2024
  • Don't click this: tinyurl.com/bde5k7d5
    💚 Link to Code: / greencode
    How I Learned This: nnfs.io/ (by the awesome ‪@sentdex‬ )
    ⭐ Other Social Media Links:
    🔊 Discord: / discord
    🐦 Twitter: / thegreencoding
    📸 Instagram: / greencodecodes
    🎵 Tiktok: / greencodecodes
    🔊 Music I Used in this Video: share.epidemicsound.com/7i1d0b
    Current Subs: 14,219
  • Science & Technology

Comments • 268

  • @Green-Code
    @Green-Code 21 days ago +127

    I'm not an AI expert by any means, I probably have made some mistakes. So I apologise in advance :)
    Also, I only used PyTorch to test the forward pass. Apart from that, everything else is written in pure Python (+ use of Numpy).
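    Below is a rough sketch of what "using PyTorch only to test the forward pass" could look like; it is not the video's actual code, and all names and shapes here are made up for illustration:

    ```python
    # Hypothetical check (not the video's code): compare a pure-NumPy dense layer
    # + ReLU forward pass against the same computation done in PyTorch.
    import numpy as np
    import torch

    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 784)).astype(np.float32)   # batch of 4 flattened 28x28 images
    W = rng.standard_normal((784, 16)).astype(np.float32)  # weights
    b = rng.standard_normal(16).astype(np.float32)         # biases

    numpy_out = np.maximum(0.0, x @ W + b)                 # NumPy forward pass

    torch_out = torch.relu(torch.from_numpy(x) @ torch.from_numpy(W) + torch.from_numpy(b))

    print(np.allclose(numpy_out, torch_out.numpy(), atol=1e-5))  # expect True
    ```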

    • @CrushedAsian255
      @CrushedAsian255 21 days ago +12

      "umm Numpy is a 'prebuilt framework'" ☝🤓

    • @przemysawtomala9304
      @przemysawtomala9304 21 days ago +3

      5:44 gradient descent won't ever jump like that with your implementation (without momentum), because the function is rising at that point

    • @ProgrammerRajaa
      @ProgrammerRajaa 19 days ago

      Is that code available on GitHub? If so, can you share the link please?

    • @genildademeloescrituraviva
      @genildademeloescrituraviva 15 days ago

      Building a neural network from scratch? What about IN Scratch?

    • @Daniel_Massicotte
      @Daniel_Massicotte 15 days ago +3

      The only reason I can see for this is learning the exact math involved. But your solution is useless in a real-world application, for the simple reason that using Python for the mathematical part of the neuron (weight and bias calculations) will be slow as hell, and it's almost impossible to train something for real in an acceptable amount of time, compared (in my case) with calls to TensorFlow, which itself uses the CUDA library to TRAIN the network on all my GPU's cores in parallel at their maximum speed, thousands of times faster than Python's shitty interpreted scripting pseudo-language... at least use C++ or something similar for a bit of performance...
      Python, despite the fact that I hate it, has its place in some very specific applications, like writing a fast and easy solution for something that will be executed once, like defining a neural network that will itself run on CUDA/tensor cores, but NEVER use Python for CPU-intensive tasks... it will use 10x to 100x the resources (memory and CPU load) compared to C++ compiled for the same task, or if you prefer, it will take 10x to 100x more time to execute... or, the last way of seeing it, it will take 10x to 100x more energy to do the same thing...
      And it is true that NumPy accelerates a lot of calculations, but as long as Python is used, you will never reach the maximum speed a compiled C++ application can deliver...
      All that just to make sure everyone understands that Python has a specific area where it is very useful, but never use it for all tasks; it was not created for that.
      Simple rule: ask yourself "will this code be executed many times, enough to slow down the task?" If so, better use C++, which will take longer to code but will be way faster on a CPU-demanding task, saving time in the long run (not so long!!!)
      And if you are learning Python because it is simpler than C++... let me tell you that the difference between them is not as huge as you think... and for the ones who want to make a living out of programming:
      Python will bring you jobs that pay less than 100k per year; C++ can easily get you jobs paying 250k per year...

  • @fragly
    @fragly 21 days ago +214

    Ngl using that chef hypothetical is such a neat way of explaining how a neural network functions

  • @burrdid
    @burrdid 20 days ago +327

    now do it in scratch

    • @zennihilator3724
      @zennihilator3724 18 days ago +15

      What does scratch mean? Does he have to make his own programming language too? Does he have to make his own computer? Does he have to design his own pcbs inside the computer? Does he have to put his own layer of silicon in a resin case and dope it? Does he have to generate his own electricity to power his house?

    • @KA-kf4ke
      @KA-kf4ke 17 days ago +26

      Scratch.
      'Website' Scratch.
      Y'know... the coding language for babies...

    • @陸
      @陸 16 days ago

      @@zennihilator3724 Scratch, the programming language

    • @insaanonline
      @insaanonline 15 days ago +4

      @@zennihilator3724 that means explain all the code one by one

    • @mesh_devo
      @mesh_devo 14 days ago

      @@zennihilator3724
      it has 2 meanings
      the first one is "from zero", or step by step
      e.g.: "i built my website from scratch" (means manually, from zero)
      the second is Scratch, a website and app that helps you create simple games

  • @SomethingSmellsMichy
    @SomethingSmellsMichy 21 days ago +33

    3:12 the equation states that the loss of a network that returns probabilities with values from 0 to 1 is the expected output × the negative log of the actual output. The reason this works is that -log on (0, 1) gives a higher loss as the probability approaches 0 and almost no loss as it approaches 1. Multiplying that by the expected probability makes it so that the network only adjusts the values for the outputs you want to approach 1.
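    A minimal NumPy sketch of the loss described above (the numbers are made up): multiply the one-hot expected output by the negative log of the predicted probabilities and sum, so only the target class contributes.

    ```python
    import numpy as np

    predicted = np.array([0.7, 0.2, 0.1])  # made-up network output (probabilities)
    expected  = np.array([1.0, 0.0, 0.0])  # one-hot target: class 0 is correct

    print(np.sum(expected * -np.log(predicted)))  # -log(0.7) ≈ 0.357

    # The loss shrinks as the correct class's probability approaches 1, and blows
    # up as it approaches 0:
    print(np.sum(expected * -np.log([0.99, 0.005, 0.005])))  # ≈ 0.01
    print(np.sum(expected * -np.log([0.01, 0.495, 0.495])))  # ≈ 4.6
    ```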

    • @victor3btn598
      @victor3btn598 19 days ago +1

      Binary Cross entropy

    • @Exkolix
      @Exkolix 12 days ago +1

      this is why I love nerds🤩

  • @ShabJimJets
    @ShabJimJets 21 days ago +3

    very nice mr green code, very nice. You deserve a lot more subs for how good these videos are. Can't wait to see what the future holds

  • @DK-ox7ze
    @DK-ox7ze 21 days ago +7

    It's pretty cool to implement all this from scratch. I had studied all this a few months ago but forgot most of it because I never practiced it. But this served as a refresher.

    • @NihadBadalov
      @NihadBadalov 15 days ago

      hi, where did you study it?

  • @Ari-pq4db
    @Ari-pq4db 16 days ago +2

    Subscribed, can't wait for such more informative videos ❤🔥

  • @Hangglide
    @Hangglide 8 days ago

    thanks! really cool! especially since I just learned about neural networks, and watching your video reinforced what I learned in class.

  • @chrisx2342
    @chrisx2342 8 days ago

    ur explanation is absolutely fantastic!!

  • @Podcast.Motivator
    @Podcast.Motivator 14 days ago

    Awesome bro. Waiting for more videos like this.

  • @secondlive1
    @secondlive1 21 days ago

    This is a great video, keep making them!

  • @baonguyen4278
    @baonguyen4278 14 days ago

    Great work bro! u got another subscriber !!

  • @yassinechritt8816
    @yassinechritt8816 21 days ago

    Great explanation! keep on posting great stuff 😎😎

  • @虚
    @虚 21 days ago +4

    Amazing video!

  • @glumpfi
    @glumpfi 12 days ago +1

    I went through the same adventure :D I wrote a neural net from scratch in C++ just to get a deep understanding. The backpropagation part took me a while to figure out. I just got to an accuracy of 94% with MNIST, maybe because I still didn't implement optimizers and batches. Thanks for sharing :)

  • @nesquickyt
    @nesquickyt 21 days ago

    Your videos are really good and interesting 🔥

  • @LebaneseJesus
    @LebaneseJesus 20 days ago

    Instant sub, brilliant video

  • @weslycosta348
    @weslycosta348 10 days ago

    so underrated channel man, keep goin!

  • @sakchais
    @sakchais 18 days ago

    This is incredibly fun to watch!

  • @santiagogonzalez-hc1vp
    @santiagogonzalez-hc1vp 14 days ago

    What a great vid, new sub
    You accurately summarized two weeks of classes from an ML Master's where I didn't sleep
    Great job doing that and understanding the fundamentals
    Ignore bad comments
    Keep the pace

  • @dhiraj6727
    @dhiraj6727 2 days ago

    The background music makes it look like this topic is just really really cool. Or maybe it's just because I have watched gaming videos with such bg music that similar bg music videos make me feel interested in the topic.

  • @adebayokehinde1580
    @adebayokehinde1580 16 days ago

    Making a tutorial is one thing and adding animations is 🔥Great!!

  • @suvojitsengupt
    @suvojitsengupt 8 days ago

    cool, just subscribed, hope u will continue to post this kind of stuff, loved your work...

  • @Yuzuru_Yamazaki
    @Yuzuru_Yamazaki 15 days ago +6

    "let's think of every neuron as a chef... Now , Let 'em cook 🗿" ahh explanation 😭

  • @akhildonthula6160
    @akhildonthula6160 15 days ago

    Please do more videos, i luv the way you do it 💥

  • @gabrielrock
    @gabrielrock 17 days ago

    awesome video bro, congrats! I’d like to see a video on how to go from that to a generative AI or a RAG, u know? Are u planning some video like that or do u got some reference?

  • @sqec
    @sqec 21 days ago +2

    Nice Video!

  • @TimStellar
    @TimStellar 18 days ago

    So good bro, so good!

  • @Avion_Animz
    @Avion_Animz 10 days ago

    Hey man I just discovered your channel and I love it, and I've been wondering how long you've been doing this? Programming, I mean

  • @medakshchoudhary
    @medakshchoudhary 10 days ago +1

    loved the way you explained all of this. can you pleaseeeee make a beginner guide tutorial on how to get into things like this, exactly this type of thing?
    you got a sub

  • @familytharun3924
    @familytharun3924 17 days ago

    the video is too gud bro, u have put in a lot of effort..

  • @lelomator
    @lelomator 21 days ago

    Super amazing Video!

  • @Egon3k
    @Egon3k 21 days ago

    cool video :) keep on going

  • @user-on1mz7mp3i
    @user-on1mz7mp3i 3 days ago

    Bro love the video....
    Can you go in-depth in the coding part. Like every step by step code you used.

  • @avi_9243
    @avi_9243 15 days ago

    Ayo great stuff, tysm 😋

  • @commissariomontanaro2931
    @commissariomontanaro2931 20 days ago +11

    why does my acoustic pattern recognition match your avatar poses to Code Bullet?
    Nice video tho, now do it in C to assert dominance

    • @ericdanfunk3966
      @ericdanfunk3966 19 days ago +1

      This content creator is a clone of Code Bullet's style 🥱

    • @Smurdy1
      @Smurdy1 18 days ago +3

      I think it's funny that I'm reading this while waiting for my C++ neural network to train. My code isn't even that optimized and my program only took about 5 minutes to go through the process of training a neuron 49 billion times. It's insane how much faster C++ is at machine learning (and everything) than python.

  • @jonathandyer6385
    @jonathandyer6385 43 minutes ago

    Description: "do not click on this: "
    Me: *Clicks it*
    Me: *Sees a youtube page that says: "are you sure you want to leave youtube?"*

  • @shashanklk7867
    @shashanklk7867 10 days ago

    pretty sick, bro 🔥

  • @sirosala
    @sirosala 9 days ago

    Excellent video!! Very well explained, a brilliant contribution of knowledge. Greetings from Rosario - Argentina.

  • @cen-t
    @cen-t 13 days ago

    Nice video, btw do you do live coding sessions? Like on twitch or here on yt

  • @Sumii4242
    @Sumii4242 21 days ago

    Does someone know what the music at 6:35 is?
    ah and also, this is seriously such a cool video, great job mate!

  • @CastyInit
    @CastyInit 1 day ago +1

    relu is just max(0, x), or just a fancy way of saying "if the output of a neuron is negative, make it 0"
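    In Python terms that's just the built-in max (or np.maximum for whole layers); a quick sketch:

    ```python
    import numpy as np

    def relu_scalar(x):
        return max(0, x)          # "if the output is negative, make it 0"

    def relu(x):
        return np.maximum(0, x)   # elementwise version for a whole layer

    print(relu_scalar(-3), relu_scalar(2))    # 0 2
    print(relu(np.array([-1.5, 0.0, 2.3])))   # [0.  0.  2.3]
    ```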

  • @rnts08
    @rnts08 21 days ago

    Your code bullet is better than codebullet. You are what everyone hoped that he would be with his enigma video. Keep it up! 😂

  • @HarshPatel5940
    @HarshPatel5940 4 days ago

    Good video budd

  • @veifingtps4697
    @veifingtps4697 20 days ago

    I remember my first time doing this, I went crazy because I didn't want to do the math, but it wasn't so bad lol. crazy good video and visualization

  • @kamaljeetsahoo4752
    @kamaljeetsahoo4752 10 days ago

    Quality of content 🔥💯

  • @johanlarsson9805
    @johanlarsson9805 8 days ago

    I did the same thing 15 years ago when I learned about ANNs... but in C++ and with no NumPy equivalent.
    Also, since I studied molecular biology at the time and was deep into evolution, of course I used a genetic algorithm instead of backpropagation. It all worked well in the end.

  • @Bigleyp
    @Bigleyp 11 days ago +1

    Now do it but where there are regions, a neuron has multiple connections, and it is either on or off based on how many recent signals go into it.

  • @hariharan-wx9oq
    @hariharan-wx9oq 12 days ago

    just amazed!!!

  • @codeinvasion
    @codeinvasion 21 days ago +2

    Here before you get viral 🔥💯

  • @kshitij7791
    @kshitij7791 21 days ago +2

    Hey @Green-Code do you have any github repo for your work?

  • @gauravmandall
    @gauravmandall 13 days ago

    Amazing video

  • @hemanthkumarvaddipalli5
    @hemanthkumarvaddipalli5 9 days ago +1

    bro could u please give a complete roadmap to learn this stuff for beginners 🥺, even from learning python

  • @AlexJNef
    @AlexJNef 13 days ago

    trying to explain machine learning without too much maths is like trying to explain a steak without knowing what meat is. amazing job dude, incredibly simple explanation

  • @dhruva167
    @dhruva167 15 days ago

    Love the video

  • @EchoPrograms
    @EchoPrograms 21 days ago +2

    I just did the same thing a few days ago lol! (Also from scratch). I did it in JavaScript so I didn't even have numpy. My matrix class is like 150 lines long lol

  • @SlimeFroster
    @SlimeFroster 13 days ago +1

    Nice, 17k sub

  • @Concreteblockmachineug
    @Concreteblockmachineug 1 day ago

    When you ask the network if it's a 7 or not..does it initially have a stored reference parameter set to compare with in order to know if the given photo is of a 7?

  • @pentasquare
    @pentasquare 19 days ago

    "Wait it's all maths?"
    "Always has been"

  • @wolfgangsell3233
    @wolfgangsell3233 21 days ago +1

    W video! Can you show off your code for the beginners here?

  • @user-pb2fp8el9q
    @user-pb2fp8el9q 21 days ago +1

    Amazing video bro. Nice explanation, truly 👍

  • @enerz9135
    @enerz9135 17 days ago

    Your Explanation Is Very Simple, It's Very Easy To Understand

  • @sakethsaketh750
    @sakethsaketh750 19 days ago

    Cool video

  • @CatBoxOfficial
    @CatBoxOfficial 16 days ago

    Underrated youtuber ngl

  • @johnyohan7022
    @johnyohan7022 6 days ago

    i have a physics end sem exam tmr, this has no connection to that yet here i am watching this

  • @helved807
    @helved807 21 days ago +1

    This was super interesting! Mainly because this is something I've wanted to do myself, but I've had some troubles implementing backpropagation. Any tips on how to implement it?

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 21 days ago +2

      *Disclaimer: in writing this explanation I assumed you understand calculus, but all the steps are broken down and the equations are solved fully.
      In this video, you were introduced to the loss function. This function does what it says: it quantifies the difference, or incorrectness, of your Neural Network.
      Ideally, you want your loss function to be 0 or close to 0. Assume your loss function is either categorical cross-entropy (like in this video) or a more common approach: Mean Squared Error (which has the formula 1/2(y - o)^2, where y is your expected output and o is your actual output).
      *In case you want to paste this into an appropriate calculator/document, and because I've been practicing, here is the LaTeX version of that equation: \frac{1}{2}\left(y-o\right)^{2}.
      Since these functions are never negative, you can assume that your loss function reaches 0 at its absolute minimum (usually you'll hit a local minimum for complicated problems).
      For instance, we can tell whether a function "f" has hit a minimum or a maximum by checking whether f'(x) = 0 (same as 0 = d/dx f(x)), i.e. whether its slope is equal to 0.
      Since the function is rising wherever the slope is positive and falling wherever the slope is negative, we can find the x position of the local minimum by iteratively changing x with x -> x - f'(x) * lr (lr being the learning rate).
      x usually represents the input to the function, but since we want to change the weights and not the input, we can treat the input to your network as constant and the weights as the thing being optimized.
      Since you've already done the forward pass, you likely already know that a network can be structured like this: o = f(W * x), where o is the output, f is your activation function, W is the weights matrix, and x is your vector of inputs.
      Assuming you're using Mean Squared Error, let's find the loss's derivative with respect to your weights. The entire function is 1/2(y - f(W * x))^2.
      The chain rule tells us that we can find the derivative dE/dW (E being the cost function) by computing dE/do * do/dW (see how the do terms cancel).
      *For the sake of visuals, this is the LaTeX equation: \frac{dE}{do} \cdot \frac{do}{dW}
      dE/do works out to (o - y), i.e. (f(W * x) - y), and do/dW is f'(W * x) * x. That means dE/dW is (f(W * x) - y) * f'(W * x) * x.
      So now you can update your weights with W -> W - lr * dE/dW. And for multiple layers, you need to pass dE/dx back to the previous layer, which in this case is (f(W * x) - y) * f'(W * x) * W (notice how you multiply by the weights instead of the input).
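      A tiny sketch of that x -> x - f'(x) * lr rule in code, using a made-up function f(x) = (x - 3)^2 purely for illustration:

      ```python
      def f_prime(x):
          return 2 * (x - 3)   # derivative of f(x) = (x - 3)^2, whose minimum is at x = 3

      x, lr = 0.0, 0.1
      for step in range(100):
          x = x - lr * f_prime(x)   # step downhill along the negative slope

      print(round(x, 4))  # approaches 3.0, where f'(x) = 0
      ```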

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 21 days ago +1

      I'd recommend making a class that deals with a single layer of the network. The class should have a method called backward (for backward pass or backpropagation, but you can call it whatever you want) that takes in the error and the inputs for that layer. In the method, multiply the error by the derivative of your activation function and the inputs and add that result to your weights. The method should also return the error multiplied by the derivative of your activation function BUT then multiplied by your weights (preferably before you update them). If you want, I can give you some code to reference.
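      A sketch of what such a layer class might look like (the names, the sigmoid activation, and the learning rate are all made up here, the inputs are cached during the forward pass instead of being passed to backward, and it uses the dE/do = (o - y) convention from the reply above, so gradients are subtracted):

      ```python
      # Hypothetical sketch of the layer class described above; not anyone's actual code.
      import numpy as np

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      def sigmoid_prime(z):
          s = sigmoid(z)
          return s * (1.0 - s)

      class Layer:
          def __init__(self, n_inputs, n_neurons, lr=0.1):
              self.W = np.random.randn(n_inputs, n_neurons) * 0.1
              self.b = np.zeros(n_neurons)
              self.lr = lr

          def forward(self, x):
              self.x = x                      # cache inputs for the backward pass
              self.z = x @ self.W + self.b    # pre-activation
              return sigmoid(self.z)

          def backward(self, error):
              # error is dE/do for this layer's output o
              delta = error * sigmoid_prime(self.z)    # dE/dz
              prev_error = delta @ self.W.T            # dE/dx, returned to the previous layer
              self.W -= self.lr * (self.x.T @ delta)   # gradient step on the weights
              self.b -= self.lr * delta.sum(axis=0)    # gradient step on the biases
              return prev_error
      ```

      A training loop would call forward through the layers, compute error = output - target, and then call backward on each layer in reverse order, feeding each layer the error returned by the one after it.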

    • @UrFada
      @UrFada 20 days ago +1

      @@SomethingSmellsMichy I love your explanation. I understood 70% of it even with little calculus and linear algebra background, as I'm in grade 0, but I started practicing calculus and linear algebra to make my own neural network. The only part I don't really get is the end, and I find it hard to code something unless I can fully visualize how it works, so it has been troublesome trying to fully understand it

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 20 days ago

      @@UrFada I can imagine that the end starts to become more complicated as it delves more into symbols. I was kinda trying to wrap it up because of the character limit on replies.

    • @UrFada
      @UrFada 20 days ago +1

      @@SomethingSmellsMichy Ahh yes, I will try to better visualize the comment later and maybe I will be able to understand, but overall thank you for the explanation

  • @rizamaeburlat8801
    @rizamaeburlat8801 4 days ago

    3:39 How the heck is that math, that's absolutely crazy, I bet it took people ages to work this out

  • @amunif_
    @amunif_ 18 days ago

    could you give a link to your code? github or colab notebook?
    I’ve learned the theoretical stuff for backpropagation and all the partial derivatives during my Masters but never coded them from scratch. Would like to learn how to apply all that math in code.

  • @TheFuture36520
    @TheFuture36520 11 days ago

    Bruh is implementing mathematical equations like Bernoulli's theorem and second-order differential equations 😂.
    You're the best brother ❤️ 💙

  • @user-pk8wi3of6t
    @user-pk8wi3of6t 13 days ago

    Hello @Green-Code, how did you do that...
    I am a beginner in machine learning, studying numpy, pandas and matplotlib
    How did you do that? I am also interested in data science but I am not finding the exact path..
    Please guide bro

  • @afrateam6241
    @afrateam6241 20 days ago

    Only a genius could understand how genius you are. Wow 🎉

  • @DuDe_DuDe760
    @DuDe_DuDe760 4 hours ago

    @Green-Code are you learning ML engineering from some university or something like that, and if so, can you give us the roadmap for it?

  • @codenediz
    @codenediz 21 days ago +1

    Nice Vid

    • @CrushedAsian255
      @CrushedAsian255 21 days ago

      google thinks this isn't english for some reason? maybe they need a new neural network

  • @sarimshaikh5224
    @sarimshaikh5224 7 days ago +1

    Sir, u dropped this sir 👑, please wear it from the next video

  • @MarshyMcOfficial
    @MarshyMcOfficial 20 days ago

    "its getting 40-50% accuracy it sucks"
    I know this seems bad greencode but you just taught a computer how to recognize things that we previously thought were only recognizable by humans. thats not bad. good job.

  • @eazypeazy8559
    @eazypeazy8559 20 days ago +13

    > from scratch
    > using a bunch of libraries

    • @Superdeeep
      @Superdeeep 16 days ago

      I wanna see you do it without pytorch, tensorflow or a machine learning library

    • @eazypeazy8559
      @eazypeazy8559 16 days ago

      @@Superdeeep literally did it in C and am finishing an article. good point here from u, now im thinking about a video

    • @hodayfa000h
      @hodayfa000h 16 days ago

      @@eazypeazy8559 How long did it take you? I wanna try it but it seems like a bad idea

    • @StaniG420
      @StaniG420 15 days ago

      @@Superdeeep Every Data Science student who passes at least the fourth semester is able to code a NN from scratch, it's not a crazy feat.

  • @Fatement
    @Fatement 8 days ago

    i thought you were going to do it in Scratch, the app with a fox thing 💀

  • @brahbah9349
    @brahbah9349 1 day ago

    Neural network *from* scratch: ❌
    Neural network *with* scratch: ✅

  • @Haven_Hue
    @Haven_Hue 6 hours ago

    Subscribed 🤫

  • @stayhappy-forever
    @stayhappy-forever 18 days ago +4

    Can you open source the code if you don't mind? I worked on the same project but there are some improvements I feel like I can make.
    Edit: I implemented SGD and got 94%-95% accuracy with 16 hidden neurons and only a single hidden layer (10 epochs). Is it possible for you to share your model architecture? Thanks!
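    For reference, a hypothetical sketch of the kind of architecture described (784 -> 16 -> 10 with plain SGD for 10 epochs); the data here is random stand-in data rather than MNIST, and none of this is the commenter's or the video's actual code:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.random((64, 784))                     # stand-in for flattened 28x28 images
    Y = np.eye(10)[rng.integers(0, 10, size=64)]  # one-hot stand-in labels

    W1, b1 = rng.standard_normal((784, 16)) * 0.01, np.zeros(16)
    W2, b2 = rng.standard_normal((16, 10)) * 0.01, np.zeros(10)
    lr = 0.1

    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    for epoch in range(10):
        for i in range(len(X)):                   # SGD: one example at a time
            x, t = X[i:i+1], Y[i:i+1]
            h = np.maximum(0, x @ W1 + b1)        # hidden layer (16 neurons, ReLU)
            p = softmax(h @ W2 + b2)              # output probabilities (10 classes)
            d2 = p - t                            # softmax + cross-entropy gradient
            d1 = (d2 @ W2.T) * (h > 0)            # backprop through ReLU
            W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
            W1 -= lr * (x.T @ d1); b1 -= lr * d1.sum(axis=0)
    ```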

    • @yds6268
      @yds6268 18 days ago

      Can you share yours? I know links in comments are impossible, but the repo name would be amazing

    • @stayhappy-forever
      @stayhappy-forever 18 days ago

      @@yds6268 Do you want the whole code? or do you just want an understanding/run through of what i made

  • @jakeboots
    @jakeboots 7 days ago +1

    damn boy you made it look so easy [it isn't]

  • @init_yeah
    @init_yeah 17 days ago

    So that's how you look without the Grey tv

  • @mkgamingdev
    @mkgamingdev 11 days ago

    Although I didn't understand any of it, the editing is awsm

  • @peashooterman3
    @peashooterman3 20 days ago

    have you ever credited carykh for making lazykh since you seem to use it for basically every video

  • @xxxvedang
    @xxxvedang 1 day ago

    i want a deep maths explanation video.. pretty pleaaaaaseeeeee... :)

  • @barionlp
    @barionlp 17 days ago

    try giving your MNIST model your own hand drawn digits
    (then tell me again how good it is)

  • @roleben3009
    @roleben3009 19 days ago

    is there a notebook for this video?

  • @RWX0_Extra
    @RWX0_Extra 11 days ago

    You should create your own programming language and do a neural network with it
    That's what i call scratch

  • @Gordy-io8sb
    @Gordy-io8sb 21 days ago

    The song that plays near the beginning is a poppy version of Blue Monday. Lol.

  • @CrushedAsian255
    @CrushedAsian255 21 days ago +1

    7:13 hopefully your dataset doesn't contain any babies or Vietnamese emblems this time

  • @21boomin
    @21boomin 11 days ago

    Does someone know a mathematical course for this? i want to learn it in detail

  • @comosaycomosah
    @comosaycomosah 19 days ago

    Who let this man cook up a neural network

  • @lwangacaleb2729
    @lwangacaleb2729 12 days ago

    Now do it in assembly

  • @suryak3040
    @suryak3040 6 days ago

    @Green-Code where is the source code for this? This is a nice tutorial, keep it up, and a little 👍 for your work.

  • @papierbndc
    @papierbndc 6 days ago

    4:00 let him cook,
    Let
    Him
    Cook

  • @brawldude2656
    @brawldude2656 20 days ago +1

    Backpropagation couldn't be explained simpler

  • @Garuda_Vigyan
    @Garuda_Vigyan 6 days ago

    Dear genius 🎉 great job
    Can u help make a capsule neural network for image processing? It's what I'm studying. Please put up a video with a code link.

  • @mind6861
    @mind6861 21 days ago

    Can we have the code to go through?

  • @charlotte80389
    @charlotte80389 21 days ago

    my mind hurts so damn much

  • @r4iden_
    @r4iden_ 10 days ago

    me watching this while working on my own ml assignment