L-2 Activation Functions in Deep Learning

  • Published on 31 Oct 2024

Comments • 45

  • @exoticcoder5365
    @exoticcoder5365 1 year ago

    Your teaching is so good! Other tutorials from other channels are not as clear as yours! Thank you 🙏🏻🙏🏻🙏🏻

  • @EngineeringSprits
    @EngineeringSprits 1 month ago

    Mam, I have watched so many deep learning courses, but your teaching is the most amazing of all of them.

    • @CodeWithAarohi
      @CodeWithAarohi 1 month ago

      @@EngineeringSprits Glad my videos helped you 🙂

  • @sonyvenugopal6295
    @sonyvenugopal6295 3 months ago

    so good....very nice explanation

  • @tanvirbadsha1629
    @tanvirbadsha1629 2 years ago +1

    Great video. Loved the explanation.

  • @muhammadafzaalkhan9277
    @muhammadafzaalkhan9277 2 years ago +1

    Great work mam.

  • @ghantaharshith504
    @ghantaharshith504 1 year ago +1

    in the unit / binary step activation slide there is an error: the threshold is represented as 0>x and x<0; it should be 0>x and x>=0

  • @JwanKAlwan
    @JwanKAlwan 3 years ago +4

    thank you for providing such information, but I have one note: in slide no. 3 the value of x in both equations is shown as less than the threshold (0>x in both cases)

    • @hanae.health
      @hanae.health 2 years ago +1

      I think that's an error in the slide.

    • @arnabpramanik5703
      @arnabpramanik5703 8 hours ago

      yes, mistake from her side
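A minimal sketch of the unit / binary step activation discussed in this thread, assuming a threshold of 0 (the corrected conditions are x < 0 and x >= 0):

    import numpy as np

    def binary_step(x):
        # 0 for x < 0, 1 for x >= 0 (threshold at 0)
        return np.where(x < 0, 0, 1)

    print(binary_step(np.array([-2.0, 0.0, 3.5])))  # [0 1 1]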

  • @aneerimmco
    @aneerimmco 4 months ago

    informative! thanks

    • @CodeWithAarohi
      @CodeWithAarohi 4 months ago

      Glad it was helpful!

  • @VIKASHKUMAR-rl3pn
    @VIKASHKUMAR-rl3pn 5 months ago

    softmax I understand now, thank you mam

  • @dipankarbarman6395
    @dipankarbarman6395 1 year ago

    In slide 4 the function table is incorrect for the negative value. Overall, a very good teaching video.

    • @CodeWithAarohi
      @CodeWithAarohi 1 year ago

      Thank you for letting me know about that typo. And glad you liked the video.

  • @devilzwishbone
    @devilzwishbone 1 year ago +1

    Is sigmoid (output of 0-1), as an analogue output, not equivalent to normalising data or a percentage of the input?

    • @CodeWithAarohi
      @CodeWithAarohi 1 year ago +1

      A sigmoid function maps any input value to an output value between 0 and 1. It is commonly used in machine learning and neural networks to represent probabilities or the likelihood of an event.
      While a sigmoid function can be used to normalize data or to represent a percentage of the input, it is not equivalent to these concepts. Normalization is a preprocessing step that transforms the values of a dataset to a common range, such as 0 to 1 or -1 to 1, typically to make the data more suitable for machine learning algorithms.
      A sigmoid function, on the other hand, models the relationship between the inputs and outputs of a system. It is commonly used in neural networks as an activation function, which determines the output of a neuron based on the inputs it receives.
      In summary, a sigmoid function can be used to model probabilities or likelihoods, but it is not the same as normalization or representing a percentage of the input.
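An illustrative sketch of the distinction made in the reply above: the sigmoid squashes each value independently through 1 / (1 + e^-x), while min-max normalization rescales values relative to the dataset's own minimum and maximum (the numbers here are made up for illustration):

    import numpy as np

    x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])

    # Sigmoid: element-wise squashing into (0, 1), independent of the other values
    sigmoid = 1 / (1 + np.exp(-x))

    # Min-max normalization: rescaling relative to this dataset's own min and max
    normalized = (x - x.min()) / (x.max() - x.min())

    print(sigmoid)     # ≈ [4.54e-05, 0.269, 0.5, 0.731, 0.99995]
    print(normalized)  # [0.   0.45 0.5  0.55 1.  ]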

  • @JayPatel-ou2ud
    @JayPatel-ou2ud 2 months ago

    Great, great, thanks mam

  • @MATHSADDA4U
    @MATHSADDA4U 2 months ago

    Mam, you said that the sigmoid function has values between 0-1, but for sigmoid(-10) it had a value greater than 1. Why?

    • @radhagarg2518
      @radhagarg2518 1 month ago

      On that particular slide the output is not fully written. Check the next slide where the sigmoid code is implemented: the output for sigmoid(-10) is 4.539...e-05, which means it lies between 0 and 1.

    • @MATHSADDA4U
      @MATHSADDA4U 1 month ago

      @@radhagarg2518 ok mam, thank u
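A quick check of the value mentioned in the reply above (not the video's exact code): sigmoid(-10) is a very small positive number, so it does lie between 0 and 1:

    import math

    def sigmoid(x):
        return 1 / (1 + math.exp(-x))

    print(sigmoid(-10))  # ≈ 4.5398e-05, i.e. just above 0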

  • @pifordtechnologiespvtltd5698
    @pifordtechnologiespvtltd5698 7 months ago

    👍👍

  • @arnabpramanik5703
    @arnabpramanik5703 12 days ago

    Mam, can you provide the PPT or notes of these lectures? It would be more helpful for us.

    • @CodeWithAarohi
      @CodeWithAarohi 11 days ago

      Hi, I made these videos a long time ago. I don't have the related PPTs now.

    • @arnabpramanik5703
      @arnabpramanik5703 4 days ago

      @@CodeWithAarohi Mam, please make a new deep learning series with notes.

  • @prasanthdeventhiran4159
    @prasanthdeventhiran4159 8 months ago

    hi mam, how do I create my own datasets?

    • @CodeWithAarohi
      @CodeWithAarohi 8 months ago

      For image classification problems, just create a folder named "dataset" and then, inside that folder, create separate folders for each class, for example a class "pen" and another class "pencil". Now download images of pens and pencils from the internet and paste them into their corresponding folders.
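A rough sketch of the folder layout described above, together with one common way to load it; the tf.keras.utils.image_dataset_from_directory call and the folder/class names are only an example, not something shown in the video:

    # Expected layout:
    # dataset/
    #   pen/      <- images of pens
    #   pencil/   <- images of pencils

    import tensorflow as tf

    # Class labels ("pen", "pencil") are inferred from the sub-folder names
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset",
        image_size=(224, 224),
        batch_size=32,
    )
    print(train_ds.class_names)  # ['pen', 'pencil']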

  • @AYMENCHANNEL4U
    @AYMENCHANNEL4U 9 months ago

    Can I have your slides?
    please

  • @anuragshrivastava7855
    @anuragshrivastava7855 3 years ago

    Why are leaky ReLU and ELU not part of this video?

    • @CodeWithAarohi
      @CodeWithAarohi 3 years ago

      Sorry for that. I have covered leaky ReLU and mish activation functions, with the algorithms, in other videos.
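Since this video does not cover them, here is a minimal sketch of the usual definitions of leaky ReLU and ELU (standard textbook formulas, not taken from the channel's slides):

    import numpy as np

    def leaky_relu(x, alpha=0.01):
        # Like ReLU, but keeps a small slope alpha * x for negative inputs
        return np.where(x >= 0, x, alpha * x)

    def elu(x, alpha=1.0):
        # Smoothly saturates towards -alpha for large negative inputs
        return np.where(x >= 0, x, alpha * (np.exp(x) - 1))

    print(leaky_relu(np.array([-2.0, 3.0])))  # [-0.02  3.  ]
    print(elu(np.array([-2.0, 3.0])))         # ≈ [-0.8647  3.    ]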

  • @hanae.health
    @hanae.health 2 years ago

    Ma'am, can you please share the slides?

  • @rafsanjane4309
    @rafsanjane4309 1 year ago

    ❤️💚😘