What is an RBM (Restricted Boltzmann Machine)?

  • Published Oct 26, 2024

Comments • 46

  • @martinfunkquist5342 · 1 year ago · +27

    What is the difference between an RBM and a regular feed-forward network? They seem quite similar to me.

    • @kristoferkrus · 1 year ago · +18

      An RBM sends signals both "forwards" and "backwards" during inference, uses contrastive divergence to learn the weights, and does not involve a loss function; a feedforward network only sends signals forwards during inference and uses backpropagation with gradient descent to learn the weights, which requires a loss function. Besides that, the RBM is energy-based (it has an energy function, which can be said to take the place of a loss function) and follows (a simplified version of) the Boltzmann distribution (one that doesn't include k and T), so it is stochastic, while a feedforward network isn't energy-based but is instead deterministic.
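The contrastive-divergence learning mentioned in this reply can be sketched in a few lines. This is a minimal illustrative CD-1 sketch, not anything from the video; all names, sizes, and the training pattern are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) update; returns new (W, b, c)."""
    # Positive phase: infer hidden probabilities from the data vector.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # stochastic hidden sample
    # Negative phase: one Gibbs step back to the visible layer and up again.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Move statistics toward the data and away from the model's reconstruction.
    W = W + lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b = b + lr * (v0 - v1)
    c = c + lr * (ph0 - ph1)
    return W, b, c

# Toy run: 6 visible units, 4 hidden units, one repeated training pattern.
W = rng.normal(0.0, 0.1, size=(6, 4))
b, c = np.zeros(6), np.zeros(4)
v = np.array([1., 0., 1., 0., 1., 0.])
for _ in range(200):
    W, b, c = cd1_step(v, W, b, c)
```

Note there is no loss function anywhere: the update compares data statistics against reconstruction statistics, which is exactly the "no loss, no backprop" point made above.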

    • @samcoding · 6 months ago · +3

      Someone correct me if I'm wrong, but at a simpler, higher level than what @kristoferkrus said: feedforward networks just take an input, pass it to the hidden layer(s), and produce an output. In that order and direction.
      RBMs take an input and pass it to the hidden layer. Then the hidden layer passes it back to the visible layer to generate the output.
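That round trip can be shown in two lines of matrix math. A sketch with made-up toy numbers, using deterministic mean activations for simplicity (a real RBM samples stochastically):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Made-up toy parameters: 3 visible units, 2 hidden units.
W = np.array([[ 1.0, -1.0],
              [ 0.5,  0.5],
              [-1.0,  1.0]])
b = np.zeros(3)  # visible biases
c = np.zeros(2)  # hidden biases

v_in = np.array([1.0, 1.0, 0.0])
h = sigmoid(v_in @ W + c)      # "forward": visible -> hidden activations
v_out = sigmoid(h @ W.T + b)   # "backward": hidden -> reconstructed visible
```

The same weight matrix `W` is used in both directions (transposed on the way back), which is why the RBM's "output" is a reconstruction of its input rather than a separate output layer.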

  • @ahmedsowdagar9034 · 2 years ago · +9

    I was constantly searching for examples of what the visible layer and hidden layer are. This video explained it to me. Thanks

  • @nemeziz_prime · 2 years ago · +6

    It'd be great if IBM could make a dedicated deep learning playlist consisting of videos such as this

  • @Tumbledweeb · 1 year ago

    I have indeed made a decision; the one that brought me to this video! I'm here looking for photos of any of these Boltzmann Machines.

  • @erikslorenz · 2 years ago · +7

    I am incredibly motivated to build a light board

  • @vgreddysaragada · 1 year ago · +1

    You made it simple..elegant presentation..Great work..Thank you..

  • @INSIDERSTUDIO · 10 months ago · +2

    No one noticed, but the man is writing in reverse 😮 how hard did he train to do that 🔥🙌

    • @IBMTechnology · 10 months ago

      See ibm.biz/write-backwards for the backstory

    • @Abhilashaisgood · 4 months ago

      I think they mirrored the video. It's really cool: write something on glass, then open your selfie camera; from the back camera it looks reversed and from the front camera it looks the same!

  • @quocanhnguyen7275 · 1 year ago · +17

    So bad, doesn't say anything. This is just any neural network

  • @svanvoor · 10 months ago

    Either (a) this guy is very good at mirror writing, or (b) they mirrored the video after recording. Given he's writing with his left hand, and given the .9 probability of right-handedness as a Bayesian prior, I assume P(b)>P(a).

  • @siddharthagrawal8300 · 2 years ago · +3

    This just sounds like a neural network without any output?

  • @pavanpandya9080 · 1 year ago

    Beautifully Explained. Thank you for the Video!

  • @amishajain3400 · 2 years ago

    Beautifully explained Thank You!

  • @tanishasethi7363 · 2 years ago

    I love how he's smiling throughout the vid

  • @freespam9236 · 1 year ago

    watching "AI Essentials" playlist - no recommendations engine in the play right now

  • @oualda12 · 2 years ago · +2

    Thanks for the video. I'm new to this domain, and I want to ask: if the RBM has only two layers (visible and hidden), how can we get the output of this RBM? Should we add an output layer to get the result, or what?
    Thank you again.

  • @high_fly_bird · 2 years ago

    Charismatic speaker! But I think the theme of hidden layers is not clear enough - hidden layers usually are not interpreted. Maybe you were talking about hidden layers? And it would be cool if you actually gave an example of WHAT exactly is passed to the visible layer. Numbers? Which numbers?

  • @mathewssaiji5149 · 2 years ago

    waiting for your recommendation system and explainable recommendation system videos

  • @hitarthpanchal1479 · 2 years ago · +4

    How is it different from standard ANNs?

    • @MartinKeen · 2 years ago · +3

      Thanks for watching, Hitarth. Basically, the thing that makes an RBM different from a standard artificial neural network is that the RBM has connections that go both forwards and backwards (the feed-forward pass and feed-backward pass), which makes an RBM very adept at adjusting weights and biases based on observed data.

    • @andreaabeliano4482 · 2 years ago

      At a very high level, one main difference is that ANNs are typically classifiers: they need labels in order to train and learn the weights of the edges.

    • @apostolismoschopoulos1876 · 2 years ago

      @@andreaabeliano4482 Using RBMs, are we not interested in the weights of the edges? Aren't the final weights the probabilities that someone who watches video A will then watch video B? Am I understanding this correctly?

    • @Scriptum_1 · 2 years ago

      @@apostolismoschopoulos1876 Similar to an ANN, in an RBM we're *TOTALLY* interested in adjusting the weights and biases. And yes, a trained net (weights, biases) will tell us about the probabilities of the visible units after sampling.

    • @Scriptum_1 · 2 years ago · +4

      As someone said, the main difference is that an ANN is supervised learning with targets to predict, while an RBM is an unsupervised learning method. Other differences:
      Objective: ANN -> learns a complex function, RBM -> learns a probability function
      What it does: ANN -> predicts an output, RBM -> estimates probable groups of variables (visible and latent)
      Training algorithm: ANN -> backpropagation, RBM -> contrastive divergence
      Basic principle: ANN -> decreases a cost function, RBM -> decreases an energy function (probability function)
      Weights and biases: ANN -> deterministic activation of units, RBM -> stochastic activation of units
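For reference, the energy function this comparison refers to fits in one line. A sketch of the standard binary-RBM energy, with made-up toy parameters (variable names are my own):

```python
import numpy as np

def rbm_energy(v, h, W, b, c):
    """E(v, h) = -b.v - c.h - v^T W h, the standard binary RBM energy."""
    return -(b @ v) - (c @ h) - (v @ W @ h)

# Lower energy corresponds to higher unnormalized probability exp(-E(v, h)).
W = np.array([[2.0, 0.0],
              [0.0, 2.0]])
b = np.zeros(2)
c = np.zeros(2)

agree = rbm_energy(np.array([1., 1.]), np.array([1., 1.]), W, b, c)
disagree = rbm_energy(np.array([1., 1.]), np.array([0., 0.]), W, b, c)
assert agree < disagree  # the aligned configuration has lower energy
```

Training "decreases the energy function" in the sense that it shapes E so that configurations resembling the data end up with low energy (high probability).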

  • @bzqp2 · 2 years ago · +1

    Wait. So are the weights summed up to activate the nodes in the hidden layer, or does the sum represent the probability of activating a node?

    • @Scriptum_1 · 2 years ago

      Both. The weights and biases are used to estimate the hidden units by sampling p(h|v), or to estimate the visible units by sampling p(v|h). So although we can speak about activation of units, it is not a deterministic process. And on the other hand, the weights and biases, along with the hidden and visible units, are used to calculate the energy of the system, which corresponds to a probability as well.

  • @alyashour5861 · 11 months ago · +1

    How is bro writing backwards perfectly the whole time

    • @IBMTechnology · 11 months ago

      See ibm.biz/write-backwards

  • @akaashraj8796 · 10 days ago

    how is he writing backwards?

  • @abdulsaboor2168 · 5 months ago

    How is it different from a simple ANN with backpropagation??

  • @vishnupv2008 · 2 years ago · +1

    In which neural network are nodes on a given layer connected to other nodes in the same layer?

    • @abdulelah.6999 · 9 hours ago

      Boltzmann Machine (BM)

  • @smallstep9827 · 2 years ago · +1

    sample?

  • @danielebarnabo43 · 2 years ago

    What the hell? Is this what you do when you're not brewing?

  • @PunmasterSTP · 4 months ago

    Restricted Boltzmann Machine? More like "Really cool network that's just the thing!" 👍

  • @zzador · 4 months ago

    You don't really know what you're talking about. The weights are just weights and NOT probabilities. The summed activity of a unit fed through the sigmoid function is the activation probability of a unit in an RBM, and definitely NOT the weights.

  • @Tom-qz8xw · 6 months ago

    Terrible explanation. What's the point of making these videos if you don't show equations? You didn't mention KL divergence or anything technical. Practically useless

  • @flor.7797 · 1 year ago

    😂