Who's here one night before the exam👍👍😭😭
An hour before exam ._.
15 days before exam
Day before exam preparing for semester😥😥😥
Just 9 hrs before the exam but feeling sleepy so just commenting sleeping 😴
Dude just explained the shit my 60-year-old univ teacher couldn't in her entire career.
Real
Thank You So Much really explained in a systematic way...GOD BLESS YOU.
RIP at 3:43
You are awesome, dear!
I love this accent, but the explanation is even better
Thank you so much, clear-cut explanation with great examples!!
Man you are doing well like really well explained
Very Nice Explanation 👍
Thank you so much for clear explanation 👍
Can you do an example just showing how to find the threshold outside of any logic gate? That is, inputs are given, weights for each input are given, and then we want to find the output. (No logic gate involved.)
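For anyone asking the same thing, here is a minimal sketch of how the output of a McCulloch-Pitts style neuron would be computed once inputs, weights and a threshold are given. The specific numbers and the firing rule (output 1 when the net input reaches the threshold) are assumptions for illustration, not taken from the video.

```python
# Minimal sketch: compute a McCulloch-Pitts neuron output from given
# inputs, weights and threshold (no logic gate involved).
# Assumed firing rule: y = 1 if y_in >= theta.

def mcp_output(inputs, weights, theta):
    # Net input: weighted sum of the inputs.
    y_in = sum(x * w for x, w in zip(inputs, weights))
    # Fire (output 1) only if the net input reaches the threshold.
    return 1 if y_in >= theta else 0

# Example with made-up numbers: three inputs, mixed weights, threshold 1.
inputs = [1, 0, 1]
weights = [1, 1, -1]   # the third input acts as an inhibitory connection
theta = 1
print(mcp_output(inputs, weights, theta))  # y_in = 1*1 + 0*1 + 1*(-1) = 0 -> prints 0
```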
excellent way of teaching... wonderful
You have written the truth table of XOR.
Good job! If you could have also provided a PDF file of this notebook, that would've been great.
To the point man, nice explanation 😉
What is k? Can you explain? In my book there is k: kw > theta > (k-1)w, like this?
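One possible reading, assuming the book follows the common convention where k is the number of active excitatory inputs (each with weight w) needed to make the neuron fire, and the firing rule is y = 1 when y_in >= theta: requiring the neuron to fire on k active inputs but stay silent on k-1 gives

```latex
% Sketch of where the inequality may come from (assumed rule: y = 1 iff y_in >= theta):
k\,w \;\ge\; \theta \;>\; (k-1)\,w
```

Whether each inequality is strict or not depends on the exact firing rule your book uses.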
Good job. Thank you.
thank you so much.... you are so good at explaining...
thank you so much
For the first case, if you set the value of theta >= 2, then the neuron will fire. Then why do we go for the second case by taking w1=1 and w2=-1?
Chetan Raj you need the neuron to fire only for the case (1,0)
@@gautamsardana4368 I know, but if we take theta >= 2, then also it will fire, right?
@@chetanraj1950 It will fire, sure. But it will fire only for (1,1). You want it to fire for (1,0) only, since AND-NOT of (1,0) is 1 and 0 for the rest.
@@gautamsardana4368 Yes, I understood. Thank You.
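A small sketch of the comparison discussed in this thread, assuming binary inputs and the firing rule y = 1 when y_in >= theta (the exact convention in the video may differ): with w1 = w2 = 1 the cases (0,1) and (1,0) give the same net input, so no threshold can separate them, while w1 = 1, w2 = -1 with theta = 1 fires only for (1,0), which is what AND-NOT needs.

```python
# Sketch: why (w1, w2) = (1, 1) cannot realise AND-NOT but (1, -1) can.
# Assumed firing rule: y = 1 if y_in >= theta.
from itertools import product

def fires(x1, x2, w1, w2, theta):
    return int(x1 * w1 + x2 * w2 >= theta)

target = {(0, 0): 0, (0, 1): 0, (1, 0): 1, (1, 1): 0}  # AND-NOT truth table

for w1, w2 in [(1, 1), (1, -1)]:
    for theta in range(-2, 4):  # try a few integer thresholds
        table = {x: fires(*x, w1, w2, theta) for x in product([0, 1], repeat=2)}
        if table == target:
            print(f"w1={w1}, w2={w2}: theta={theta} works")
            break
    else:
        print(f"w1={w1}, w2={w2}: no threshold in range works")
```

Running this prints that no threshold works for (1, 1) but theta = 1 works for (1, -1).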
beautifully explained
Nice video
Please make a video implementing the AND and OR functions using the MCP model
Here at 4:41, how does the value of theta become 1? Because theta = (2*1*(-1)) = 2?
You are considering only the magnitudes of w and p,
i.e. w = |w| = |1| = 1,
p = |p| = |-1| = 1,
so now n*w - p = 2(1) - 1 = 1
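Restating the reply above as a single worked line, with everything taken by magnitude as the reply does (n = 2 inputs, excitatory weight of magnitude w = 1, inhibitory weight of magnitude p = 1; the firing rule y = 1 when y_in >= theta is an assumption, not confirmed by the video):

```latex
% Worked threshold from the magnitudes quoted in the reply above:
\theta \;=\; n\,w - p \;=\; 2(1) - 1 \;=\; 1, \qquad y = 1 \ \text{iff}\ y_{in} \ge \theta
```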
I NEED
the NOR, NAND and XOR gate functions using the McCulloch-Pitts model, urgent
thank you soo much!
thanks for helping
good handwriting
Thank you 😊
Do we use weights in the McCulloch-Pitts neural network?
I guess... no, we do not
At 4:40, you said that we would not consider the sign. I thought that we were counting the number of negative weights, that is, 1. Or are we supposed to sum the magnitudes of the negative weights?
thank you so much, very nice explanation :)
nicely explained.. keep it up
Good one, and one more request: can you make another video for XOR too?
Please suggest weights for AND implementation
You can assume both weights to be excitatory.
@@btechtutorialByNishantMittal but it won't provide a solution, as both excitatory weights will produce three y_in values greater than 0
Only the value of y_in corresponding to the case when both inputs are 1 will be greater than 0. For the other cases it will be 0, 0 and -2 (assuming you are taking the input values to be bipolar)
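A quick check of the numbers in the reply above, assuming bipolar inputs (-1/+1), both weights excitatory (w1 = w2 = 1), and the firing rule y = 1 when y_in >= theta (assumptions for illustration; the video's convention may differ):

```python
# Sketch: AND with bipolar inputs and both weights excitatory.
# y_in is 2 only for (1, 1); the other cases give 0, 0 and -2,
# so a threshold of 2 fires the neuron for (1, 1) alone.
w1, w2 = 1, 1
theta = 2
for x1 in (-1, 1):
    for x2 in (-1, 1):
        y_in = x1 * w1 + x2 * w2
        y = 1 if y_in >= theta else 0
        print(f"x1={x1:+d} x2={x2:+d}  y_in={y_in:+d}  y={y}")
```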
I understood the concept. But how are you deciding the weights? Which input gets an excitatory weight and which one gets an inhibitory weight? You just made 2 assumptions, the 1st (w1=1, w2=1) and the 2nd (w1=1, w2=-1), which worked. Machine learning doesn't work on assumptions, and it would take forever if there were 1000s of weights, so I would like to know which algorithm decides these weights and how. Please do make a video on that.
Yeshwanth Reddy you have to try different combinations of (1, -1); there is no fixed way
@@noelthomasbejoy3089 that will be 2^(no. of weights) combinations, which increases exponentially as the no. of weights increases. It's impossible to compute such a large number of combinations. There must be some other smart way to achieve this....🤔.
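To make the growth concrete: if every weight is restricted to being either excitatory (+1) or inhibitory (-1), the number of sign assignments to try is 2 to the power of the number of weights. The +1/-1 restriction here is an assumption for illustration.

```python
# Sketch: trying every excitatory/inhibitory assignment is 2**n, which explodes.
for n in (2, 10, 30, 100):
    print(f"{n} weights -> {2 ** n} sign combinations to try")
```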
very well explained , keep up the good work
Thank you sir
Thanks bro. keep it up
You used the wrong values for the weights. For inhibitory inputs the weight must be 1 and for excitatory it should be 0.
But how was the AND-NOT logic performed, (1,0) = 1??
The ANDNOT gate calculates the function A AND NOT B. Therefore it has a value of 1 when A is 1 and B is 0, and 0 for all other cases.
@@btechtutorialByNishantMittal means the output is A·B̄
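For reference, a minimal sketch of the plain Boolean function being implemented, independent of any neuron:

```python
# AND-NOT (A and not B): 1 only when A = 1 and B = 0.
for A in (0, 1):
    for B in (0, 1):
        print(A, B, int(A == 1 and B == 0))
```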
Fake
Can the threshold be negative and decimal for cases like NOT?
I am watching this..and my exam is after 1 hour 😁
Who is watching this video in lockdown?
Suggest weights for NOR gate.
Nice and satisfying, but the model for the logic gate that you drew at the end should have been more informative.
how do you choose the weights for other examples? Is there any procedure??
You will have to assume weights and see if the neuron fires for a particular case. There is no fixed procedure.
Thank you
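The "assume weights and check" procedure described above, sketched as a brute-force search and applied to the NOR gate. Binary inputs, candidate weights in {-1, 0, 1}, integer thresholds, and the firing rule y = 1 when y_in >= theta are all assumptions for illustration; the video's conventions may differ.

```python
# Sketch: brute-force the trial-and-error procedure for a NOR neuron.
from itertools import product

target = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}  # NOR truth table

for w1, w2, theta in product((-1, 0, 1), (-1, 0, 1), range(-2, 3)):
    table = {(x1, x2): int(x1 * w1 + x2 * w2 >= theta)
             for x1, x2 in product((0, 1), repeat=2)}
    if table == target:
        print(f"found: w1={w1}, w2={w2}, theta={theta}")
        break
```

Under these assumptions the search finds w1 = -1, w2 = -1, theta = 0.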
Hello, thank you for the better understanding, but I have a query that I request you to resolve. For the implementation of the NOR gate using the McCulloch-Pitts model, can we take a different approach: instead of replacing w11 and w12, can we do the following in order to get the same proof? Row 1: w11 = 1 ; w12 = 1 ; deta1 = 1
Row 2: w11 = 1 ; w12 = 1 ; deta1 = -0.5
Row 3: w11 = 1 ; w12 = 1 ; deta1 = -0.5
Row 4: w11 = 1 ; w12 = 1 ; deta1 = -2.5. Please!
Bro, why are you in such a hurry? Do you have somewhere to go? 😂😂
I don't understand why the first case can't fire the neuron correctly
In the first case, you cannot find a suitable threshold for which the output neuron fires only for input (1,0) (this is what we want the neuron to do to simulate AND-NOT logic). Any threshold you choose will either make the neuron fire for other inputs as well or not fire for (1,0) at all (this does not simulate AND-NOT logic).
How is theta >= 1 wrong since it fires only those cases where it's supposed to fire?
@@krishnakrmahto97 I don't understand.
@@EntropyOnTheCrosshair In the first case, it will fire for cases other than what is desired for theta >= 1.. in the second case it's fine.
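The same point as a short argument, assuming the firing rule y = 1 when y_in >= theta: with w1 = w2 = 1, the requirements for AND-NOT contradict each other.

```latex
% With w1 = w2 = 1 (assumed rule: y = 1 iff y_in >= theta):
\text{fire on } (1,0):\; \theta \le 1, \qquad \text{stay silent on } (0,1):\; \theta > 1
```

Both cannot hold at once, so no single threshold isolates (1,0); with w1 = 1, w2 = -1 the two cases get different net inputs and theta = 1 works.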
I still have some doubts about this concept; could you please resolve them?
Mike McCullough - are you involved in this? Do you agree??
Thanks man
You have made the NAND gate's truth table wrong
It's nw+p, bro
Even at normal speed the video feels like it's playing at 2x
thanks bro
Do we have "w" in MP Neuron...???
Sorry, I didn't get your question.
Weight from Xi node to Y
some people consider weights and some people don't
Good explanation, but it could have been better.
Please reduce the speed.
Hey! You can use the Playback speed option in settings and set it to 0.75...
what a noob😂
Watching it during the exam 😂😂
7 minutes before exam🥀💀
Sir, lower your speed a little
Well done, you absolute lion!
Thank you
Keep going
Who’s here one night before exam 🤣🤣
How can we find the value of p?
assume it
Need more problems please
The narration is very fast; it needs high concentration to understand
You need a brain too.
@@partharanke5925 Good one xD LOL
easily solved by slowing down the playback speed.
Everything is perfect except that pencil, please sharpen it
1 hrs before exam 😅
Just in online exam 🤣🤣
PLEASE PLEASE PLEASE ANSWER, IT'S REALLY IMPORTANT -
Can the threshold take negative values in the McCulloch-Pitts neuron model?
👍👍👍
Why do you explain so quickly.. are you busy?!
KT fix
ECE 449
Please try to teach in Hindi more.. I'm not saying this because of a language issue, but while trying to teach in English you are just reading what is already written there!
1 day 😂
Please explain it properly before solving the question...
Wrong truth table and a rushed explanation.
Bro, if it were just about reading from the notes, I could have read them myself
Fraud