2:50 I don't like that you compared the energy usage of your brain recognizing a face to the hours of energy it takes a neural network to learn to do the same. A better comparison would have been the years it took your brain to develop those skills.
Some of us have a whole lifetime and still can't reliably tell two people apart, or reliably recognize someone if they change their hairstyle or put on a hat or similar... Otherwise, I found the presentation a bit too hand-wavy for my preferences.
@ ILYES I think the point still stands. Learning to ride a bicycle takes way, way less energy than the current techniques to make a machine do it. But let's actually talk about the face recognition part: most people are born with that ability; it is not something you learn, so you can't compare that. Doing the task is comparable, though. Facial recognition in the brain takes not even a whole watt, while GPUs take 100 watts to do the same. If you compare one single brain function to a very specific ASIC chip, then you can get better efficiency with that chip. But we know we can model/adapt our brain to do other things, and still be efficient. Therefore we know there is a better solution for more efficient computing than what we have right now.
Which would still be inaccurate. Because we do not spend every second of every day, employing the full capacity of our brain, developing that function. If you combined the effective time and "brain power" dedicated to improving facial recognition, it would be a very small fraction of our life.
I have applied to work under a professor on this topic.... Hopefully I'll get into this amazing world. This is an amazing field... Thanks for uploading this video
Incredible video! Thank you very much! I was working on TiOx memristors at the University of Southampton and their potential is unbelievable! Shame they need a massive investment in order to be fabricated! Thank you guys! Brought back incredible memories!
yikes, thank you so much for bringing this more into the spotlight! I did research on memristors during grad school, neuromorphic computing, the physical side of deep learning, really is the next big thing.
I haven't read the paper, and I'm probably not going to; it's way out of my field tbh. But I feel like an easier way to do this, without requiring the physical movement of ions, is to instead use something which changes the electron configuration of d and f orbitals to change the conductivity or magnetic properties of the ions in the current path. So you could hypothetically have, say, a manganese atom, and by sending it a little bit of energy (below the first ionization energy of the Mn(II) ion) you could read the resistance and see which state it's in. But if you increase the current you'll slowly start stripping electrons from it. So it starts at Mn(II), and each memory state is an electron configuration up to Mn(VI), giving five distinct memory states: Mn(II), Mn(III), Mn(IV), Mn(V) and Mn(VI). But since computers use binary, you could use Mn(II) and whichever other state best fits the currents and energies being used. So again, when you go to read it you get different resistances based on the ion's state. And the best part is that the state is easily re-writable: by increasing the energy to write the ion to its Mn(VII) state, you can then take advantage of the comproportionation reaction of Mn(II) and Mn(VII) plus a few electrons to reset all of the Mn ions to the 2+ state. I'm a biochemist who's done some inorganic chemistry, and I'm not sure this will work; I have no idea about the chemistry of computer chips. But I'm basing it on the water-splitting complex from photosynthesis, which kind of does this.
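For what it's worth, the multi-level idea in the comment above can be sketched in code. This is a toy model only: the resistance value assigned to each oxidation state is invented for illustration, not real chemistry.

```python
# Toy model of the multi-level memory idea: each oxidation state of a
# Mn ion is treated as one stored level, read out as a distinct
# resistance. All numbers are illustrative, not real device physics.

# Hypothetical resistance (ohms) per oxidation state II..VI.
STATE_RESISTANCE = {2: 1e6, 3: 5e5, 4: 2e5, 5: 8e4, 6: 3e4}

def write(state):
    """'Write' by selecting an oxidation state (II..VI)."""
    if state not in STATE_RESISTANCE:
        raise ValueError("unsupported oxidation state")
    return state

def read(state, v_read=0.1):
    """'Read' with a small probe voltage: measure the current, then
    infer which stored level best matches the implied resistance."""
    current = v_read / STATE_RESISTANCE[state]
    measured_r = v_read / current
    return min(STATE_RESISTANCE,
               key=lambda s: abs(STATE_RESISTANCE[s] - measured_r))

def reset(cells):
    """Comproportionation-style reset: everything back to Mn(II)."""
    return [2 for _ in cells]

cells = [write(s) for s in (2, 4, 6)]
assert [read(s) for s in cells] == [2, 4, 6]
assert reset(cells) == [2, 2, 2]
```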
First time I've seen this guy... What a legend, the way he explains things in layman's terms is awesome... Someone buy this man a coffee... I want to learn more from him...
If you think about it, we've had magnetic core memory technology, which is even older than CMOS (and used magnetic hysteresis), and modern phase-change memory technology that achieves a very similar result, albeit a bit power hungry!
It sounds like the system designs a hardware algorithm for every "experience." And our lifelong collection of these algorithms is what's recognized as intelligence.
Love this!!! Just my vote, but please do more on Neuromorphic Computing. No CS or CE background, just an interested fan of the field, and I’ve long been interested in this subject. Thanks!
Exactly my current PhD topic, thanks for the video ;) 11:30 I'm not sure about the benefit of I/V non-linearity and asymmetry; it seems to be more of an issue than a feature.
One can think of it as a fuse, but where the resistance slowly changes with time as we let current flow through it. Whether it is a 2- or 4-terminal device is an important question, though. 2-terminal memristors increase resistance as current flows in one direction, and reduce resistance as it flows in the other. (To measure the resistance without changing it, we use an AC signal instead.) 4-terminal memristors, meanwhile, have one set of terminals for changing the resistance, and another set of terminals for using the resistor. To a degree, a field-effect transistor can be used as a 4-terminal memristor, due to the fact that it has inherent capacitance and can therefore hold a charge on its gate, making it able to remember what state we left it in. Normally, though, the gate is connected to the output of a transistor stage, which will force it to one of two possible values. But we can implement a floating gate that can be left to hold its charge. An example of this would be Flash memory cells. (There we use floating gates to store data, which we can read out later. In NAND Flash we also take the additional step of being able to bias the cells to turn on regardless of the charge state of their floating gate, so that we can create a much more compact array of cells.)
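The 2-terminal behaviour described above can be sketched as a tiny simulation: resistance drifts with the charge that has passed through the device, rising for one current direction and falling for the other. A minimal illustrative model, with all parameter values arbitrary:

```python
# Minimal sketch of a 2-terminal memristor: resistance drifts in
# proportion to the charge (current * time) that has flowed, clamped
# between physical limits. Parameter values are arbitrary illustrations.

class Memristor:
    def __init__(self, r=500.0, r_min=100.0, r_max=1000.0, k=50.0):
        self.r, self.r_min, self.r_max, self.k = r, r_min, r_max, k

    def apply(self, voltage, dt):
        """Pass current for dt seconds; return the current that flowed."""
        current = voltage / self.r
        # Resistance drift is proportional to charge passed.
        self.r += self.k * current * dt
        self.r = max(self.r_min, min(self.r_max, self.r))
        return current

m = Memristor()
for _ in range(10):
    m.apply(+1.0, 1.0)   # positive current -> resistance rises
assert m.r > 500.0
for _ in range(30):
    m.apply(-1.0, 1.0)   # reverse current -> resistance falls
assert m.r < 500.0
```

Reading without disturbing the state would correspond to a small AC probe whose net charge over a cycle is zero, as the comment notes.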
Good video! My bachelor thesis was actually about converting traditional neural networks to spiking neural networks so that they work on neuromorphic hardware.
@@karlkastor Very interesting!! I'm about to choose my bachelor's thesis topic as well. What topic would you recommend? Have you heard of any more interesting than that? Because I haven't!! 😁
@@Hellas11 For my theses I usually ask multiple PhD students at my university if they can recommend a topic and then choose the most interesting one. For my master thesis, I'm currently trying to see what neural network architectures work best on small medical datasets. But if you wanna do something about converting normal neural networks to spiking neural networks, you might wanna look into binary neural networks or other types with few-valued activation (discrete neural networks?). These have the best shot to be more efficient to compute than traditional neural networks.
@@karlkastor Thank you for your valuable answer. Can binary neural networks or other types of networks with few-valued activations work as alternatives to spiking neural networks? Maybe a benchmark is needed?
I like how this field of research is a blend of chemistry, biology, electronic engineering and computer science
And later, when all this nonsense has stopped, perhaps we'll dig it up some day adding Archaeology to that list.
The future is more interdisciplinary and I like that =P
if you go down enough, biology becomes chemistry and if you go down even more, everything becomes physics.
@@Biped it's all applied math
@@rewrose2838 true. does that mean that being a mathematician qualifies you to be a doctor? :D
I'd love to hear more about this, on one condition: we get this man more coffee
And a helmet to protect his skull from the over-excited motor function. Although I guess he'll be first in line for a neuromorphic replacement.
I could listen to him talk all day.
It wasn't coffee, it was weed!!
yeah. couldn't find a course on YouTube or anything
@@ivanguerra1260 When I drink coffee, people think I'm high on speed
This is an interesting research field I did not know the existence about up until now. I propose follow-up videos from Dr Phil Moriarty because this is too interesting!
Lex Fridman recently had a very interesting interview on this topic if you want something longer to listen to
Dr. Phil Moriarty sounds like a fictional character from a Japanese comic book in English...
@@phs125 He is a grand nephew of James Moriarty (if you know what I mean ;)
@@Matthew-kn7xi Thank you, I will definitely watch the video today!
I hope Dr. Phil Moriarty uses more Rush examples in the follow up videos
This is one of the most interesting, weird and cool areas of computer science in my opinion
It's so unconventional by current standards and yet when you think of first principles I think we see it's a welcome addition that should help cover a slew of problems that are ill-addressed by current systems.
This is legit the future!
It's really an area of computer engineering
@@paulleimer1218 For now, while it's experimental, yes, although you can already find applicable problems in e.g. discrete maths. But if/when it becomes a real market, then re-thinking computation outside von Neumann machines is likely to trigger whole new classes of common pipelines, such as compilers, or notions of big-O complexity. It's bound to become a particular sub-domain within CS eventually if neuromorphic computing takes hold. And I haven't even touched on the network science, which is currently perhaps the poorest aspect in all of applied CS, being largely dealt with in stochastic rather than formally true paradigms (e.g. routing, still relying on tables at the end of the day, not pure logic).
This is physics, not CS
yes please go into more detail about neuromorphic computing
This is really fascinating, but the parallels to neural circuitry were extremely overstated here. In an actual biological synapse, there's a huge diversity of proteins at work that do much more than simply tune the 'strength' of a connection in a straightforward way. Also: single synapses often employ multiple transmitters; receiving synapses have differing receptors that respond differently to the same transmitter; synapses give themselves feedback about their own activity; different types of neurons have different patterns of baseline firing in addition to action potentials; etc.
In more general terms: Nothing beyond the most primitive neural circuits is comprehensively and computationally 'understood'. Even though Prof. Moriarty talks about the lack of division between memory and 'processing' in humans, this is not exactly correct (there are dedicated 'memory areas' whose destruction or impairment has grave consequences for information storage and the capacity to learn, but they are not by themselves necessary or sufficient for the acquisition or recall of every type of memory). Nobody, in fact, can rigorously characterize where memory 'is' localized in the brain. But again, it is bound to be a combination of macroscopic regions, local synaptic information, temporal firing patterns etc.
An informative video still, and I of course understand why these details and others were glossed over. But taking the neural analogy too seriously is kind of a bad idea, in my opinion, at the current level of knowledge.
I was cringing as he described short-term plasticity and long-term potentiation. His blog post is also full of strange statements like "neurons combine (co-locate) memory and processing", which is not actually true. That sort of question goes back to McCulloch and Pitts, who showed that you actually have structures of connected neurons which can do logical processing together, not individually, with the synaptic weighting between these units serving as the memory of the system (Freeman).
I think that the implication is not that these devices exactly mimic the behavior of the brain, but that they can be "trained" as typical neural networks can in order to more closely mimic it. I would imagine that during training there would indeed be separation and grouping of these small neuron-alikes to form larger structures. I do, however, have some major issues with CMOS being called energy hungry. It's only energy hungry because it can be, not because it has to be. Right now computing is based on flat chips with a limited surface area and a fixed cost, and one thing you can do to increase the computing power is to drive the circuits faster. Frequency is a large part of the power consumption of chips, and part of the reason that phones are so power efficient is that they have multiple cores which run at 1 GHz or less. GPUs are actually fantastically power efficient for their raw amount of performance because they have huge numbers of cores working in parallel at lower frequencies. Now imagine if you could fabricate an entirely integrated CPU with modern transistor density, not only in 2D but in the full 3D volume equivalent of a human brain, and then have the code to utilize that many parallel processors. You could drive them at 1 MHz or less and still crush most other devices in highly parallel tasks. However, right now that technology would be horrendously expensive, and maybe even impossible at our stage of technological progression. It will probably require nanoscale 3D printing to achieve anything close to that, which is fairly similar to how brains are formed (in a broad sense).
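The frequency-vs-parallelism argument above can be put in rough numbers using the standard dynamic-power relation for CMOS, P ≈ α·C·V²·f. The capacitance, voltage, and activity-factor values below are purely illustrative assumptions, not measurements of any real chip:

```python
# Back-of-envelope version of the argument: dynamic CMOS power scales
# as P ~ activity * C * V^2 * f, so many slow cores can match the
# cycle throughput of one fast core at lower power. Numbers illustrative.

def dynamic_power(c_farads, v_volts, f_hz, activity=0.1):
    """Dynamic switching power of a CMOS block (watts)."""
    return activity * c_farads * v_volts**2 * f_hz

# One core at 4 GHz vs 4000 cores at 1 MHz (same total cycles/second).
# Lower frequency typically permits a lower supply voltage, which is
# where the quadratic V^2 term pays off.
fast = dynamic_power(1e-9, 1.2, 4e9)
slow = 4000 * dynamic_power(1e-9, 0.8, 1e6)

assert fast > slow   # the parallel, low-frequency design wins on power
```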
"Memory" can mean several things, for example the memory of a event in the brain, which could be something like "data memory".
Or it can mean: you remember how to drum, walk, talk, etc. This is more like "program memory".
I think that "neurons combine (co-locate) memory and processing" refers to this "program memory"... the algoritms used to process data.
Indeed, it's more like processing and memory are the same thing in neural nets.
On the other hand, neural nets also store "data memory" as they are pattern recognition devices.
It's all about Spheroidal Cows...
It’s almost as if he’s doing a 13 minute video and can’t completely explain a whole scientific field! What’s more he specifically says in the video that he will be leaving stuff out. Then absolute geniuses turn up in the chat to try to prove how smart they are by listing all the things he didn’t explain. Absolutely cringe.
A highly caffeinated Professor Moriarty is a joy to behold. He was going 100 miles a minute, stopped on a dime to answer you and then took off again without missing a beat.
I love this guy's energy and ability to explain such a vast and deep field to a common layman with no prior knowledge of these things, and without math at that. Amazing, please keep these videos going and give Phil more coffee 😂
Ray Kurzweil predicted that we will achieve AGI by 2029; I think he is right.
If you have the opportunity, then it would be awesome with more videos on this topic in the future.
Yes! Please do more videos about it :)
Vote +1 very interested
I love listening to people talk about something they're passionate about... And I can see that he's passionate... I love this... I love learning from people like this...
Dr Moriarty is usually pretty passionate and energetic, but he's totally wired in this one! Caffeine is a helluva drug. Love his passion for the field and his eloquence in breaking down complex topics.
He is so enthusiastic about his research! I ❤️ his energy.
GIVE US MORE OF THIS! Seriously this is a fascinating area of research. I'd love to see many more videos about this topic.
This man is so excited, it's like he's on the verge of a breakthrough. Love his enthusiasm!
Drumming might not have been the best analogy. Drummers are usually working with fewer synapses :)
Is this a good musical joke or is this a sick burn ?
Haake, T., Garstka, M. et al. would like a word with you
The drummer might have locked his keys in the car, but it's the bass player that's stuck inside. 🔥
I always trust a man who's this excited and who likes John Bonham.
I had to watch the first two minutes twice, because I was geeking out on his shirt.
I could listen to this guy all day has such enthusiasm
I'd love more videos about this field! Super interesting and love the enthusiasm this guy has!
Dude, I have been waiting SOO long for someone to talk about neuromorphic computing on YouTube.
This is the Dr. Phil I would rather watch.
Ok this was AMAZINGLY interesting and well explained. Please we need more on this field !
What about FPGAs?
I've been working on simulating memristors as artificial synapses and I can really feel this man excitement
Moriarty preaching about Neuromorphic computing gives me great hope, as I'm a physics student but I love concepts like these and am glad to know studying physics won't necessarily keep me away!
The Dr Phil actually worth watching.
People like that, with an approach and hyper-passion like that, are what give me chills. This stuff is fascinating. Thank you so much.
Now that NPUs are becoming mainstream and available in laptops/smartphones/GPUs, this video deserves an update, because there are like ZERO videos other than this that describe how NPUs actually work. Prior to this, all I knew is that they somehow leveraged Kirchhoff's law, but seeing that memristor diagram put it into perspective.
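For anyone curious, the Kirchhoff's-law trick alluded to above is usually described like this: in a memristor crossbar, Ohm's law at each cell and Kirchhoff's current law at each column together compute a matrix-vector product in one analog step. A minimal sketch of the arithmetic (plain Python; the conductance and voltage values are made up for illustration):

```python
# Crossbar matrix-vector multiply: apply voltages V to the rows; the
# current collected on each column is the sum of G[i][j] * V[i] over
# rows i (Ohm's law per cell, Kirchhoff's current law per column).

def crossbar_mvm(conductances, voltages):
    """conductances[i][j]: cell at row i, column j (siemens);
    voltages[i]: voltage on row i. Returns column currents (amps)."""
    n_cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i]
                for i in range(len(voltages)))
            for j in range(n_cols)]

G = [[1.0, 0.5],
     [2.0, 1.0]]          # programmed memristor conductances
V = [0.3, 0.1]            # input voltages on the rows

I = crossbar_mvm(G, V)    # one analog "step" = a matrix-vector product
assert I == [1.0 * 0.3 + 2.0 * 0.1, 0.5 * 0.3 + 1.0 * 0.1]
```

In hardware this sum happens physically on the column wire, which is why the multiply-accumulate comes "for free" in energy terms.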
Wow! I never came across such a fascinating research field; my mind is totally blown. The way Dr Phil Moriarty explains it shows his passion and enthusiasm for this field, thank you Dr Phil. What I've observed is that academia generally doesn't support such path-breaking research; we need more private industrial research so that we can at least try and experiment with such unique ideas without having to worry about lack of funds. If possible please make more videos on this topic, thanks again for making such an awesome, informative video :)
"You don't have a lot of liquid in a laptop"
Coffee has entered the chat.
Coffee is not exactly a liquid, but a mixture of various solutions and suspensions of solids.
As a materials science master's student, I'm writing my thesis on neuromorphic computing this year. I'm insanely excited, to say the least.
Memristors are useful for standard computing too, so there are multiple funding reasons for their development.
I always love Dr. Moriarty's energy and delivery.
I can feel Dr Phil's energy. It's amazing.
Beautiful explanation. Imagine the fulfillment that engulfs researchers in this field; it must be intoxicating.😀
Yes.. Yes please do more of these
That Zeppelin shirt is fantastic! More videos with Phil!!! Also, nice to see info about the electrical hysteresis loop. Should make a computerphile vid on the history of the Law of Hysteresis discovered by Charles Proteus Steinmetz. It's EXTREMELY computing-relevant.
The fact that we can (potentially) mimic the function of a biological, wet, neurone using solid state matter is absolutely mind blowing
The most valuable video I've watched in 2021. Really! I felt like I wasn't wasting my time.
I'd love it if this could be a series. Also love the way he introduces papers that might be interesting to check out.
I love the enthusiasm of Dr Phil Moriarty! Inspiring! Keep on producing such awesome work!
One additional aspect of the memristor's ability to remember its past states via the vacancies in the synapses of the board is the Toffoli gate itself.
Each input neuron and each output neuron on the board has its own ancilla bits inside its Toffoli gate, which means it's an actual qubit that can help remember the last state of the synapse.
As a second year physicist, this really helps me decide where to go and what to do...computer brains it is!
9:58 But this one goes to 11!
This sounds like one of the most important topics covered here, this is amazing! Would love to see more of this.
Very well presented and, as always, Dr Moriarty's infectious enthusiasm made me want to buy one (and I don't even know what it is)
I would definitely watch a series on this topic
Please make a sequel! All those juicy materials too: transition metal dichalcogenides (TMDCs), OxRAM, CBRAM, ferroelectric systems, floating-gate transistors. Then one more episode on short-term plasticity, long-term memory consolidation, and Prof JJ Yang's short-term plasticity temporal coding paper. Mind blown.
Interesting field of research that I didn't even know existed. Would very much love to see more content on this!
Brady/Sean we need more videos on neuromorphic computing!!
Since 2017, when I was a senior in high school, I have been trying to find something related to the nervous system that could hugely improve the management of information and conductivity in the brain. The answer is neuromorphic computing + memristors. I have been researching neuromorphic computing since that year, and the potential it has is amazing.
Of course, a lot of things get added with a lot of techniques. Non-invasive brain-machine interface devices are going to be the future of humanity, in my opinion. These "artificial synapses" could be made by simulating the Central Nervous System (CNS) and somehow making the electrodes the memristors. And the question is how. One of my suggestions is using ultrasound waves controlled by EMG signals. Ultrasound waves can be steered, and of course the signal would not be perfect because the EMG device would be outside the skull, but there are electronic devices that can correct this with the help of AI.
It's not easy, because each neuron has a different behavior, function and location, so it would be a huge thing to work out in a neural network.
Currently I'm doing my major in Biomedical Engineering but I know I'm a newbie on these type of things but I have an extreme interest on this neuromorphic computing.
Wow! We need more of Dr. Moriarty!
Neurons that fire together, wire together.
Thanks for decomposing the memristor to its bare essence.
Please do more videos on this topic!!!
I just started graduate studies in computational neuroscience/machine learning! This is super cool and I want more videos!
Regular activation patterns lower the firing threshold, so your brain can compute faster afterwards.
Constant firing slows neurons down to avoid over-stimulation; like when you stare at a bright light, it gradually appears darker.
It also comes down to chemical resources in the brain: being "smart" means being energy- (i.e. chemically) efficient.
The day we fully understand our brain will be the day we make the biggest jump in how to correctly design better computational architectures.
It’s fascinating that the more we mimic nature with our technology the more advanced our technology gets, it’s almost as though nature itself is its own form of highly sophisticated technology.
Please Computerphile, do more of these videos, this is a very exciting topic :)
Ok so this is equally fascinating and terrifying. Please give us more.
Isn't this what renowned computer scientist Miles Dyson was working on way back in the day?
Indeed, memristor technology is actually the product of a bootstrap paradox in which the technology was discovered from the remnants of a murderous robot from the future.
@@mushin111 Someone should make a movie about that!
Nice video as usual from all these Nottingham channels :) so YES PLEASE, MOAR on this! I'm in Switzerland next to EPFL and I've heard about the Blue Brain Project, but I didn't think others were so close to developing a memristor.
So our consciousness is computational? Confirmed? Also, yes, more videos please. You're one of the best channels on YouTube.
This will have a big impact on everything.
Hi guys definitely would like to hear more about this topic.
Thanks and keep up the excellent work
10:20 I'm sure it's unintentional, but I love how the box has the "Intel Inside ™" logo on the side and it's mimicking a memristor ahahah
I absolutely want more of this
2:50 I don't like that you compared the energy usage of your brain recognizing a face to the hours it takes a neural network to learn to do the same. A better comparison would have been the years it took your brain to develop those skills.
Some of us have a whole lifetime and still can't reliably tell two people apart, or reliably recognize someone if they change their hairstyle or put on a hat or similar... Otherwise, I found the presentation a bit too hand-wavy for my preferences.
@ILYES I think the point still stands. Learning to ride a bicycle takes way, way less energy than the current techniques to make a machine do it.
But let's actually talk about the face recognition part: most people are born with that ability; it is not something you learn, so you can't compare that.
Doing the task is comparable, though. Facial recognition in the brain takes less than a watt, while GPUs take 100 watts to do the same.
If you compare one single brain function to a very specific ASIC chip, then you can get better efficiency with that chip. But we know we can adapt our brain to do other things and still be efficient. Therefore we know there is a more efficient way to compute than what we have right now.
Which would still be inaccurate, because we do not spend every second of every day employing the full capacity of our brain to develop that function. If you add up the effective time and "brain power" dedicated to improving facial recognition, it would be a very small fraction of our lives.
I have applied to work under a professor on this topic... Hopefully I'll get into this amazing world. This is an amazing field... Thanks for uploading this video!
where?
Good luck! 🥰 This is a fascinating topic to study!
Please, more content on this topic!
This makes so much sense. Amazing. Please make more!!
Incredible video! Thank you very much!
I was working on TiOx memristors at the University of Southampton, and their potential is unbelievable!
Shame they need massive investment in order to be fabricated!
Thank you guys! Brought back incredible memories!
I was absolutely engrossed in this video because of the content... I need more... PLEASE??
Finally Neuromorphic computing is getting more recognition!!
This is fascinating, please more.
Yikes, thank you so much for bringing this more into the spotlight! I did research on memristors during grad school; neuromorphic computing, the physical side of deep learning, really is the next big thing.
I definitely want to hear more about neuromorphic computing.
I haven't read the paper, and I'm probably not going to; it's way out of my field, tbh. But I feel like an easier way to do this without requiring the physical movement of ions would be to use something that changes the electron configuration of d and f orbitals, altering the conductivity or magnetic properties of the ions in the current path. So you could hypothetically have a manganese atom, and by sending it a little bit of energy (below the first ionization energy of the Mn(II) ion) you could read the resistance and see which state it's in. But if you increase the current, you'll slowly start stripping electrons from it. So it starts at Mn(II), and each memory state is an electron configuration up to Mn(VI), giving five distinct memory states: Mn(II), Mn(III), Mn(IV), Mn(V), and Mn(VI). Since computers use binary, though, you could just use Mn(II) plus whichever other ion best fits the currents and energies involved. Again, when you go to read it, you get different resistances based on the ion's state. And the best part is that the state is easily re-writable: by increasing the energy to write the ion to its Mn(VII) state, you can take advantage of the comproportionation reaction of Mn(II) and Mn(VII) (plus a few electrons) to reset all of the Mn ions to the 2+ state. I'm a biochemist who's done some inorganic chemistry; I'm not sure this will work, and I have no idea about the chemistry of computer chips. But I'm basing it on the water-splitting complex from photosynthesis, which kind of does this.
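The scheme above can be sketched as a toy state machine. To be clear, this is purely illustrative: the resistance values and the energy threshold are made up, and none of this reflects real device chemistry.

```python
# Toy model of the proposed oxidation-state memory cell.
# All numbers are invented for illustration; this is not real chemistry.

class OxidationStateCell:
    # Five distinct states, Mn(II) through Mn(VI), each with a
    # hypothetical resistance so the state can be read back.
    STATES = ["Mn(II)", "Mn(III)", "Mn(IV)", "Mn(V)", "Mn(VI)"]
    RESISTANCE = {"Mn(II)": 100, "Mn(III)": 250, "Mn(IV)": 500,
                  "Mn(V)": 900, "Mn(VI)": 1500}  # ohms, arbitrary

    WRITE_THRESHOLD = 1.0  # energy per "ionization" step, arbitrary units

    def __init__(self):
        self.state = 0  # start at Mn(II)

    def read(self):
        """Low-energy read: return resistance without changing state."""
        return self.RESISTANCE[self.STATES[self.state]]

    def write(self, energy):
        """High-energy pulse strips one electron per threshold crossed."""
        steps = int(energy // self.WRITE_THRESHOLD)
        self.state = min(self.state + steps, len(self.STATES) - 1)

    def reset(self):
        """Model the Mn(II)/Mn(VII) comproportionation reset:
        every ion returns to the Mn(II) state."""
        self.state = 0

cell = OxidationStateCell()
print(cell.read())   # 100 (Mn(II))
cell.write(2.0)      # two steps: Mn(II) -> Mn(IV)
print(cell.read())   # 500
cell.reset()
print(cell.read())   # 100 again
```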
Super interesting stuff; would definitely be interested in more of this field.
I’m currently doing my PhD in neuromorphic computing; it definitely is the way forwards.
Please do more on this field. I would definitely watch that
Damn, he hits his head often :D
He's finding carriage returns without line feeds in his inner monologue and has to supply them from the outside.
Very much like how neurotransmitters change probability of a neuron action potential propagating...wow...this is amazing...
3:45 Phil Moriarty: if computerphile viewers are interested in this we can do more videos on this.
me: we are!
First time I've seen this guy...
What a legend. The way he explains things in layman's terms is awesome. Someone buy this man a coffee... I want to learn more from him.
Definitely want more clips on this field!
If you think about it, we've had magnetic-core memory, which is even older than CMOS and which used magnetic hysteresis, and the modern phase-change memory technologies achieve a very similar result, albeit a bit power-hungry!
Please do more neuromorphic computing videos
I love your enthusiasm. Thanks for making this so interesting.
More on this topic!! This video was so hype
Phil reminds me so much of one of the monologues in the movie 'Waking Life'
Great work!! Please keep uploading content about this subject
Yes! More videos on neuromorphic computing please!
It sounds like the system designs a hardware algorithm for every "experience." And our lifelong collection of these algorithms is what's recognized as intelligence.
Yes. More please.
Love this!!! Just my vote, but please do more on Neuromorphic Computing. No CS or CE background, just an interested fan of the field, and I’ve long been interested in this subject. Thanks!
Exactly my current PhD topic, thanks for the video ;)
11:30 I'm not sure about the benefit of the I/V non-linearity and asymmetry; it seems to be more of an issue than a feature.
One can think of it as a fuse, but one where the resistance slowly changes over time as we let current flow through it.
Whether it is a two- or four-terminal device is an important question, though.
Two-terminal memristors increase resistance as current flows in one direction and decrease it as current flows in the other. (To measure the resistance without changing it, we use an AC signal instead.)
Four-terminal memristors, meanwhile, have one set of terminals for changing the resistance and another set for using the resistor.
To a degree, a field-effect transistor can be used as a four-terminal memristor, since it has inherent capacitance and can therefore hold a charge on its gate, making it able to remember what state we left it in. Normally the gate is connected to the output of a transistor stage that forces it to one of two possible values, but we can implement a floating gate that is left to hold its charge. An example of this is the Flash memory cell. (There we use floating gates to store data that we can read out later. In NAND Flash we take the additional step of being able to bias cells to turn on regardless of the charge on their floating gate, so that we can build a much more compact array of cells.)
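The polarity-dependent behaviour described above (resistance drifting one way or the other depending on current direction) can be sketched with a simplified linear-drift model in the spirit of the HP Labs memristor. The parameter values here are invented for illustration, not taken from any real device.

```python
# Minimal linear-drift memristor sketch (after the HP Labs model).
# r_on/r_off are the fully-doped/undoped resistances; x in [0, 1]
# is the internal state variable that drifts with current.
# All values are illustrative, not real device parameters.

def simulate(currents, dt=1e-3, r_on=100.0, r_off=16e3, k=1e4):
    """Integrate the internal state x; the resistance is a linear mix
    of r_on and r_off. Positive current lowers resistance, negative
    current raises it back: the polarity dependence."""
    x = 0.5
    history = []
    for i in currents:
        x = min(1.0, max(0.0, x + k * i * dt))  # linear ion drift, clamped
        r = r_on * x + r_off * (1.0 - x)
        history.append(r)
    return history

# Push current one way and resistance falls; reverse it and it rises.
forward = simulate([1e-3] * 50)    # 50 positive current pulses
backward = simulate([-1e-3] * 50)  # 50 negative current pulses
print(forward[-1] < forward[0])    # True: resistance decreased
print(backward[-1] > backward[0])  # True: resistance increased
```

In a real two-terminal device the "read without disturbing" trick mentioned above corresponds to using a small AC signal, whose net drift over a cycle averages out to roughly zero.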
That is absolutely fascinating. I really would like to know more about these neuromorphic architectures and how they work.
My favourite lecturer on youtube by far.
Maybe it's because I'm a guitarist and play some drums.
Luv and Peace.
Good video! My bachelor thesis was actually about converting traditional neural networks to spiking neural networks so that they work on neuromorphic hardware.
So Izhikevich's work :) Very well known book.
@@wlmorgan Did he do any work on converting traditional networks trained with normal backpropagation into spiking networks?
@@karlkastor Very interesting!! I'm about to choose my bachelor's thesis topic as well. What topic would you recommend? Have you heard of any more interesting than that? Because I haven't!! 😁
@@Hellas11 For my theses I usually ask multiple PhD students at my university if they can recommend a topic and then choose the most interesting one. For my master thesis, I'm currently trying to see what neural network architectures work best on small medical datasets. But if you wanna do something about converting normal neural networks to spiking neural networks, you might wanna look into binary neural networks or other types with few-valued activation (discrete neural networks?). These have the best shot to be more efficient to compute than traditional neural networks.
@@karlkastor Thank you for your valuable answer. Can binary neural networks or other networks with few-valued activations work as alternatives to spiking neural networks? Maybe some benchmarking is needed?
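The "few-valued activation" idea from this thread can be sketched in a few lines: a dense layer whose activation is hard-thresholded to {0, 1}, loosely analogous to a neuron spiking or staying silent. The weights and inputs below are random toy values, not from any real network or paper.

```python
import random

def binary_layer(x, w, b):
    """Dense layer followed by a hard step activation.
    x: list of input rows; w: list of per-unit weight vectors;
    b: list of per-unit biases. Output entries are 1 ("spike") or 0."""
    out = []
    for row in x:
        pre = [sum(xi * wij for xi, wij in zip(row, col)) + bj
               for col, bj in zip(w, b)]
        out.append([1 if p > 0 else 0 for p in pre])
    return out

random.seed(0)
x = [[random.gauss(0, 1) for _ in range(3)] for _ in range(4)]  # 4 samples
w = [[random.gauss(0, 1) for _ in range(3)] for _ in range(5)]  # 5 units
b = [0.0] * 5

out = binary_layer(x, w, b)
print(len(out), len(out[0]))  # 4 5: one binary vector per sample
```

This only covers the forward pass; actually training such a network (e.g. with straight-through gradient estimators) is the hard part the thesis discussion is pointing at.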