Photonic Neuromorphic Computing: The Future of AI?

  • Published on 13 Jun 2024
  • Photonic computing processes information using light, whilst neuromorphic computing attempts to emulate the human brain. Bring the two together, and we may have the perfect platform for next generation AI, as this video explores.
    If you like this video, you may also enjoy my previous episodes on:
    Organic Computing:
    • Organic Computing
    Brain-Computer Interfaces:
    • Brain-Computer Interfaces
    More videos on computing and related topics can be found at:
    / explainingcomputers
    You may also like my ExplainingTheFuture channel at: / explainingthefuture
    REFERENCES
    de Lima (2019) ‘Machine Learning With Neuromorphic Photonics’, Journal of Lightwave Technology, Vol.37, Issue 5 (March 7). Abstract and access options at: ieeexplore.ieee.org/document/...
    Feldmann, J. et al (2021) ‘Parallel convolutional processing using an integrated photonic tensor core’, Nature, Vol.589 (January 6). Abstract and access options at: www.nature.com/articles/s4158...
    Fourtané, S. (2021) ‘New Applications of Photonics for Artificial Intelligence and Neuromorphic Computing’, Medium (January 30). Available from: / new-applications-of-ph...
    Intel neuromorphic computing web pages: www.intel.co.uk/content/www/u...
    Intel silicon photonics web pages: intel.com/siliconphotonics
    Ríos, C. et al (2019) ‘In-memory computing on a photonic platform’, Science Advances, Vol.5, No.2 (February 15). Available from: advances.sciencemag.org/conte...
    Sebastian, A. (2021) ‘Light and in-memory computing help AI achieve ultra-low latency’, IBM Research Blog (January 7). Available from: www.ibm.com/blogs/research/20...
    Sebastian, A. & Bhaskaran, H. (2019) ‘In-Memory Computing Using Photonic Memory Devices’, IBM Research-Zurich Publications blog post (February 15). Available from: www.ibm.com/blogs/research/20...
    Shastri, B.J. et al (2021) ‘Photonics for artificial intelligence and neuromorphic computing’, Nature Photonics, Vol.15 (January 29). Abstract and access options at: www.nature.com/articles/s4156...
    The Engineer (2021) ‘Photonic processor heralds new computing era’ (January 7). Available from: www.theengineer.co.uk/photoni...
    Chapters:
    00:00 Introduction
    00:57 Photonic computing
    05:18 Neuromorphic computing
    07:31 Neuromorphic photonics
    11:24 Future hardware
    #PhotonicComputing #NeuromorphicComputing #Photonics #ExplainingComputers
  • Science & Technology

Comments • 638

  • @quimblyjones9767
    @quimblyjones9767 3 years ago +107

    The way you are able to compress complicated ideas or concepts down into easily digestible bytes... It really is astounding.
    Thank you for making this channel.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +18

      You're very welcome!

    • @Reziac
      @Reziac 3 years ago +7

      And without dumbing it down, too. The mark of a great teacher.

    • @gusgone4527
      @gusgone4527 3 years ago +4

      His manner and style is that of a very good British teacher. I bet he is even better with a live class of students giving real time feedback. So watch, listen and learn from a master.

    • @slimeball3209
      @slimeball3209 2 years ago +1

      @@Reziac In fact, when we understand something, we compress complicated ideas down to their meaning (like iron ore: there are far too many atoms to store individually, so we just need to know that the piece of ore contains certain atoms and structures, and that they repeat).
      But in the process of understanding, we really do store a lot of "trash" data in the brain.

    • @Reziac
      @Reziac 2 years ago

      @@slimeball3209 Tell me about it. My brain saves everything... and can't be arsed to index it. So when one datum is processed, piles of junk data swirl up from sludge...

  • @zebop917
    @zebop917 3 years ago +112

    Intel’s choice of Loihi (Lo-eee-he) as a name is quite subtle. It’s presently an undersea volcano on the flank of the Big Island of Hawai’i and hidden to view but in the future it’ll be a giant island all of its own.

    • @ianchan2624
      @ianchan2624 3 years ago +1

      You mean the Pacific garbage patch's reclamation?

    • @synoptic1505
      @synoptic1505 2 years ago

      It means, by the idea of petite androgynous boy children

    • @maloxi1472
      @maloxi1472 1 year ago

      @@ianchan2624 Nope, but you already knew that 🤷🏻‍♀

  • @aloysiussnailchaser272
    @aloysiussnailchaser272 3 years ago +27

    This is exceptionally high quality stuff. It reminds me of what the OU used to broadcast on BBC2 in the 1970s. I used to get up in the early hours to watch it, before we had a VCR. Not just computing, but physics, English, maths, history, chemistry. Whatever was on.

    • @Pulsonar
      @Pulsonar 10 months ago +1

      Yep, me too. I'm a 70s kid, and I derived a strange masochistic pleasure from those dry OU lectures at night/early morning on BBC2 😂 This was well before the BBC Micro show in the early 80s put domestic and school computing on the map in the UK. The great old days 😉

  • @jxchamb
    @jxchamb 3 years ago +59

    Why do I have a feeling that this one is going to make my brain hurt a little bit? Man how I look forward to 9:00 a.m. EST on Sunday

    • @kevinshumaker3753
      @kevinshumaker3753 3 years ago

      EDT

    • @jxchamb
      @jxchamb 3 years ago

      @@kevinshumaker3753 Yes. I forgot to say EDT this time of year.

    • @gklinger
      @gklinger 3 years ago +2

      If your brain hurts a little it is because you’re using it correctly.

    • @DarthVader1977
      @DarthVader1977 3 years ago +3

      I like ketchup.

    • @jxchamb
      @jxchamb 3 years ago

      @@DarthVader1977 Mayo for the win.

  • @johnphilippatos
    @johnphilippatos 3 years ago +18

    Nothing better than watching a video on a difficult subject that provides its research sources. Shows that the creator of the video did his homework.

  • @Uniblab8
    @Uniblab8 3 years ago +23

    I hope I live long enough to experience some of this. It's terribly exciting!

    • @leendert1100
      @leendert1100 3 years ago +3

      Terrible indeed, not exciting at all.

    • @Prakhar_Choubey
      @Prakhar_Choubey 3 years ago +2

      @@leendert1100 It's completely alright if you can't understand it. It's rather complicated and requires some background. So no worries. But..... It's awesome!

  • @jk-ml5fb
    @jk-ml5fb 3 years ago +28

    When we get to the year 2030, Christopher will still not have aged one bit, and then we'll realize he has been computer generated all this time.

  • @Kevin-mx1vi
    @Kevin-mx1vi 3 years ago +18

    When I saw the subject of this video I thought I could hear it whooshing clean over my head, but Chris explained it so well that I actually understood it !
    Dunno whether I'm more surprised at Chris or myself, but I definitely learned something today ! 😀

  • @lastinline1958
    @lastinline1958 3 years ago +12

    This sounds a lot like the premise for "Terminator".

  • @duncanmcneill7088
    @duncanmcneill7088 3 years ago +3

    Back in the late 70’s, when I studied electronics at University, we were imagining neural networks based on a self modifying, distributed type structure using a hybrid of analog and digital techniques. The technology didn’t exist back then.
    Digital processing took over and got us to the point we are today and now, 40 years later, the technology may be finally catching up.

  • @Grandwigg
    @Grandwigg 3 years ago +6

    I love the way Chris stated his conclusion: 'not overtaking, but sitting side by side with von Neumann architecture... as best fits the use case'. I wish we'd see more of this with new technologies, rather than the unrestrained hype like with graphene, carbon nanotubes, Teflon, and so on.
    Loved this video.

  • @madworld.
    @madworld. 3 years ago +34

    Even Star Trek didn't dare use such a name 😂😂😂
    A fascinating subject, indeed

    • @janglestick
      @janglestick 3 years ago +3

      Apparently, positronic networks = photonic neuromorphic networks + Tasha Yar

    • @saalkz.a.9715
      @saalkz.a.9715 3 years ago +3

      Duotronic circuitry and the M-5 Multitronic unit (TOS)... Isolinear chips and positronic neural networks (TNG)... Synthetic bio-neural gel packs (Voy)... The Borg... Are we all a joke to you?

    • @victorarnault
      @victorarnault 3 years ago

      I Loved your comment

    • @victorarnault
      @victorarnault 3 years ago

      @@saalkz.a.9715 I liked your comment even more.

    • @MikaelMurstam
      @MikaelMurstam 3 years ago

      @@janglestick No, positronics use positrons, which are positive electrons (anti-electrons).

  • @warrengibson7898
    @warrengibson7898 3 years ago +2

    I’m reminded why this is my favorite channel on all of YouTube. Tinkering with SBCs one week, looking at cutting-edge R&D the next.

  • @Crobisaur
    @Crobisaur 3 years ago +6

    Working in my photonics lab in grad school, I've always hoped to see the day computer architectures change to enable new use cases for photonic computing.

  • @thispandaispurple
    @thispandaispurple 3 years ago +1

    Sunday morning coffee and an interesting explainingcomputers video to watch! Off to a great start today!

  • @jeraldgooch6438
    @jeraldgooch6438 3 years ago +2

    Christopher,
    1. Please excuse the tardiness of this note. I watched the video on Sunday, but got waylaid by the world, namely Mother's Day.
    2. I can see why you would be a good and effective lecturer at university. You took a very complex topic and condensed it down to a number of relatively easily digestible bits and presented them effectively. In other words, you effectively dumbed down a complex topic without talking down to me. That is hard to do.
    3. So, it appears computers are making advances in three areas
    a. semiconductors go for ever-smaller trace sizes, compressing more and more transistors into smaller and smaller spaces
    b. quantum computing is still being researched, and one sees bits and pieces about advances in this area and some semi-practical devices
    c. and now photonic computers
    d. (no doubt there is someone out there still touting fluidics as the cure for what ails you)
    4. One wonders what programming languages (and operating systems) for photonic and quantum computing devices will look like. Will the interfaces still be electronic with visual and touch devices, or will there be a more direct interface with the brain?
    5. When will we be seeing the SBC version of a photonic computing device from someone like Raspberry Pi? 😊

  • @CnCDune
    @CnCDune 3 years ago +7

    I remember a movie about Time Travel and a photonic intelligence.
    "Time travel. Practical application."
    On that note, there's also a game featuring a von Neumann probe. Grey Goo, a fairly decent real-time strategy game.

    • @AlRoderick
      @AlRoderick 3 years ago

      The von Neumann probe is a different concept than the von Neumann architecture, invented by the same guy (he was a prolific futurist). He posited the idea of self-replicating machinery: macro-scale space probes that would go to other star systems, build copies of themselves, and send the copies forward to still more star systems. This process would repeat until every single star system in the galaxy had been visited. Other later thinkers applied that same concept to nano-scale machinery, which is where we get the grey goo idea that's the source of that game's name. Ironically, a nanomachine would probably not use the von Neumann architecture.

  • @deechvogt1589
    @deechvogt1589 3 years ago +1

    Chris, wow, okay, mind blown. I love taking a look ahead into the future of computing in all of its possible forms. Thanks for sharing this information.

  • @squelchstuff
    @squelchstuff 3 years ago +3

    Brilliantly accessible coverage of the subject, Christopher. Thanks also for your research sources. Photonic Neuromorphic Computing is such a mouthful, so it's only a matter of time before the marketing bods come up with some strange acronym. PNC has a different (and depending on your proclivities/recidivism) meaning in the UK :)

  • @dxutube
    @dxutube 3 years ago +3

    Very inspiring. Good to hear classic architecture will be carrying on for decades yet.

  • @walterig33
    @walterig33 2 years ago

    Your videos truly are great. I thoroughly enjoy them, they are so well structured and informative. Thank you kindly. Greetings from Barcelona.

  • @danieldc8841
    @danieldc8841 3 years ago +1

    This is really well-explained; you did a great job of explaining the key concepts and why they're important. Thanks for making this!

    • @meetoptics
      @meetoptics 2 years ago

      Yes, the MEETOPTICS team completely agrees!

  • @TradersTradingEdge
    @TradersTradingEdge 3 years ago +1

    Great video and starting point.
    Thanks very much!

    • @meetoptics
      @meetoptics 2 years ago

      It is great!

  • @ludwigvanbeethoven61
    @ludwigvanbeethoven61 3 years ago +1

    Wouldn't have expected that topic from you. Very cool. Thank you

  • @blevenzon
    @blevenzon 3 years ago +1

    Brilliant stuff. Thank you so much

  • @aaronmatos5581
    @aaronmatos5581 1 year ago +1

    Thank you, sir! So much golden info.

  • @peterbrandt7911
    @peterbrandt7911 3 years ago +1

    Excellent overview, thank you ever so much.

  • @chriholt
    @chriholt 3 years ago +1

    Chris, you never cease to amaze me with your in-depth research and very clear explanations of new technologies. Thanks as always!

  • @sinjhguddu4974
    @sinjhguddu4974 3 years ago +1

    That was quite a head full! Thank you very much! Never knew about this at all.

  • @Colin_Ames
    @Colin_Ames 3 years ago +3

    That was a little different from our usual Sunday morning fare. Interesting topic that certainly provides food for thought. Thanks Chris.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +5

      I like to throw in a wildcard now and then. Next week we are controlling stuff with a Raspberry Pi Pico! :)

  • @akk5830
    @akk5830 1 year ago +1

    It would be highly appreciated if you could explain the technology of photonic neuromorphics in depth.

  • @ObsidianMercian
    @ObsidianMercian 3 years ago +1

    Thank you Chris for this incredibly informative video. I had already come across neuromorphic computing, but was unaware of photonic neuromorphic computing, so this information is greatly appreciated.

  • @TopHatCentury
    @TopHatCentury 3 years ago +2

    Thank you very much for this amazing video! When I was taking a Cisco networking class a few years ago, I thought about how light-based computing was possible but I had a bit of trouble figuring it out in my mind. A few years later, here is a fantastic video explaining the concept of light-based computing. The times have certainly changed.

  • @Aditya-wb2uo
    @Aditya-wb2uo 3 years ago +2

    The video was great. I always like stuff like this. Keep making great videos like this :)

  • @LarryKapp1
    @LarryKapp1 3 years ago +1

    Thanks for turning a complex subject into something understandable.

  • @shyamasingh9020
    @shyamasingh9020 2 years ago +1

    Thanks very much for explaining complex concepts in a crystal clear concise manner.

    • @meetoptics
      @meetoptics 2 years ago

      We agree!
      From the MEETOPTICS team

  • @MicrobyteAlan
    @MicrobyteAlan 3 years ago +1

    Really interesting, informative and well presented. Great topic. Thanks

  • @TheOrganicartist
    @TheOrganicartist 3 years ago

    excellent video. i've been following photonics & neuromorphic research for a while and the possibilities are exciting.

  • @marceloabreu5749
    @marceloabreu5749 3 years ago

    Great work, Chris! Greetings from Brazil.

  • @joaothomazini
    @joaothomazini 2 years ago +1

    Very interesting and well explained

    • @meetoptics
      @meetoptics 2 years ago

      So good! We think so
      MEETOPTICS team,

  • @Tense
    @Tense 3 years ago

    One of your best videos to date!

  • @jmsiener
    @jmsiener 3 years ago +1

    Who provides sources for their videos? Chris does!! Really, thank you for that. I know there are other channels that do that but I think it raises the bar.

  • @DeepakSingh-qg4ri
    @DeepakSingh-qg4ri 3 years ago +1

    Interesting topic. Great video!

  • @AlbertEspinRodriguez
    @AlbertEspinRodriguez 3 years ago +1

    Super interesting topic, thank you.

  • @tristanwegner
    @tristanwegner 2 years ago

    Thanks for this very clear video and providing links to the papers

    • @meetoptics
      @meetoptics 2 years ago

      It was super clear!
      MEETOPTICS team,

  • @PS_Tube
    @PS_Tube 3 years ago +5

    I'm getting notifications at least 2 minutes later. But once the notification pops up, I arrive here for my weekly dose of EC.

  • @hasanalpaslan8202
    @hasanalpaslan8202 3 years ago +1

    Thanks for the very interesting and informative video. Both easy to understand and high quality in presentation.
    On the other hand, the possibility of such powerful machinery ending up in the wrong hands is scary indeed.

  • @sunilkulkarni4426
    @sunilkulkarni4426 3 years ago +1

    HI CHRISTOPHER !!! Unique topic indeed!!!

  • @ManyHeavens42
    @ManyHeavens42 2 years ago +2

    It's true, you're creating a symphony! Genius.
    You rock!

  • @srtcsb
    @srtcsb 3 years ago +1

    I had to watch this one twice. The first time, I was having brunch and couldn't give this information my full attention. Alas, the gray matter is still reeling. I used to think the physical foundation (the silicon wafer/chip) needed to be changed because it's (basically) reached its maximum efficiency. As it turns out... It's the bus, stupid! Moving the data around is where the bottleneck is. Our computing devices might be going back to the beginning in ten or fifteen years: The Radio Shack Light Computer, The Commodore 64000 Quantalaser, The IBM Laser Jr. (made by Lenovo, of course). Thanks for another great video Chris.

  • @ritmo1130
    @ritmo1130 3 years ago +1

    Great info! Had no idea of this.

  • @stephensu4371
    @stephensu4371 3 years ago

    great work

  • @alandean2
    @alandean2 3 years ago +1

    We already have the nearest computational mechanism to the human brain (and mind). It is called the thermostat, which thinks just like the human brain by having the subjective experience of reporting "Now it is too hot, now it is too cold."

  • @jogon1052
    @jogon1052 3 years ago +3

    What an interesting subject, especially when IBM has released information on the 2 nanometre chip that it has been researching and is able to produce. Just imagine trying to keep up with all of this future technology. Great video Chris, and congrats on being able to keep up with all of the research you must have to do to stay on top of this information.

  • @Mercyfon
    @Mercyfon 3 years ago +1

    amazing video! very interesting, learned a lot here.

    • @meetoptics
      @meetoptics 2 years ago

      So nice and well explained!

  • @OnTheEdgeNow
    @OnTheEdgeNow 3 years ago +1

    Very interesting. Thank you.

  • @ElectroSwingable
    @ElectroSwingable 3 years ago

    Nicely explained

  • @avejst
    @avejst 3 years ago +1

    Great video and subject
    Thanks for sharing :-)

  • @sallienewton7184
    @sallienewton7184 3 years ago +1

    This is one invaluable lesson. Thank you for your clear and concise explanation of photonic neuromorphic computing!

  • @zalves2000
    @zalves2000 3 years ago +1

    Thank you for the extremely interesting video!

    • @meetoptics
      @meetoptics 2 years ago

      Our team at MEETOPTICS agrees!

  • @1000left
    @1000left 3 years ago

    WOW!!! That is a great video thank you!!!! It's a GREAT time to be alive!!!!!

  • @scdesign1565
    @scdesign1565 2 years ago +1

    A character in a novel I am writing has such a brain. I'll let you know how it all works out! Very nice presentation of the concepts!

  • @guilherme5094
    @guilherme5094 3 years ago +1

    Great video.

  • @tjeanneret
    @tjeanneret 3 years ago

    Thank you. Great one, this one!

    • @meetoptics
      @meetoptics 2 years ago

      So good!
      Best regards from MEETOPTICS team,

  • @HeavenlyWarrior
    @HeavenlyWarrior 3 years ago +1

    I miss this kind of subject on your channel. Probably better suited to your other channel?
    Very interesting content!

  • @bobwong8268
    @bobwong8268 3 years ago +2

    👍👍👍👍👍
    Never failed to learn something new from you.
    Thank you Christopher!

  • @omarpasha9855
    @omarpasha9855 1 year ago

    Very, very powerful application!

  • @LiamGunes
    @LiamGunes 3 years ago +1

    I really love these types of videos. I get stuck into my own research, and a heads-up about new things is great. I was thinking that I could sure use the bibliography for this video and voilà, there it is. Thank you for the references. Those will help jumpstart any literature reviews.

  • @gaylancooper497
    @gaylancooper497 3 years ago +1

    While this topic is way above my knowledge, the way you explained the subject matter just blew my mind. I really didn't think I would understand it, but I definitely have an understanding of this type of computing now. Thanks for the great explanation and another great video.

  • @AMDRADEONRUBY
    @AMDRADEONRUBY 3 years ago

    Nice, a new video! As always I'm here. Hope you had a nice week, and thanks for the quality content.

  • @slimplynth
    @slimplynth 3 years ago +28

    I always click like before it's even started :)

  • @adrianasilveira4561
    @adrianasilveira4561 2 years ago

    Great explanation, thank you.

    • @meetoptics
      @meetoptics 2 years ago

      Yeah! It is, definitely

  • @jhonbus
    @jhonbus 3 years ago +20

    I can't be the only one who joins in the recitation of Chris's intro and outro for every video like I'm chanting some sort of religious creed?

    • @markwhidby5148
      @markwhidby5148 3 years ago +2

      Certainly not!

    • @russwild8282
      @russwild8282 3 years ago

      Me too :)

    • @jdelectro
      @jdelectro 3 years ago

      Let's go and take... a closer look... at you 😜

  • @obiwanjacobi
    @obiwanjacobi 3 years ago +1

    Now that was really interesting. Thank you!

    • @meetoptics
      @meetoptics 2 years ago

      So good, definitely

  • @jeus2884
    @jeus2884 3 years ago +4

    It is about time. When I first started studying this type of technology 25 years ago, before it even became available, and thought of something like this, I never thought that I would actually live long enough to see other people making the technology a reality.

  • @LoporianIndustries
    @LoporianIndustries 2 years ago +2

    I imagine Photonic Quantum Computing as Neural Net Architecture, in the progression toward the complete quantification of the human being like a large set of algorithmic logic, as a fluid metamaterial of nanotechnology, is going to lead to the eventual development of the Physical Spirit Being. You are a Human. You are a Machine. You are Fluid Nanotech Photonic Quantum Neural Human, and you are Human Energy that can generate yourself. You are Physical. You are Virtual. You are Living Data whose Sentience translates that Data into Human Information.

  • @cirotron
    @cirotron 3 years ago

    VERY INTERESTING VIDEO, thanks!

    • @meetoptics
      @meetoptics 2 years ago

      Yeah, sure!!

  • @rewindcat7927
    @rewindcat7927 3 years ago +1

    Very interesting thank you!

  • @marksterling8286
    @marksterling8286 3 years ago +2

    Great video Chris. It made me think about HPE's The Machine, a memory-centred architecture with the CPUs around the outside. Also the work Steve Furber is doing with SpiNNaker (a parallel computer built with ARM chips). I think I need to catch up on all the research as I have not had a good look for about 3 years. Maybe my prediction of memristors being the next breakthrough will be partly correct.

  • @largepimping
    @largepimping 3 years ago +1

    Very different from the typical content, but fascinating!

  • @lorderectus1849
    @lorderectus1849 3 years ago +1

    Welcome to another video from Chris!

  • @saturno_tv
    @saturno_tv 3 years ago +3

    Good morning Mr. Barnatt. Here finally for the 10th 🥇 gold. Always supporting. The best tech stuff on the internet is here. Many thanks. First.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +1

      Thanks for your support -- 10th Gold medal awarded! :)

    • @williamhorton9763
      @williamhorton9763 3 years ago +1

      @@ExplainingComputers Does Saturno live next door to you?

  • @meetoptics
    @meetoptics 2 years ago

    Congratulations on introducing this topic so well on YouTube. This platform lets us spread all we know about the field, and at MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward.

  • @edugio
    @edugio 3 years ago

    This is one of the best tech videos I've ever seen.

  • @abba4468
    @abba4468 3 years ago +1

    Excellent. Thank you very much.

    • @meetoptics
      @meetoptics 2 years ago

      It is great!

  • @greggregory8311
    @greggregory8311 2 years ago

    F----- brilliant. Any start-ups yet anywhere in the world? Keep up the good work/research. Thanks,
    Greg

  • @axlfire83
    @axlfire83 3 years ago +1

    Very interesting my friend, thanks for the information

  • @joaocarlosalmeida7325
    @joaocarlosalmeida7325 2 years ago +1

    Awesome video! Very inspiring for a physics student such as myself; I hope it inspires more people in the technology sector! Great references as well!

    • @meetoptics
      @meetoptics 2 years ago

      It is so good! It has definitely inspired MEETOPTICS!

  • @UnniG
    @UnniG 2 years ago

    🌹⭐️Amazing Information⭐️Thank you🌹

  • @TARS..
    @TARS.. 2 years ago +1

    When you mentioned wavelength multiplexing in photonic hardware my mind blew a little.

  • @savirien4266
    @savirien4266 3 years ago +1

    Some of our shortest wavelength UV lasers are in the 270nm range. That’s quite large compared to current transistor sizes. There’s a reason electron microscopes can image smaller objects than optical microscopes.

  • @krishnaaditya2086
    @krishnaaditya2086 3 years ago +1

    Fantastic! Thanks

    • @meetoptics
      @meetoptics 2 years ago

      It is, definitely!

  • @abzhuofficial
    @abzhuofficial 3 years ago +1

    ExplainingTheFutureInComputers? 🤔😜
    Very educational video right here, Christopher. Great work as always. 🙂

  • @prashanthb6521
    @prashanthb6521 3 years ago

    Wonderful research work sir, thank you for this presentation. I feel like neuromorphic computing might be the way to go for large data centers running intense cloud applications.

  • @alexhudspeth1213
    @alexhudspeth1213 3 years ago

    This video is "required watching' for the Human Resistance against Skynet. Wonderful video, Chris; thanks!

  • @SJPretorius000
    @SJPretorius000 3 years ago +1

    Oh yes, my fav channel, hey Chris!

  • @lawrenceallwright7041
    @lawrenceallwright7041 3 years ago +12

    I'm just getting a sneaking feeling that my old PC with its built-in 5.25" floppy drive might be getting a bit out of date.

  • @sbc_tinkerer
    @sbc_tinkerer 3 years ago +2

    Awesome deviation from the "normal" videos. Thank you sir for the glimpse into the future. Hope I live to see it. Must get those papers cited! 12 likes!!

  • @megatronDelaMusa
    @megatronDelaMusa 1 year ago +1

    The Kamasutra has evolved beautifully. A neuromorphic internet infrastructure would set the world ablaze. Our ability to build synthetic artificial synapses and dendritic branching would give a whole new meaning to the future. AGI has arrived on our blind side.

  • @benh9350
    @benh9350 3 years ago

    The cutting edge of technology and future casting is always interesting! The story of how and when we get there is just as interesting. My sincere hope is that more good than bad comes from development. I'm sure the vast majority of happenings will be very positive. There is a nice kind of security in the time delay due to hardware or processing times, but having things move quickly and fluidly is really nice too. So I'm looking forward to seeing what comes along. I'm really excited about using entanglement in communication/information tech. I kind of wonder how far we will go... I guess only time will tell.