Beating Moore's Law: This photonic computer is 10X faster than NVIDIA GPUs using 90% less energy

  • Published on 22 May 2024
  • Moore's Law is dead, right? Not if we can get working photonic computers.
    Lightmatter is building a photonic computer for the biggest growth area in computing right now, and according to CEO Nick Harris, it can be ordered now and will ship at the end of this year. It's already much faster than traditional electronic computers at neural nets, machine learning for language processing, and AI for self-driving cars.
    It's the world's first general-purpose photonic AI accelerator, and with light multiplexing -- using up to 64 different colors of light simultaneously -- there's a long path of speed improvements ahead.
    Links:
    TechFirst transcripts: johnkoetsier.com/category/tec...
    Forbes columns: www.forbes.com/sites/johnkoet...
    Subscribe to the podcast: anchor.fm/techfirst
    Keep in touch: / johnkoetsier
  • Science & Technology

Comments • 2.7K

  • @WeeHee
    @WeeHee 2 ปีที่แล้ว +5672

    Now we can literally say that RGB improves performance.

  • @jackwright7014
    @jackwright7014 2 ปีที่แล้ว +2603

    I've learned with a lot of startups that it is mostly talk. I'll wait until they've actually got something.

    • @lanrebloom7030
      @lanrebloom7030 2 ปีที่แล้ว +43

      I mean this has nothing to do with that. It's the technology that should be your concern here.

    • @KanedaSyndrome
      @KanedaSyndrome 2 ปีที่แล้ว +444

      Agreed. He says that you can "actually buy it" and yet when asked for the price, there is no answer, meaning that you can't actually buy it when it comes to it.

    • @owenb6499
      @owenb6499 2 ปีที่แล้ว +158

      @@KanedaSyndrome he even said it’s not gonna be able to run windows, definitely going to be a very specialized research/industry chip.

    • @owenb6499
      @owenb6499 2 ปีที่แล้ว +72

      @XxTreXXockxX good one Einstein, a consumer gpu is set up to run with Windows. You gotta make your own specialized version of Linux for a supercomputer, do the math genius.

    • @mattizzle81
      @mattizzle81 2 ปีที่แล้ว +205

      I concur. I've seen so much vaporware in my life it's not even funny. If I only had a dollar for every breakthrough that is hyped up and never sees the light of day.
      Then on the flip side things that I've never heard about suddenly revolutionize everything.

  • @Justin_80
    @Justin_80 2 ปีที่แล้ว +840

    Just when you thought regular GPUs were getting too expensive.

    • @pszyk.o.9074
      @pszyk.o.9074 2 ปีที่แล้ว +23

      So now they're going to be replaced

    • @thehotdogman9317
      @thehotdogman9317 2 ปีที่แล้ว +15

      Nvidia better start taking notice.

    • @ashflame6888
      @ashflame6888 2 ปีที่แล้ว +51

      @@thehotdogman9317 LOL you people are funny..... This is still 10+ years out.

    • @SpenserRoger
      @SpenserRoger 2 ปีที่แล้ว +13

      @@ashflame6888 who says

    • @ashflame6888
      @ashflame6888 2 ปีที่แล้ว +20

      @@SpenserRoger Intel, AMD and anybody else who knocks on the door of this company with a billion dollars.... OR the simple fact that you are dumb enough to think tech will EVER actually move that fast.... Yeah let's go from 8 core 5 GHz processors to 40 core 20 GHz lightspeed processors overnight! SURE THAT WILL HAPPEN!!! Turn your brain on before you open your mouth again.... I wouldn't be surprised if this NEVER happens and I also wouldn't be surprised that if it does happen it's still 20 years out, or more. It's taken 20 years just to go from single core 1 GHz to 8 core 5 GHz..... and you think we're just gonna quadruple it overnight? LOL yeah....

  • @anonanon5147
    @anonanon5147 2 ปีที่แล้ว +496

    12:49 Just a small correction: Moore's law is not about the doubling of performance, it is about the doubling of the number of transistors.
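    A quick back-of-the-envelope illustration of that distinction, as a rough Python sketch (the 2,300-transistor Intel 4004 baseline and the roughly two-year doubling period are standard reference figures, not numbers from the video):

```python
# Moore's law in its original sense: transistor count per chip doubles
# roughly every two years. It says nothing directly about performance.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011, 2021):
    print(y, f"{transistors(y):,.0f}")
# 1971 -> 2,300 ... 2021 -> ~77 billion: the right order of magnitude for the
# biggest chips, even though clock speed and per-core performance stopped
# tracking that curve long ago.
```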

    • @sidharthghoshal
      @sidharthghoshal 2 ปีที่แล้ว +55

      That used to mean doubling of performance: at one point, a smaller transistor meant less power AND faster speeds AND more storage. Nowadays the only part of that which remains true is more storage.

    • @robertfarquhar2780
      @robertfarquhar2780 2 ปีที่แล้ว +18

      @@sidharthghoshal Another reason chips get so hot is because they run at higher frequencies now. You can overclock Intel processors up to like 5.4 GHz and they're stable.

    • @sierra5065
      @sierra5065 2 ปีที่แล้ว +9

      @@robertfarquhar2780 The 11900k can hit 5.6 GHz with a regular water cooler. Granted, it draws double the power compared to Ryzen, but it's still cool.

    • @Leo99929
      @Leo99929 2 ปีที่แล้ว +26

      Also, CPU's are roughly 100x more efficient in the last 15 years, and 22x more powerful. He is just straight up wrong about this imaginary stagnation. Though we are approaching a physical limit in transistor minimum size, so processors will start getting bigger like AMD Epyc range, or we won't be able to fit more transistors per unit area using electricity due to quantum tunneling issues when the tech reaches the

    • @edwardcoyle5425
      @edwardcoyle5425 2 ปีที่แล้ว +2

      The concept from Moore's law has expanded to include processing power and then overall performance.

  • @Dia1Up
    @Dia1Up 2 ปีที่แล้ว +2125

    I feel like we missed an important question: But can it run Crysis?

    • @DivineWerezwolf
      @DivineWerezwolf 2 ปีที่แล้ว +122

      That's outdated thinking! It's all about cryptocurrency now. Can it do the crypto mining?

    • @spr_
      @spr_ 2 ปีที่แล้ว +180

      @@DivineWerezwolf NO I DONT THINK SO

    • @bigrat6030
      @bigrat6030 2 ปีที่แล้ว +145

      @@DivineWerezwolf yeah no one cares

    • @kikiriski6017
      @kikiriski6017 2 ปีที่แล้ว +74

      @@DivineWerezwolf
      ABSOLUTELY NO

    • @skmetal7
      @skmetal7 2 ปีที่แล้ว +104

      It can run crysis in crysis, while compiling the code for crysis.

  • @vkoskiv
    @vkoskiv 2 ปีที่แล้ว +660

    'General purpose AI accelerator'
    Is that what we're calling matrix multiplication now?
    Still a big deal of course, but I find marketing speak funny.
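    For readers wondering why "AI accelerator" and "matrix multiplication" get used almost interchangeably in these comments: the bulk of neural-network inference really is matrix multiplies plus a cheap nonlinearity. A minimal NumPy sketch, illustrative only and not specific to Lightmatter's hardware:

```python
import numpy as np

# One fully connected layer: y = ReLU(W @ x + b).
# The W @ x part is the operation an optical matrix-multiply engine would
# accelerate; the element-wise rest is negligible by comparison.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 784))   # layer weights
b = rng.standard_normal(512)          # bias
x = rng.standard_normal(784)          # input vector, e.g. a flattened image

y = np.maximum(W @ x + b, 0.0)
print(y.shape)                        # (512,)
```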

    • @Apocalymon
      @Apocalymon 2 ปีที่แล้ว +54

      That's how you market to C-suite folks

    • @aleksanteri_r
      @aleksanteri_r 2 ปีที่แล้ว +94

      Matrix is so '90s. AI must be included, that's a standard, but when you add "general", it hints at AGI and the investors start throwing money at you.

    • @3moluvr
      @3moluvr 2 ปีที่แล้ว +3

      @@aleksanteri_r LOL

    • @causalitymastery269
      @causalitymastery269 2 ปีที่แล้ว +15

      @@aleksanteri_r ridiculously on-point.

    • @v1Broadcaster
      @v1Broadcaster 2 ปีที่แล้ว +3

      Nah it’s been like that for a while lol

  • @leoshaju4259
    @leoshaju4259 2 ปีที่แล้ว +168

    John: Interesting
    Ceo lightmatter: yeah so

    • @wackyroo
      @wackyroo 2 ปีที่แล้ว +1

      Lightmatter CEO: *rapidly blinking* yeah so

    • @DROOBYDOO
      @DROOBYDOO 2 ปีที่แล้ว +6

      Yeah. So, it literally changes the game and I'm totally stoked. OMG, it's fire bro.

    • @johnkoetsier
      @johnkoetsier  2 ปีที่แล้ว +6

      :-)

  • @brigham1465
    @brigham1465 2 ปีที่แล้ว +38

    Asks literally any question,
    Ceo: "yeah, so"

    • @procactus9109
      @procactus9109 2 ปีที่แล้ว +3

      Don't forget the 'interesting' reply to an ambiguous answer.

  • @johnsavard7583
    @johnsavard7583 2 ปีที่แล้ว +856

    My understanding about photonic computers is that they solve the low-k dielectric problem. Light beams that pass by each other side by side don't cause capacitance problems slowing each other down.

    • @N00B283
      @N00B283 2 ปีที่แล้ว +23

      Never considered that

    • @RealNovgorod
      @RealNovgorod 2 ปีที่แล้ว +232

      That's not really an issue, since photonic devices are still orders of magnitude larger than modern transistors and getting anywhere close to the few-nm range can't really be done at optical wavelengths, it requires DUV or even XUV light and no one has figured out the materials yet for photonics at these wavelengths. At the moment, the only advantage is the low energy loss for linear operations as well as potentially much higher clock speeds than silicon (but then again you're speed-limited by silicon I/O and memory).
      The real breakthrough will come with actual photonic switching (that nonlinear stuff), but right now this requires phase-controlled few-cycle laser pulses (table-sized laser source) and probably plasmonic nanostructures to get enough field enhancement for nonlinear switching. If that kind of stuff can ever be integrated on a chip, you've got yourself a few-100-THz transistor...

    • @dvl973
      @dvl973 2 ปีที่แล้ว +40

      @@RealNovgorod the problem of computing nowadays isn't really in the computing itself. It's in the communication on the chip. If we're going to see any advancements, I think it's going to be light communication inside our computers.

    • @mba4677
      @mba4677 2 ปีที่แล้ว +16

      @@RealNovgorod awesome comment, bro. how do you people know so much about this stuff lmao is it your work?

    • @RealNovgorod
      @RealNovgorod 2 ปีที่แล้ว +38

      @@mba4677 When you're juggling electrons around with a laser, you might pick up a bit or two about what it all could be useful for "in future". Don't get your hopes up too high though - actual photonic computers (not even the quantum flavor) are as far away as fusion energy, maybe even further.

  • @faustin289
    @faustin289 2 ปีที่แล้ว +460

    "linear algebra"; the math swiss army knife of computer science

    • @muzzletov
      @muzzletov 2 ปีที่แล้ว +4

      no? a swith army knife has mutiple tools built-in, linear algebra is a tool you can use in multiple ways. thats a completely different analogy. xD 204 updvotes know what there are voting for.

    • @creator-link
      @creator-link 2 ปีที่แล้ว +11

      muzzletov ?????

    • @mr.blackcat6223
      @mr.blackcat6223 2 ปีที่แล้ว +12

      @@muzzletov Learn how to spell please.

    • @davidmangus
      @davidmangus 2 ปีที่แล้ว +9

      Linear algebra is necessary as it guarantees linear separability and thus can be parallelized. Everything uses it because it has to in order to scale.
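      To make the parallelism point concrete: a matrix product splits into independent row blocks that can be computed on separate workers with no coordination between them. A small illustrative sketch in plain NumPy (an editorial example, not anything from the video):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(1)
A = rng.standard_normal((1024, 256))
B = rng.standard_normal((256, 128))

def block_product(rows):
    # Each row block of A @ B depends only on its own rows of A.
    return A[rows] @ B

blocks = np.array_split(np.arange(A.shape[0]), 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    C = np.vstack(list(pool.map(block_product, blocks)))

assert np.allclose(C, A @ B)   # identical to the single-shot product
```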

    • @mutestingray
      @mutestingray 2 ปีที่แล้ว +1

      @@muzzletov @Faustin yeah that’s right you’re wrong Faustin smh how embarrassing you should delete your comment maybe even your account

  • @spacefacts1681
    @spacefacts1681 2 ปีที่แล้ว +233

    This definitely feels like news I'd expect to read about in the year 2021

    • @thefirstsin
      @thefirstsin 2 ปีที่แล้ว +2

      Ah yes, it's getting closer and closer: light computers exceeding our devices thousands of times over.

    • @thefirstsin
      @thefirstsin 2 ปีที่แล้ว +3

      Holy he just said 20ghz.. Already at this year.. Graphite is 400+.

    • @oricooper9525
      @oricooper9525 2 ปีที่แล้ว

      @@thefirstsin graphene

    • @pingpong1727
      @pingpong1727 9 หลายเดือนก่อน

      @@thefirstsin where did you get 400 from?

  • @NanClaymore
    @NanClaymore 2 ปีที่แล้ว +392

    I'm worried that photonic logic might only output the binary state "yeah so"

    • @dissidentjaws
      @dissidentjaws 2 ปีที่แล้ว +15

      You read my mind so yeah lol

    • @RENO_K
      @RENO_K 2 ปีที่แล้ว +14

      Probably runs binary
      If it runs using wavelengths as signals, it'd just break everything 😂
      Transmitting 2 bits at the same time on the same cable is insane. 1 running UV 1 running IR.

    • @demonfedor3748
      @demonfedor3748 2 ปีที่แล้ว +15

      @@RENO_K UV doesn't look very power efficient to me. That sort of device would use tons of power. Using polarized FIR at multiple frequencies makes more sense here I believe. I wouldn't go as far as microwaves,but anything is possible. I am pretty sure they can do 128-bit per clock on the same cable with 64 colors and 2 different polarizations, however I might be wrong.
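      The arithmetic behind that guess, for what it's worth (the 64 wavelengths are the only figure from the interview; the second polarization and the one-bit-per-channel encoding are the commenter's speculation):

```python
# Independent physical channels multiply, so capacity per clock is just a product.
wavelengths = 64        # "colors" mentioned in the interview
polarizations = 2       # speculative extra degree of freedom
bits_per_channel = 1    # simplest on/off encoding
print(wavelengths * polarizations * bits_per_channel)   # 128 bits per clock, if
                                                        # the channels really stay separable
```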

    • @NeoMK
      @NeoMK 2 ปีที่แล้ว +1

      The 64 colors can basically be tied to 64PSK which is used in SATCOM, however in SATCOM there is not only atmospheric attenuation but every time you double PSK your power decreases by 3dB but it wouldn't in an optical chip so it's kinda cool. Interesting... ok so....

    • @krishanSharma.69.69f
      @krishanSharma.69.69f 2 ปีที่แล้ว +1

      Wow... are you an A.I. cause' you just found a pattern. That is what A.I.s are for.

  • @ioratv
    @ioratv 2 ปีที่แล้ว +553

    John: * asks question *
    Lightmatter CEO: Yeah, so

    • @theonewhobullies
      @theonewhobullies 2 ปีที่แล้ว +45

      INTERESTING

    • @joshuagavaghan224
      @joshuagavaghan224 2 ปีที่แล้ว +9

      This is just how nerds talk man. 😅 I noticed this too but because I respond to everything with "yeah, ....."

    • @gnaarW
      @gnaarW 2 ปีที่แล้ว +7

      And then he mostly dodged the questions ugggg

    • @Joshlul
      @Joshlul 2 ปีที่แล้ว +10

      this is the mark zuckerberg school of ceo speak

    • @tommylyeah
      @tommylyeah 2 ปีที่แล้ว +2

      @Steve Fox
      🍻🥴

  • @jimmyhackers8980
    @jimmyhackers8980 2 ปีที่แล้ว +734

    im waiting for a shadow based processor.

    • @ThePC007
      @ThePC007 2 ปีที่แล้ว +93

      There's a video game on Steam that is funnily enough also called "Lightmatter" and it's about a scientific experiment (that was performed by a company named "Lightmatter technologies") that has gone so wrong that the shadows have begun devouring every living thing they touch.
      Surely those shadows could be used as a computational resource somehow.

    • @ontoverse
      @ontoverse 2 ปีที่แล้ว +27

      That's what interference based approaches to optical processing do; there's even some experiments searchable here on youtube regarding that. Essentially you diffract the light on input and use interference at a fixed focal point as logic operations (since interference is a quantized phenomenon it works better for logic). Not sure how that miniaturizes since you'd need very high frequency light to make the diffractors very small. Unfortunately that scales really poorly, since pretty soon you're in the x-ray diffraction realm!

    • @guai9632
      @guai9632 2 ปีที่แล้ว +12

      I worship His Divine Shadow

    • @3nertia
      @3nertia 2 ปีที่แล้ว +4

      @@ThePC007 Sounds like antimatter ...

    • @Synthetiks
      @Synthetiks 2 ปีที่แล้ว +9

      I guess you watched Vsauce video about shadow(dark) faster than light

  • @supersaiyancommenter
    @supersaiyancommenter 2 ปีที่แล้ว +60

    So instead of thermal throttling, we’ll have darkness throttling?

  • @jamesjensen5000
    @jamesjensen5000 2 ปีที่แล้ว +2

    I became familiar with light transfer and photonics channeling 25 years ago through work done by a company called Silicon Graphics ... it has progressed slowly since then.

  • @ripper82
    @ripper82 2 ปีที่แล้ว +17

    22 questions asked
    8 questions were ‘yes/no’
    20 answered with “Yeah…”

    • @johnkoetsier
      @johnkoetsier  2 ปีที่แล้ว +2

      :-) how much time do you have? btw, chatting with him again in a couple weeks and will ask all the top questions from these comments

  • @Sk4lli
    @Sk4lli 2 ปีที่แล้ว +32

    Very interesting. And kudos to the CEO for not selling it as a thing that will change everything, as is often done with new technologies. He clearly knows the strengths and isn't afraid to admit the current shortcomings.
    It's an additional thing to put in computers to speed up specific tasks, the ones that will become more important in the future.

  • @ezg8448
    @ezg8448 2 ปีที่แล้ว +160

    "Will it stay on the cloud?" About 10 mins in this question was asked.
    That means no one can buy a physical product, but only rent processing time from this company.
    That's a very big red flag in my book!

    • @SprDrumio64
      @SprDrumio64 2 ปีที่แล้ว +24

      Yep, i either completely own my tech or i dont buy it at all

    • @-wenschow907
      @-wenschow907 2 ปีที่แล้ว +33

      @@SprDrumio64 this really doesn't sound like consumer tech either way even if it can be bought and owned. Looks like they're just casting their PR net far and wide for another funding round.

    • @oblivion_2852
      @oblivion_2852 2 ปีที่แล้ว +8

      You don't need this amount of compute personally. The idea is to use this in the cloud to train networks (that can perform more efficient algorithms) and then use the network produced on traditional chips. Something like nvidia resolution scaling could be optimized even more by AI. Imagine the future where we have AIs on each end of encoding and decoding video streams giving 4k resolution at 1kb/s information streams.

    • @raptagames
      @raptagames 2 ปีที่แล้ว +12

      "Stay in the cloud" doesn't mean they are renting out the computational power. It means their customer is Google and not a private person.

    • @aarongalicia3760
      @aarongalicia3760 2 ปีที่แล้ว +5

      “We are not a general purpose computer. We are a general purpose AI accelerator. I am not going to run Windows for you”
      They talked about releasing hardware by the end of the year but it’s not gonna be available to general consumers

  • @wolfram77
    @wolfram77 2 ปีที่แล้ว +107

    I was waiting to hear some technical details, but this is a surface-level discussion 😭

    • @prakharmishra3000
      @prakharmishra3000 2 ปีที่แล้ว +14

      He spoke a lot of bs. Didn't even show a second picture of the cpu let alone a prototype. When asked about price he changed the topic.

    • @savifrank6323
      @savifrank6323 2 ปีที่แล้ว +2

      Host lost me when he said modern computers have hundreds of gpus

    • @Fernando_Ribeiro
      @Fernando_Ribeiro 2 ปีที่แล้ว +9

      @@savifrank6323 well technically a gpu is composed of hundreds of cores

    • @daath144occultistmaster5
      @daath144occultistmaster5 2 ปีที่แล้ว

      Ty for rhetoric

    • @Real_MisterSir
      @Real_MisterSir 2 ปีที่แล้ว +13

      @@savifrank6323 he said modern supercomputers - which may very well have hundreds of gpus in a network working on the same tasks - basically like renderfarming but on a single massive computer rather than tons of separate computers.

  • @HA7DN
    @HA7DN 2 ปีที่แล้ว +86

    One question I expected was "how could you do this before tech giants?"

    • @thefirstsin
      @thefirstsin 2 ปีที่แล้ว +3

      Idk man

    • @xponen
      @xponen 2 ปีที่แล้ว +31

      IBM and Intel also have this. My guess is they are like a hen sitting on eggs: they have something that works, but it doesn't work well and they don't know how to sell it.

    • @tadficuscactus
      @tadficuscactus 2 ปีที่แล้ว +11

      They will probably buy him out.

    • @TheZenytram
      @TheZenytram 2 ปีที่แล้ว +30

      Remember Kodak: they didn't care about digital cameras.
      Remember Tesla: they made a functional electric car, and now every car company is running behind to deliver their own, when before it was just concept cars they didn't care about at all.
      Same with the beginning of Apple versus IBM at that time.
      Those giants have their market and product; they can only "innovate" around their golden goose.
      They can't compete against themselves with a new golden egg.

    • @xponen
      @xponen 2 ปีที่แล้ว +12

      @@TheZenytram true, Kodak even had a patent for the digital camera, but they didn't innovate on purpose. Their "golden egg" was the film camera.

  • @SwedishDeathLlama
    @SwedishDeathLlama 2 ปีที่แล้ว +17

    At CES, probably 10 years ago, NVidia had a small demo of some of their own research into optical computing. Obviously no sales at that point, but speaking to the engineer about the possibilities was one of the most interesting things ever. So happy to see this tech coming to market!

    • @PravinDahal
      @PravinDahal 2 ปีที่แล้ว +5

      It's coming to the market like Nvidia's was coming to the market 10 years ago. Don't hold your breath.

  • @Alm8hoorOW
    @Alm8hoorOW 2 ปีที่แล้ว +174

    The fact that this firm hasn't been acquired by big names in the industry speaks volumes about how much experts trust the technology.

    • @trjozsef
      @trjozsef 2 ปีที่แล้ว +11

      The CEO called electron leakage* "quantum tunneling," I wouldn't trust whatever he's selling.
      * From what I gather it is electromagnetic induction, but I'm not an expert either.

    • @S3thc0n
      @S3thc0n 2 ปีที่แล้ว +137

      @@trjozsef At the scale of current-gen semiconductors, the method of leakage is quantum tunneling.🤦‍♀️
      An expert you are truly not.

    • @suagy7492
      @suagy7492 2 ปีที่แล้ว +45

      @@trjozsef Get your facts straight, the CEO didn't spew nonsense

    • @SandstormGT
      @SandstormGT 2 ปีที่แล้ว +7

      Not everyone is interested in selling out to big tech...

    • @Restrictted
      @Restrictted 2 ปีที่แล้ว +19

      Companies don't have to worry about people "buying" anything. I mean look at the iPhone. People buy it and it's years behind Android.

  • @JuanPretorius
    @JuanPretorius 2 ปีที่แล้ว +24

    Your questions are top notch, they're leading and for the advantage of the audience. Well done

  • @Baleur
    @Baleur 2 ปีที่แล้ว +321

    He needs to be really careful about answering Every question with "Yeah, so.."
    Or he's gonna end up promising things he never intended.

    • @jyvben1520
      @jyvben1520 2 ปีที่แล้ว +12

      and it is a bit irritating, hope his business meetings are better ...

    • @thetruthexperiment
      @thetruthexperiment 2 ปีที่แล้ว +22

      Yeah, so, that’s like how all young CEO’s talk and it’s scary.

    • @thetruthexperiment
      @thetruthexperiment 2 ปีที่แล้ว +16

      @@0s0sXD saying. “Yeah, so...” is a nervous, space filler such as “like” and “um”. It exemplifies the rapid degradation of civilization itself. 😝

    • @therealb888
      @therealb888 2 ปีที่แล้ว

      @UC3-EKrIcg6VtSjtEypjz64A he's not careful , thats the point genius

    • @Hickeroar
      @Hickeroar 2 ปีที่แล้ว +10

      I think the questions are tailored to be "yes questions." I expect they were predetermined. He's perfectly up front about what the limitations are, and isn't "over" promising anything. It's a limited featureset for specific jobs.

  • @ongeri
    @ongeri 2 ปีที่แล้ว +646

    Nice to see smart people working on some of my random showertime brainwaves, make me proud guys.

    • @blendpinexus1416
      @blendpinexus1416 2 ปีที่แล้ว +5

      dude SAME!

    • @cedriceric9730
      @cedriceric9730 2 ปีที่แล้ว

      😂😂😂😂😂

    • @cedriceric9730
      @cedriceric9730 2 ปีที่แล้ว +33

      Indeed , they would never have come this far without your input

    • @ongeri
      @ongeri 2 ปีที่แล้ว +13

      @@cedriceric9730that's just unthinkable! 😂

    • @WandererHermit
      @WandererHermit 2 ปีที่แล้ว

      yep

  • @richardmunsch8127
    @richardmunsch8127 2 ปีที่แล้ว +42

    Just another cold fusion until something actually reaches the market and proves not to be a load of natural organic fertilizer.

  • @EdgyShooter
    @EdgyShooter 2 ปีที่แล้ว +20

    Glad I'm doing a PhD in Photonics!

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว +2

      Congrats!! Keep going 💪

  • @MassageWithKlay
    @MassageWithKlay 2 ปีที่แล้ว +53

    "Computers with friken lazer beams" .... looks like someone has been getting all Dr Evil on us :)
    Besides that, the chip set sounds promising, and should be really good, now just have to see how the future looks for them.

  • @madras61
    @madras61 3 ปีที่แล้ว +20

    Great interview. The focus of the company is more on deep learning for cloud computing in data centers for those mega houses like AWS, Azure and Google for now.
    Once those are in use in their data centers, the experience will help Lightmatter scale.

  • @MrCarburator
    @MrCarburator 2 ปีที่แล้ว

    Thanks John. I think your interview is right there at the forefront. Excellent! PS: I'm not 100% sure, but if it's a large diaphragm mic you are using, it may be pointed the wrong way. You may want to check its manual and get better sound free of charge, so to speak. Cheers!

  • @mr_viss_
    @mr_viss_ 2 ปีที่แล้ว +121

    I guess this is the next company that Nvidia is going to buy.

    • @flynifty3951
      @flynifty3951 2 ปีที่แล้ว +9

      Or AMD

    • @mr_viss_
      @mr_viss_ 2 ปีที่แล้ว +16

      @@flynifty3951 I don't know about amd but Intel might try and get a hand on the gpu market...

    • @XJoukov
      @XJoukov 2 ปีที่แล้ว +3

      they should have bought this instead of ARM really --'...

    • @matrixace_8903
      @matrixace_8903 2 ปีที่แล้ว +4

      And shut it down ⬇️

    • @Leo99929
      @Leo99929 2 ปีที่แล้ว +10

      Nvidia, Intel, and AMD, are too smart to fall for this bullshit. 30 seconds on Google shows the guy has no idea what he's talking about. His stats are regularly out from reality by factors of 100 or more.

  • @BrutusPalmeira
    @BrutusPalmeira 2 ปีที่แล้ว +28

    A great first step would be to see one of these working even at 1990’s clock speeds. This could be another Theranos…

    • @AntonySimkin
      @AntonySimkin 2 ปีที่แล้ว

      I hope it works at least like the beast gpu's now but with the promised 1/10th of the wattage needed so I can build an android to be my waifu lol

    • @esuil
      @esuil 2 ปีที่แล้ว +4

      Yup, I could not find any information about this from any real people who have their solution. There is literally 0 information or demos if you try searching for it.
      Everything looks great on paper. But the fact that I can't find any actual hardware demos anywhere is concerning.

    • @justinthejerkoff
      @justinthejerkoff 2 ปีที่แล้ว +1

      @@esuil that's because it doesn't exist and it never will. The idea of "photonic" computing, or more accurately but much less cool sounding, optical computing has been around since like the 70s or 80s. And yet nobody has ever produced a chip that does anything near what's promised.
      It's all hot air.

    • @franklee663
      @franklee663 ปีที่แล้ว

      light travels at.... err speed of light, almost instantaneous. The only latency will be probably the DACs and ADCs for interfacing. Electrons also travel at the speed of light, but then the difference is how arithmetic is done. Just for addition of digital signals, you need to go through multiple steps (each step driven by a internal clock signal). But for analog, you just join two wires together and they just add up in one step. Yes, you can also do it with analog computing, but then you have to deal with external influence from EMF from other components and stuff, but light once sealed, you can't add or subtract signals from external sources.
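      The "multiple steps" mentioned above are easiest to see in a ripple-carry adder, where each bit position has to wait for the carry from the position below it, whereas an analog (or optical-intensity) sum has no such chain. A toy sketch of the digital side, for illustration only:

```python
def ripple_carry_add(a_bits, b_bits):
    """Add two little-endian bit lists; every step consumes the previous carry."""
    out, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)                 # sum bit for this position
        carry = (a & b) | (carry & (a ^ b))       # carry into the next position
    out.append(carry)
    return out

# 5 (101) + 3 (011), bits given least-significant first:
print(ripple_carry_add([1, 0, 1], [1, 1, 0]))     # [0, 0, 0, 1] -> 8
```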

  • @humanbass
    @humanbass 2 ปีที่แล้ว +368

    If it lives up to his claims, he will be richer than Bezos lol.

    • @sinephase
      @sinephase 2 ปีที่แล้ว +39

      doubt it, I don't think even nvidia+arm is that rich

    • @KP3droflxp
      @KP3droflxp 2 ปีที่แล้ว +64

      @@sinephase the dude would basically have a monopoly on the fastest GPU type processors

    • @sinephase
      @sinephase 2 ปีที่แล้ว +47

      @@KP3droflxp OK but I don't think even AMD combined with Nvidia is even that rich though LOL

    • @MyContext
      @MyContext 2 ปีที่แล้ว +6

      Unless he sells out to Bezos or someone else, which is common...

    • @owenb6499
      @owenb6499 2 ปีที่แล้ว +11

      It seems like right now, its not consumer at all. Hopefully eventually we can have all components be photonic.

  • @hollowmajin5146
    @hollowmajin5146 2 ปีที่แล้ว +50

    That's the funny thing about the "laws" of nature: they aren't actually laws, they're simply descriptions of nature as we understand it. The laws don't change, only our perception of how the universe works.

    • @communist-hippie
      @communist-hippie 2 ปีที่แล้ว +2

      Smartly said, never thought about it that way

    • @raptagames
      @raptagames 2 ปีที่แล้ว +9

      Moore's law was never meant as a law; it's more of a guideline.

    • @Drakeblood97
      @Drakeblood97 2 ปีที่แล้ว +4

      Well no, they are in fact laws. Saying that a law is 'just a description, so it can't be a law' is like saying a theory is 'just an idea, so it can't be true.' That's exactly what a law of nature is by definition (Moore's law isn't a law of nature). Laws aren't meant to "change our perception of how the universe works," they're just meant to formalize our observations so we have something written down to reference.

    • @BlastinRope
      @BlastinRope 2 ปีที่แล้ว

      @@Drakeblood97 Swing.... And a miss.

    • @hollowmajin5146
      @hollowmajin5146 2 ปีที่แล้ว +1

      @@Drakeblood97 I believe there is a misunderstanding, as you have simply rephrased what I said.

  • @trocha419
    @trocha419 2 ปีที่แล้ว +1

    I had a research paper I did on this for my physics class in high school in '05. I couldn't figure out a solid way back then for converting the light back to electrical signals.

  • @ultraviolet.catastrophe
    @ultraviolet.catastrophe 2 ปีที่แล้ว +8

    This such an awesome interview, guys. Very good questions asked, great answers given, and an overall wonderful introduction to the technology. Thumbs up! +1!

  • @ll_distribution
    @ll_distribution 2 ปีที่แล้ว +44

    This is a 10! Great interview, John, and what a great communicator this CEO is: nothing short of impressive, being clear, precise, and articulate, besides obviously being a wiz in cutting-edge tech. Wow

  • @bananalord8575
    @bananalord8575 2 ปีที่แล้ว

    John, this is the first video of yours I have watched and I have to say that you are a very good interviewer. You smoothly asked new questions (the same ones I had), didn't interrupt him, etc etc... good job

  • @suntzu1409
    @suntzu1409 2 ปีที่แล้ว +22

    "You can literally see it!"

  • @WildEngineering
    @WildEngineering 2 ปีที่แล้ว +5

    I'm interested in how phase, polarization, and wavelength can be optimized to squeeze out more performance

  • @smjure
    @smjure 3 ปีที่แล้ว +5

    Good questions John, tnx

  • @lstan444
    @lstan444 2 ปีที่แล้ว +3

    Good interview, I like the way you structured the questions, real journalism!!!

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว

      Agreed! So nice

  • @shaun6828
    @shaun6828 2 ปีที่แล้ว +1

    I wonder about the density and how they generate / receive the optical signals. Interesting ideas involved.

  • @Vermilicious
    @Vermilicious 2 ปีที่แล้ว +36

    Sounds very impressive, but I can't help thinking that it has to interface with traditional hardware that can't keep up.

    • @Danger-Tater
      @Danger-Tater 2 ปีที่แล้ว +6

      I don't think traditional hardware has to keep up. This is made for big supercomputers or AI learning centers, and unless they optimize it to run graphics programs like games and graphic design software and become a general-use computer, current hardware doesn't have to keep up. But who knows what will happen on the next page of history, which is what's exciting.

    • @TheBestNameEverMade
      @TheBestNameEverMade 2 ปีที่แล้ว +13

      It's kinda like how your graphics card is 100x faster than your computer. It doesn't matter though because all the computer needs to do is feed it the problems and take the output. The output could be generated at normal computer-readable speed, but the problems would take a CPU much longer to solve.
      In AI for example you are multiplying a huge matrix by another huge matrix.
      A traditional computer would take forever to multiply and add each element. With one of these you can take a large chunk and do all that work in the light processor and just spit out the multiplied matrix segments at the computers read rate.
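      A minimal sketch of that offload pattern (photonic_matmul below is a hypothetical stand-in for whatever driver call such an accelerator would expose, not a real Lightmatter API):

```python
import numpy as np

def photonic_matmul(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hypothetical accelerator call: the host only ships operands and reads results."""
    return a @ b   # stand-in; on real hardware this is where the optical core would run

# Host-side work is limited to preparing inputs and consuming outputs.
rng = np.random.default_rng(2)
activations = rng.standard_normal((64, 4096))
weights = rng.standard_normal((4096, 4096))
result = photonic_matmul(activations, weights)   # the heavy lifting happens "off-chip"
print(result.shape)                              # (64, 4096)
```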

    • @SinanAkkoyun
      @SinanAkkoyun 2 ปีที่แล้ว +7

      Well, imagine it like this:
      When you interface with a very fast photonic AI device, you only need to send the relevant data, the device does the heavy lifting with it.
      It is like typing into a calculator. Sure, your fingers are way slower than the calculator could process, but doing highly sophisticated algebra and so on would take you wayyy longer yourself than typing it into a black box that does the work :)

  • @P8qzxnxfP85xZ2H3wDRV
    @P8qzxnxfP85xZ2H3wDRV 2 ปีที่แล้ว +57

    I'd love to see this as a PCIe expansion card that I can use in my computer to accelerate raytracing in games. And obviously machine learning.

    • @ashtentheplatypus
      @ashtentheplatypus 2 ปีที่แล้ว +27

      So, you'd have light simulating light!

    • @fennadikketetten1990
      @fennadikketetten1990 2 ปีที่แล้ว +8

      @Rusty Clark you mean pseudo-raytracing right

    • @wombatillo
      @wombatillo 2 ปีที่แล้ว

      Drivers and driver support would be a nightmare. People have been talking about co-processors (FPGA, neural, etc.) for ages and nothing much has come out of them.

    • @P8qzxnxfP85xZ2H3wDRV
      @P8qzxnxfP85xZ2H3wDRV 2 ปีที่แล้ว +9

      @@wombatillo Isn't the GPU a co-processor?

    • @SianaGearz
      @SianaGearz 2 ปีที่แล้ว +3

      I can't see raytracing as a viable task for this topology, because the data structures needed for that are insanely complex. The only operations that this accelerates are multiply-accumulate on straightforward data structures, or basically matrix multiplication, which is computationally intensive but structurally extremely simple. So neural networks are an excellent match.
      But you know what i bet is coming? Cryptocoin mining.
      Other uses this will potentially adapt to easily are uniform finite element analysis physics simulations, such as fluid dynamics and load bearing structures. The latter powers things such as generative design, where you specify load bearing surfaces, and have the CAD generate an optimised structure that solves a given mechanical task. Warning, if you have trypophobia (anxiety of holes), maybe don't google "generative design", and also don't google "trypophobia". Also this hardware might be useful for things like protein folding, for medical research.
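      For readers unfamiliar with the term, "multiply-accumulate" is the one operation repeated millions of times inside a matrix product, which is why all the workloads listed above map onto the same hardware. A toy illustration (editorial example only):

```python
def matmul_mac(A, B):
    """Naive matrix product written to expose the multiply-accumulate inner step."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):
                acc += A[i][p] * B[p][j]   # one multiply-accumulate
            C[i][j] = acc
    return C

print(matmul_mac([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```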

  • @slowfudgeballs9517
    @slowfudgeballs9517 2 ปีที่แล้ว +6

    I want a "Yeah so" counter in the top right.

  • @michaelsmith6818
    @michaelsmith6818 2 ปีที่แล้ว +34

    New drinking game. Every time a CEO starts his answer with "Yeah, so..." take a drink. You'll be bombed by the end of the video.

    • @vondahe
      @vondahe 2 ปีที่แล้ว +2

      In fact, every time people start a sentence with "so", I stop listening. Doing that generally indicates two things, both of which turn off my interest.

  • @big.atom37
    @big.atom37 2 ปีที่แล้ว +15

    The term computer in the modern world usually refers to a piece of hardware that can perform all computational tasks. In other words, it's a machine with a general-purpose CPU inside. Hardware that is used to speed up some specific operation like matrix multiplication is usually called an accelerator.

    • @Kindlylisten3
      @Kindlylisten3 ปีที่แล้ว

      Nonbeliever why you choose that name?

    • @big.atom37
      @big.atom37 ปีที่แล้ว +1

      @@Kindlylisten3 What's wrong with it?

    • @Kindlylisten3
      @Kindlylisten3 ปีที่แล้ว

      @@big.atom37 Because i don't think we become nonbeliever but as a choise.
      We still remain agnostic.
      And Because God wants us to believe in him. I am not christian. I'm muslim. So God is only one. And muhammad is his messenger. And our afterlife is based on our belief and actions otherwise failure forever.

    • @big.atom37
      @big.atom37 ปีที่แล้ว +2

      ​@@Kindlylisten3 From where I stand, believing in gods is a weakness of human nature. Human brain is very limited in what it can process so it needs some simplified picture of the world to serve as a basis, which is religion for many.

    • @Kindlylisten3
      @Kindlylisten3 ปีที่แล้ว

      @@big.atom37 I'm talking about God not Gods. And that God said in his book that "He is the creator & expander of the universe". And also said he made the "best" in wombs of our mothers. Also, He made everything in "pairs". Also, The biggest sign is He himself challenging humans to write a verse like his Book. The Divine book i am referring to is Qur'an containing signs. Saved from a time About 650 A.D.
      Give it a thought!

  • @raymundhofmann7661
    @raymundhofmann7661 2 ปีที่แล้ว +25

    What is not mentioned here is that the calculations that can be performed here are lossy or have a limited signal/noise ratio because first it is D/A converted from electrical signals to optical radiation and then A/D converted back to electrical signals. The optical "processing" in between is just like a kind of "fourier optics" thing, a kind of instant convolution / correlation but with imperfections, or something that can be projected on it with reasonable benefit. There are applications for which the lossy and numerically unstable nature is acceptable, but for the majority of applications computers are used for it is not.
    As the whole AI and machine learning stuff is anyway a very fuzzy affair where no one expects deterministic results, this is consequently the market they say to bring some benefit to.
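    A crude numerical way to see the signal-to-noise point (the 8-bit conversion and the noise level below are arbitrary illustrative assumptions, not measured figures for any real device):

```python
import numpy as np

rng = np.random.default_rng(4)
W = rng.standard_normal((256, 256))
x = rng.standard_normal(256)
exact = W @ x                       # what an all-digital float64 pipeline returns

def quantize(v, bits=8):
    """Round values onto a uniform grid, mimicking a D/A or A/D conversion."""
    scale = np.max(np.abs(v)) / (2 ** (bits - 1) - 1)
    return np.round(v / scale) * scale

# Model the analog path: quantize the operands going in, add a little noise in
# the "optical" multiply, quantize the result coming back out.
analog = quantize(quantize(W) @ quantize(x) + 0.05 * rng.standard_normal(256))

rel_err = np.linalg.norm(analog - exact) / np.linalg.norm(exact)
print(f"relative error ~ {rel_err:.2%}")   # small but nonzero, unlike the digital result
```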

    • @nonconsensualopinion
      @nonconsensualopinion 2 ปีที่แล้ว +2

      Why do you think they have any A/D conversion? Binary thresholded electrical signals being encoded into binary light pulses and then thresholded back to electric.
      It's no different than fiber optics. Nobody thinks that sending your data over a fiberoptic cable yields reduced numerical accuracy on the receiving machine.

    • @raymundhofmann7661
      @raymundhofmann7661 2 ปีที่แล้ว +8

      @@nonconsensualopinion Your lack of understanding lets me hope you don't work as an engineer.
      In fiber optics there is no intentional signal processing or alteration happening over the medium, if it is like multi mode reflections or parasitic reflections at couplers / connectors, then these are unwanted and corrected by digital error correction and digital / analog or even electro-optical transmission line equalizers.
      And these are relatively easy to correct, meaning using little processing and electrical power. This is by design and mainly a function of how much information has to be convoluted to check for transmission errors and correct them to very low accepted rates, which never are zero, btw.
      With this "photonic computing", whatever processing or function is going on in the medium, from electrical signal in to electrical signal out, which is a necessity simply because they advertise it can be integrated into a existing data center, more or less evades from any error correction.
      Do you know fourier optics? Do you know what a 4f correlator is?
      You can "optically calculate" a 2d fourier transform blazingly fast, basically at the speed of light, but you would also have to convert your data to optical (D/A) and then convert it back to digital (A/D). This combined with the optical imperfections of the used elements basically determines a signal to noise you might achieve, how lossy your processing function is.
      And there is no way to completely get rid of this except than doing the same calculation which happened in the "photonic computer" again on a classic computer. As said, a huge convolution of information would have to be done on the electronic side, basically the optical calculation repeated, to get to the wanted bit error rate, making the optical calculation redundant.
      Of course there could be some clever ways to improve on that with viable measures on the electronic sides of the "photonic computer", meaning to improve on the signal to noise or give the noise some characteristics more suitable for the intended application, but these definitely will not get rid of the noise, they will just make this lossy computing more viable for certain applications, but they will never rival the low and cheap to get error rates of classic computers.
      A classic computer is also lossy, has a error rate or a signal to noise if you want to say, but this can be made so amazingly low that human factors like human lifetime easily make the dominant "error rate".

    • @yimoawanardo
      @yimoawanardo 2 ปีที่แล้ว +5

      @@raymundhofmann7661 Everything you said is good, I just hoped you were a little less salty when answering '^-^. Thanks for the infos though

    • @Scholzey
      @Scholzey 2 ปีที่แล้ว +1

      @@raymundhofmann7661 what node are these on? Surely you're not going to fit billions of optical-electrical transmitters and receivers on a chip.
      Initially I thought they were swapping copper traces with optical tubes. But if they are multiplexing, then obviously not.
      I didn't really hear how it actually works, what changes states etc. to allow the light to do the work instead of just carrying information.

    • @tetraedri_1834
      @tetraedri_1834 2 ปีที่แล้ว

      @@raymundhofmann7661 My first instinct to reduce the effect of noise would be to couple the data with error correction code, so that CPU can detect and fix errors due to noise later (assuming noise is below the threshold of the used error correction code). Of course, this would require homomorphic error correction codes (i.e. codes that are preserved under evaluation of functions), which I don't know how much they are studied. But if photonic computation becomes big, I bet research in homomorphic error correction codes would become a big field

  • @meetoptics
    @meetoptics 2 ปีที่แล้ว

    So nice to see that the industry is developing and evolving. Let's see how this project progresses, but the idea seems promising. At MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward. Congratulations!

  • @ComputerHistoryArchivesProject
    @ComputerHistoryArchivesProject 11 หลายเดือนก่อน

    Highly fascinating video, great detail in the explanation. Thanks for making and sharing this!

  • @CanaldoMiauOfficial
    @CanaldoMiauOfficial 2 ปีที่แล้ว +91

    Cryptominers be like: Yeeeeeeeeeeeah boi.

    • @360addict70
      @360addict70 2 ปีที่แล้ว +23

      not really. if anything it'll devalue crypto by making it easier to make. crypto is valuable because it's hard to make.

    • @CanaldoMiauOfficial
      @CanaldoMiauOfficial 2 ปีที่แล้ว +5

      @@360addict70 Yes you're right, but it solves the problem of the energy consumption caused by cryptocoins, so it's nice for people who want to sell NFT art. And probably for now only a few people will have this kind of "photonic computer", but I got you, you're right.

    • @Zephyrus0
      @Zephyrus0 2 ปีที่แล้ว +7

      They will just create harder to crack hashing algorithms, take Monero's RandomX for example, it's made in such a way that GPU and ASICs suck at solving them.

    • @JorgetePanete
      @JorgetePanete 2 ปีที่แล้ว +1

      Remember that proof of stake is becoming common

    • @xSuspect-7
      @xSuspect-7 2 ปีที่แล้ว +4

      @@360addict70 Not really, it would have the same effect we have today
      Weaker GPUs can mine but the value is not that high bc we have stronger GPUs mining faster dictating the price
      If this could be a reality those with the new technology would dominate the market and everyone else with weaker GPUs will gain less

  • @mikimouse3001
    @mikimouse3001 2 ปีที่แล้ว +142

    To me this smells like that "brand new revolutionary battery technology" all over again.

    • @ObliviouslyAware
      @ObliviouslyAware 2 ปีที่แล้ว +12

      That's still a thing; materials keep changing though, so R&D funding keeps getting moved around. Right now the hot option is graphene ultracapacitors. You're still not wrong though.

    • @goku21youtub
      @goku21youtub 2 ปีที่แล้ว +9

      its sad that youtube doesnt show downvotes on comments anymore... it would hurt your feelings for real lol

    • @IARRCSim
      @IARRCSim 2 ปีที่แล้ว +14

      It smells like a fraud to me. It reminds me of Theranos. It sounds too good to be true, and they're not claiming to deliver until the end of the year, so we'll see then if it gets delayed, disappoints, or ever becomes available for outsiders to fairly compare with more conventional technology at all.

    • @gabrielvieira6529
      @gabrielvieira6529 2 ปีที่แล้ว

      ???

    • @stevenarvizu3602
      @stevenarvizu3602 2 ปีที่แล้ว +9

      The technology exists, it’s a matter of logistics. If you can’t find
      >investors
      >workers experienced in the field of “your new invention”
      >access to material goods through inexpensive or cost effective means
      You may as well not even make whatever you have until you do.
      This is why most “ground breaking” technology disappears and then comes back ten years later.
      When Pixar released movies with ray tracing it was amazing! They were like “once we figure out how to do this quicker it’s gonna be so mainstream!” And then no one ever really used it again.
      Then ten years later Nvidia figured out how to do it in real time and now it’s back again.
      These things take time, unfortunately

  • @antonnym214
    @antonnym214 2 ปีที่แล้ว +5

    Excellent interview! This is hyper-interesting technology, and certainly kudos for the spectacular advancements made at Lightmatter. A gentle note for Mr. Harris, which will help especially when relating and communicating with anyone over 40: Try not to start every answer with "Yeah so." I would recommend just not using it to begin ANY sentence. One need only view a single episode of Shark Tank, for example, to see the effect that can have on discourse. All good wishes!

  • @Mr.Unbreakable83
    @Mr.Unbreakable83 2 ปีที่แล้ว

    Hey John, just wanted to let you know that I subscribed to your channel because of this video. It's super interesting and I would love to learn more about photonic computing.

  • @fergarram
    @fergarram 2 ปีที่แล้ว +4

    This was a great interview! Very clear and easy to follow, thanks!

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว

      Agreed! So nice

  • @danfg7215
    @danfg7215 2 ปีที่แล้ว +130

    Sounds like the overhyping of No Man’s Sky

    • @EversonBernardes
      @EversonBernardes 2 ปีที่แล้ว +20

      The claims aren't so outlandish, actually. The vast majority of the inefficiency in current chips actually comes from moving data around inside the chips. As you miniaturize, resistance increases and you get more parasitic loss and quantum effects, meaning you have to ramp up voltages to achieve the same frequencies. Just by moving to light-based data transport through waveguides you are already making huge savings in power (so, less waste heat), which also allows you to ramp up frequencies.
      The main issue is steering the monumental ship that is semiconductor technology toward adopting a completely new paradigm in production.

    • @danfg7215
      @danfg7215 2 ปีที่แล้ว +11

      @@EversonBernardes oh I don't doubt the tech, just how nonchalantly the guy over there just throws multipliers and promises like nobody's business. I'll believe it when I see it. No Man's Sky eventually delivered, but it took years of actual hard work after the disastrous launch and the lead designer shutting up about overhyped public statements.

    • @____uncompetative
      @____uncompetative 2 ปีที่แล้ว +1

      @@danfg7215 What are you talking about?
      _No Man's Sky_ delivered on everything that was in its Steam product description at launch.

    • @danfg7215
      @danfg7215 2 ปีที่แล้ว

      @@____uncompetativehmm unless you're being sarcastic, you can learn more about the game's prerelease overhype in this random summary video: th-cam.com/video/PvCg0Cz7n4I/w-d-xo.html

    • @____uncompetative
      @____uncompetative 2 ปีที่แล้ว +1

      @@danfg7215 I am not talking about anything said in early interviews. Games can change for aesthetic, technical, or business reasons up until the point of preorder. What they then promise they should satisfy. What Sean Murray put in the Steam product description was delivered by the game at launch. Games can be hyped up for years and then cancelled and not come out at all. _No Man's Sky_ released. Sean Murray never lied:
      th-cam.com/video/wyV9XOtEh1A/w-d-xo.html

  • @ThePhotovoltaicMan
    @ThePhotovoltaicMan 2 ปีที่แล้ว +1

    how many different colors could you run through a core? every nm of wavelength? or every 12nm, 20nm, 50 etc?

  • @jksharma7
    @jksharma7 2 ปีที่แล้ว

    Wonderful, insightful information.... Thanks to both of you.

  • @swifty1969
    @swifty1969 2 ปีที่แล้ว +40

    So how come AMD, Intel, Nvidia, Amazon, and Google are not in a bidding frenzy over this company?

    • @vallorahn
      @vallorahn 2 ปีที่แล้ว +2

      You forgot Apple and Tesla.

    • @DouglasWatt
      @DouglasWatt 2 ปีที่แล้ว +30

      Intel and IBM have been doing minor research projects in optical computers for years, but there wasn't enough market for narrow-focus chips until neural networks became a thing, and the current state of optics requires designing narrow-focus chips, like tensor math chips. Also, highly controlled nano-scale photon generation is a pretty recent invention, and that has prevented optical circuits from keeping pace with silicon improvements. But a lot of improvements have happened over the last 10 years, and IBM has been increasing their photonics & optical computer division. If Lightmatter's product does what it says it does once it's in the cloud, they will definitely be a target for the major players. But being a target doesn't guarantee that Lightmatter agrees to a buyout either.

    • @trueneutral3092
      @trueneutral3092 2 ปีที่แล้ว +3

      Maybe this is the company that's going to tell them to go get bent.

    • @xMilesxHighxClubx
      @xMilesxHighxClubx 2 ปีที่แล้ว +1

      @@trueneutral3092 Lol

    • @simon4133
      @simon4133 2 ปีที่แล้ว +1

      They will once it achieves a certain market share, at which point they will add their government approved spyware called Intel ME.

  • @jackburton5085
    @jackburton5085 2 ปีที่แล้ว +11

    "Now there's actually one that you can order" ....

    • @therealb888
      @therealb888 2 ปีที่แล้ว +4

      But the price is unknown

    • @imelitist2828
      @imelitist2828 2 ปีที่แล้ว +4

      Well at least scalpers can't get to it

  • @xnadave
    @xnadave 2 ปีที่แล้ว

    I took a class on photonic bandgap crystals in grad school. For our project, one of my classmates designed an entirely optical XOR gate. It's a really fascinating field. (Ha. Field. Get it?) Kudos to the interviewee for breaking down a really complex topic into easily-digestable terms.

  • @localnyraccoon
    @localnyraccoon 2 ปีที่แล้ว +2

    Excited to see where this is going; if it can actually make its way into personal computers, that would be amazing.

  • @seasong7655
    @seasong7655 3 ปีที่แล้ว +6

    I've tried some machine learning before and wish I'd had this in my PC.

  • @roger_isaksson
    @roger_isaksson 2 ปีที่แล้ว +7

    Harris seems like a chill dude in charge of an operation. 🤔👍

    • @therealb888
      @therealb888 2 ปีที่แล้ว +1

      They all seem like that until you work for them.

  • @patrick6110
    @patrick6110 2 ปีที่แล้ว

    Excellent interview! Thank you for posting.

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว

      Agreed! So nice

  • @El.Duder-ino
    @El.Duder-ino 6 หลายเดือนก่อน

    Very cool, thx for sharing this interview!👍 This was 2 years ago, where is this technology now?

  • @ChernzObyl
    @ChernzObyl 2 ปีที่แล้ว +3

    Correct me if I'm wrong but this should revolutionize blockchain applications

  • @anubisplays1421
    @anubisplays1421 2 ปีที่แล้ว +22

    Surely you would want to show your tech off, but no demonstration? Sounds suspect to me.

    • @MonsterJuiced
      @MonsterJuiced 2 ปีที่แล้ว +4

      Was gonna say this. Why is there no demonstration?

    • @HDSQ
      @HDSQ 2 ปีที่แล้ว

      Considering that it's mostly for AI and number crunching, I doubt there's much that can really be shown at this point besides a wall of text on a screen, which isn't very interesting to most people.

    • @xSuspect-7
      @xSuspect-7 2 ปีที่แล้ว

      @@HDSQ Making comparisons would do, but yeah, for the general public it wouldn't make sense

    • @J3R3MI6
      @J3R3MI6 2 ปีที่แล้ว

      Seeing some graphics processing or rendering would be nice

    • @esuil
      @esuil 2 ปีที่แล้ว

      @@HDSQ I was interested and could not find even that anywhere. So yeah...

  • @randfee
    @randfee 2 ปีที่แล้ว +1

    nice interview, I was wondering when somebody would create a chip like this! FINALLY!
    Is there some kind of technical video on how the chip works fundamentally, how it does math and about the architecture?
    I think this would be great to show some of my students, I'd be happy to promote it on LinkedIn and use it/link to it on conferences!
    Kudos to the entire team at lightmatter and best of luck. I want to see real-time 120fps+ 8k raytracing for AR/VR applications running on these in 10 years ;-)
    ... a fellow photonics (lasers) physicist!

  • @jonnupe1645
    @jonnupe1645 2 ปีที่แล้ว +2

    This is exciting! I can't wait to process local weather in real time on a hopefully (eventually) smaller form factor, with a near guarantee of lower power draw.

  • @peepee6187
    @peepee6187 2 ปีที่แล้ว +3

    Does multicolor processing (= wavelength division multiplexing for computing) really work for "efficiency"? I really want to know the details. Which paper should I read?

    • @chiefgully9353
      @chiefgully9353 2 ปีที่แล้ว

      Any paper on wave spectrum operations.
      The phenomenon they are talking about is in use with radio waves today.
      Each color is a different waveform. If sufficiently separated in wavelength/frequency, the two waves do not interfere with one another.
      Since, unlike electrons, photons have no mass, you can in theory have photons of multiple frequencies in the same local area (the optical path).
      The difference is that in modern methods (radio waves) we use modulation. I am not sure if that's the case here. Though a simple detection system of logic (green light = true, then emit on = true) would not need modulation and would still be able to share a logical path with other colors.
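      A rough numerical analogue of that wavelength-sharing idea (an editorial sketch only): two carriers at well-separated frequencies share one "fiber" and are read back independently by filtering.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 4000, endpoint=False)
bit_a, bit_b = 1, 0                                 # one on/off bit per "color"
fiber = (bit_a * np.sin(2 * np.pi * 50 * t)         # channel at 50 Hz
         + bit_b * np.sin(2 * np.pi * 200 * t))     # channel at 200 Hz, same medium

spectrum = np.abs(np.fft.rfft(fiber))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
decoded_a = spectrum[np.argmin(np.abs(freqs - 50))] > 100
decoded_b = spectrum[np.argmin(np.abs(freqs - 200))] > 100
print(bool(decoded_a), bool(decoded_b))             # True False: neither bit disturbs the other
```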

    • @markell1172
      @markell1172 2 ปีที่แล้ว +2

      @@chiefgully9353 lol, actually the reason why you can have an uncountable number of photons at the same position is their bosonic nature, not their mass.

    • @chiefgully9353
      @chiefgully9353 2 ปีที่แล้ว

      @@markell1172 Bosonic nature sounds like you're getting into quantum physics. That's a bit more advanced study than I got. I install and design telecommunications systems, so I understand the wave principles behind that. The mass reference was an intuition thing, 0-mass particles and all.
      I'll look more into that though, thanks for the heads up.

    • @markell1172
      @markell1172 2 ปีที่แล้ว

      @@chiefgully9353 np, I normally help people not fall for misinfo, so I'm glad to see that you took it the way you should :)

    • @xavieradriaens4411
      @xavieradriaens4411 2 ปีที่แล้ว

      Fourier transforms....

  • @tommykarlberg
    @tommykarlberg 2 ปีที่แล้ว +13

    So optical graphic cards will be possible? Looking forward to that. :D

    • @smittywermen8418
      @smittywermen8418 2 ปีที่แล้ว +2

      It’ll take rgb to the next level

    • @Tigrou7777
      @Tigrou7777 2 ปีที่แล้ว

      I am pretty sure some non-tech people already think what happens inside a GPU might be optical. Well, that might be a thing for real in the future.

  • @ntrpnr
    @ntrpnr 2 ปีที่แล้ว

    How do you load the model onto the chip? Is each chip custom made for its inference model?

  • @emmanueloluga9770
    @emmanueloluga9770 2 ปีที่แล้ว +1

    What a great video, even after a year since first viewing it. I have followed Lightmatter for almost 4 years now, and I am pumped for all they have to accomplish. GG.
    14:42 is a great anecdote for what optical computers are not good at currently. This most definitely needs a fundamental rethinking of the approach to logic operators (maybe Fredkin gates?).

  • @Daddelpalm
    @Daddelpalm 3 ปีที่แล้ว +3

    Are the networks also trained with the photonic computer, or is it just used to deploy the trained network? Since training the networks takes a lot of time, it would be interesting to know.

    • @johnkoetsier
      @johnkoetsier  3 ปีที่แล้ว +4

      I believe they are trained on the photonic computer, but I'm not certain.

    • @falconux7006
      @falconux7006 2 ปีที่แล้ว +2

      No, he said they are focusing on inference, so the neural net is already trained.

  • @asdfasdf71865
    @asdfasdf71865 2 ปีที่แล้ว +5

    If you can do matrix multiplication, you can simulate a quantum computer and implement traditional logic with the Toffoli gate.
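
Editor's aside: a worked illustration of the claim above, using the standard textbook construction (nothing specific to the hardware discussed in the video). The Toffoli gate on three bits is just an 8x8 permutation matrix over the basis states, so any device that can do a matrix-vector multiply can, in principle, apply this universal classical logic gate.

```python
# Editor's sketch of the comment above: the Toffoli (controlled-controlled-NOT) gate
# is an 8x8 permutation matrix over the 3-bit basis states, so any hardware that can
# do a matrix-vector multiply can, in principle, apply this universal classical gate.

import numpy as np

# Toffoli: identity everywhere except it swaps the |110> and |111> basis states
TOFFOLI = np.eye(8)
TOFFOLI[[6, 7]] = TOFFOLI[[7, 6]]

def apply_toffoli(a, b, c):
    """Encode bits (a, b, c) as a basis vector, multiply by the gate matrix, decode."""
    index = (a << 2) | (b << 1) | c
    state = np.zeros(8)
    state[index] = 1.0
    out = TOFFOLI @ state                 # the actual "compute" is one matrix multiply
    out_index = int(np.argmax(out))
    return (out_index >> 2) & 1, (out_index >> 1) & 1, out_index & 1

print(apply_toffoli(1, 1, 0))   # -> (1, 1, 1): target bit flips when both controls are 1
print(apply_toffoli(1, 0, 0))   # -> (1, 0, 0): unchanged otherwise
```

Whether driving logic this way is practical on photonic hardware is exactly what the replies below debate.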

    • @cold_ultra
      @cold_ultra 2 ปีที่แล้ว +5

      yes of course

    • @nutzeeer
      @nutzeeer 2 ปีที่แล้ว +1

      i knew that

  • @josephpadula2283
    @josephpadula2283 2 ปีที่แล้ว

    In 1973-4 I worked in a scuba shop filling tanks while in 10th grade.
    The Scuba club from Stevens Institute of Technology in Hoboken NJ came to fill tanks and go in group dives.
    One student was always talking about how computers were really going to take off when photonics replaced electronics, according to his prof. The IBM 360 was the standard large computer at that time, and the PDP-7(?) the standard small one.
    I do not remember his name and never knew the prof's name, but I find this YouTube video interesting.

  • @tdata545
    @tdata545 2 ปีที่แล้ว

    First of all, very neat. When you say 10x faster, what's its computing strength? How many operations can it perform? Is it still binary? Or do we basically have to reinvent the wheel, and would the conversion from this to traditional x86/x64 machine code negate the 10x speed improvement? Also, how is it different from quantum chips, and is it competing against them?

  • @joey286
    @joey286 2 ปีที่แล้ว +7

    Great video... and yes, to answer your last question, it is 100% possible to have local AI built in. Also, I am going to create an optical computer that is better than this one. hehe. Good luck Nicholas! Cheers.

  • @The_Slavstralian
    @The_Slavstralian 2 ปีที่แล้ว +12

    My only issue is that this is basically targeted at massive companies, to let them cut their costs (energy-wise) so they can better serve us the ads we all hate...
    Thanks for that, mate.

    • @joemerino3243
      @joemerino3243 2 ปีที่แล้ว +1

      Regular computers also started off as government/corporate monstrosities.

  • @zeropointclocker4228
    @zeropointclocker4228 2 ปีที่แล้ว

    I had some questions, until I scrolled down and saw they were all covered lol. Liked and subbed as well. Good stuff.

  • @woodstoney
    @woodstoney 2 ปีที่แล้ว +1

    Great video bringing new light to photonic computing! ;)

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว

      Agreed! So nice

  • @sinephase
    @sinephase 2 ปีที่แล้ว +9

    It's the switching that's really limiting, is it not? Also, IDK if I missed it, but is this the same as quantum photonic computing?

    • @nullifier_
      @nullifier_ 2 ปีที่แล้ว +3

      Quantum computing uses properties like superposition and entanglement, meaning that a "signal" can be not only 1 or 0 but also a superposed value (what would be interpreted as "both at the same time").
      As far as I understood, this one still uses classic logic gates, but uses light to transmit a signal instead of electrons.

    • @dgSolidarity
      @dgSolidarity 2 ปีที่แล้ว

      @@nullifier_ Seems to me someone maybe didn’t watch the video...
      I'd say he was explicit about how they can’t employ transistor-like gate functions, “we don’t do that kind of thing”

    • @vibaj16
      @vibaj16 2 ปีที่แล้ว +1

      dgSolidarity They can, but he said it's really hard to do; it apparently is good at natively running matrix-multiplication-type operations, though.

    • @dgSolidarity
      @dgSolidarity 2 ปีที่แล้ว

      ​@@vibaj16 Okay… he said himself they don’t in his own words, and explains that they couldn't. Not sure how you’re arguing against him about his device.
      Do you mean "they" as in anybody could technically make that? If so, I really don't get what you're getting at. No one makes such devices commercially because there's no evidence it would be practical. One can do all manner of things that don't have any advantage.
      This video is about a kind of computer that may have significant advantages for a niche, which it couldn't have if it used classic logic gates, but okay, I guess it's not illegal, so you can all believe that's how it actually works if it floats your boat.

    • @vibaj16
      @vibaj16 2 ปีที่แล้ว +1

      dgSolidarity Show where he said it’s impossible. I don’t remember him saying that. I remember him saying that it’s just really difficult.

  • @plekkchand
    @plekkchand 2 ปีที่แล้ว +4

    The engineer answers every question, oddly, with "Yeah". This may be the new "So", which recently took the place of the direct acknowledgement of the questioner characteristic of traditional conversational etiquette, and was widely adopted by think-tank people partly to convey the implication that they were not in the same universe as the lowly interviewer. It's as if the two parts of the conversation were simply occurring simultaneously, with gaps provided.

  • @pankajbajaj9578
    @pankajbajaj9578 2 ปีที่แล้ว +1

    Light reactors that could radio actively transfer objective energy flowing at radio active velocities printing any object any scale on any universe inside any black hole.

  • @ViktorFerenczi
    @ViktorFerenczi 2 ปีที่แล้ว

    Is it usable for training as well? I guess so, but I'm not sure how efficient that would be.

  • @PASTRAMIKick
    @PASTRAMIKick 2 ปีที่แล้ว +13

    They found a wrecked terminator in a factory and based on its technology are now developing this computer, nice.

    • @davestorm6718
      @davestorm6718 2 ปีที่แล้ว

      😎

    • @luke144
      @luke144 2 ปีที่แล้ว

      🦾

    • @richystar2001
      @richystar2001 2 ปีที่แล้ว

      If you examine and study the evolution of the micro-transistor chip... it's a very boring and slow read... Everyone likes to think that alien tech or something was used to advance breakthroughs in science. But the evolution of technology is slow and very uninteresting.

    • @davestorm6718
      @davestorm6718 2 ปีที่แล้ว

      @@richystar2001 - I wouldn't say it's really that boring, but, indeed, people that don't know how, for example, integrated circuits got smaller and smaller, need to look into photography/optics and photo-chemistry. These 2 fields had the most impact on IC construction, initially, and revolutionized the electronics industry the most (the application of microfilm tech, basically).
      The rest is quite ordinary incremental advancement (though the "Law of Accelerating Returns" is in play with regard to all the other technological and scientific advancements that would be impossible without the computer).

  • @MohammedIqlasUddin
    @MohammedIqlasUddin 2 ปีที่แล้ว +6

    Every answer starts with "Yeah so" on a photonic computer

    • @rRobertSmith
      @rRobertSmith 2 ปีที่แล้ว

      So is that a voice coach problem or a lack of rehearsal (real honest answers!) problem then?

    • @Danger-Tater
      @Danger-Tater 2 ปีที่แล้ว

      @@rRobertSmith yeah so... Not rehearsing questions is bad for your company's rep, because people might consider you incompetent and think you don't know what you're doing.

  • @maciejajewski
    @maciejajewski 2 ปีที่แล้ว

    John, you are very eloquent. I can easily tell you are a smart man from the quality of your questions. Definitely subscribed.

    • @meetoptics
      @meetoptics 2 ปีที่แล้ว

      Agreed! So nice

  • @ihorkarpiuk4102
    @ihorkarpiuk4102 2 ปีที่แล้ว

    It's so good to learn about such things right when they come out, and not way before, so you don't have to wait.

  • @chicolofi
    @chicolofi 3 ปีที่แล้ว +10

    I don't care. I want a personal photonic computer.

  • @Wilhuf1
    @Wilhuf1 2 ปีที่แล้ว +12

    So, a fast matrix calculator integrated with traditional silicon.
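
Editor's aside: for readers wondering why "a fast matrix calculator" matters for AI, here is a minimal NumPy sketch (shapes and values are illustrative, and NumPy stands in for whatever the photonic part would actually compute) of how a neural-network layer's inference step reduces to one matrix multiplication plus a cheap element-wise step.

```python
# Editor's illustration of the comment above: a neural-network layer's forward pass is
# essentially one matrix multiply plus a small element-wise operation, which is why a
# fast "matrix calculator" accelerates inference. Shapes and values are made up; NumPy
# stands in for the accelerator.

import numpy as np

def dense_layer(x, W, b):
    # The heavy part -- W @ x -- is what a matrix accelerator would handle;
    # the ReLU and bias add are cheap element-wise steps for ordinary electronics.
    return np.maximum(W @ x + b, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(512)            # input activations
W = rng.standard_normal((256, 512))     # layer weights (loaded once, reused per input)
b = rng.standard_normal(256)

y = dense_layer(x, W, b)
print(y.shape)   # (256,) -- stacking layers like this is most of what "inference" does
```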

  • @aliuzel4211
    @aliuzel4211 2 ปีที่แล้ว

    Super interesting. Enjoyed. Thank you.

  • @theninadkumthekar
    @theninadkumthekar 2 ปีที่แล้ว +24

    The main question he should be asking is 'Can it run Crysis?'

  • @forestswayngim8290
    @forestswayngim8290 2 ปีที่แล้ว +8

    I feel like there's a room of engineers yelling through a screen at this CEO over all the "I think someday"s he's throwing out there lol

  • @croeternal
    @croeternal 2 ปีที่แล้ว

    This technology could easily be used for rendering at mass scale, such as the Metahero crypto project that scans a person and turns them into HD 3D models...

  • @markus.schiefer
    @markus.schiefer 2 ปีที่แล้ว

    I'd really like a GPU like this. I'd also settle for 2-3 times the speed at half the power requirement. It would also make a lot more money, I think.

  • @cesarmonroy-olivares4125
    @cesarmonroy-olivares4125 2 ปีที่แล้ว +52

    You lost me when you said you won't run Windows... JK

    • @RandyRandersonthefamous
      @RandyRandersonthefamous 2 ปีที่แล้ว +3

      Maybe Linux will get some funding!

    • @IIARROWS
      @IIARROWS 2 ปีที่แล้ว +3

      @@RandyRandersonthefamous That's not the joke

    • @babykosh5415
      @babykosh5415 2 ปีที่แล้ว

      yeahhh, this is not for that

    • @strangevideos3048
      @strangevideos3048 2 ปีที่แล้ว +1

      Fuck that shit, people hate Windows!

    • @therealb888
      @therealb888 2 ปีที่แล้ว +3

      Does ray tracing though.