This Guy BUILT His Own Graphics Card!

  • Published Dec 17, 2024

Comments • 598

  • @dbarrie
    @dbarrie 7 months ago +3372

    Thanks guys! Was a ton of fun sitting down and chatting with you for this!

    • @DigitalYojimbo
      @DigitalYojimbo 7 months ago +93

      Amazing work.

    • @MrCraftr
      @MrCraftr 7 months ago +56

      Really inspiring, man. Well done!

    • @Metal_Maxine
      @Metal_Maxine 7 months ago +29

      Awesome

    • @Ekoahamgames
      @Ekoahamgames 7 months ago +30

      Very superb work 😅 dude 😎 I'm on my way to creating my own OS, just need some people on board.

    • @Eeshank2
      @Eeshank2 7 months ago +21

      Absolute madman.

  • @lubomirslavov1924
    @lubomirslavov1924 7 months ago +1891

    GPU prices are so high that this guy made his own.

    • @JakeB-Real
      @JakeB-Real 7 months ago +17

      Lol

    • @kstarler
      @kstarler 7 months ago +58

      4 years of income for a single GPU? Sounds about right. Thanks Nvidia!

    • @yaroslavpanych2067
      @yaroslavpanych2067 7 months ago +18

      You obviously have no idea how much Xilinx FPGA development stuff costs

    • @adriancoanda9227
      @adriancoanda9227 7 months ago +5

      @@yaroslavpanych2067 He paid more than for an off-the-shelf product; everything here is custom. What if it could render Direct3D? I mean, the Arm X Elite can.

    • @tim3172
      @tim3172 7 months ago +4

      @@adriancoanda9227 "paied"
      Derp.

  • @zentiremusic123
    @zentiremusic123 7 months ago +425

    I'm an electrical engineer myself, also designing and building boards. It's very inspiring what this guy did. Thanks for the motivation!

    • @huzaifazkansa
      @huzaifazkansa 7 months ago +3

      Really? You're one of those engineers who builds motherboards too? Well, do you build server motherboards as well?

    • @clementpoon120
      @clementpoon120 7 months ago +1

      As someone who recently built their very first (almost) designed-from-scratch SBC with '80s hardware, that guy is a magician.

    • @gaborm4767
      @gaborm4767 6 months ago

      @@clementpoon120 CPU type?

    • @CodesExplorer-hb1wr
      @CodesExplorer-hb1wr 6 months ago +1

      I'm 15 and I've designed my first board all by myself.

  • @kaalsemulzii1920
    @kaalsemulzii1920 7 months ago +710

    I mean, he demonstrated what Terry A. Davis did with TempleOS. One man CAN do things like this. HE IS A MAN OF FOCUS AND SHEER FUCKIN' WILL.

    • @LaczPro
      @LaczPro 7 months ago +7

      A John Wick, you might say 👀

    • @milasudril
      @milasudril 7 months ago +20

      A driver for TempleOS

    • @wilhelmbittrich88
      @wilhelmbittrich88 7 months ago +29

      RIP Terry Davis. Got too powerful so the government had to take him out.

    • @smorrow
      @smorrow 7 months ago +4

      @@milasudril Actually, I think the most TempleOS way to drive this card would be to reprogram the card to exactly implement the TempleOS graphics primitives.

    • @eenayeah
      @eenayeah 7 months ago +4

      I wager one single man can do a lot of things, as long as there's sheer will and commitment, time, and resources.

  • @MarioGoatse
    @MarioGoatse 7 months ago +140

    Wait!? So he’s actually running games on this thing? I thought it was going to be a super basic GPU that could technically render but only low resolution images in a very basic engine. I didn’t expect it to be running real games. He’s actually running Quake 2 at 720p? That is incredible. Great work brother. I’m so proud of this guy!

    • @jarsky
      @jarsky 7 months ago +20

      It is "basic", hence why he's using Quake II to demo it: it's not a 3D-accelerated game engine and doesn't have dynamic lighting, shaders, etc. But even building a basic GPU like this is an insane display of skill, knowledge and intellect, especially given all the trade secrets around GPU development, and it doesn't seem like he's involved with one of the top three.

    • @MegaVidsMike
      @MegaVidsMike 7 months ago +1

      Maybe with more hard work and dedication from this guy, we can get another competing brand of graphics cards in the future. It may not be any time soon, but seeing people with the skill and intellect this guy has gives me hope for the future.

    • @Argoon1981
      @Argoon1981 7 months ago +1

      @@jarsky id Tech 1 has a 3D-accelerated renderer as well. Yes, it has a software rendering path to run on the CPU alone, but it was updated by id Software to support an OpenGL renderer that is 3D-accelerated. And he used Vulkan and a GPU, so he had to use the hardware rendering path.

    • @djohannsson8268
      @djohannsson8268 6 months ago +1

      Open-source GPU design? It would be fun to look at.
      Sounds like he implemented the 2D part of the drivers. The BOM cost would be expensive; a working NV 1650 starts sounding real good.
      I wonder if he used the ARM processor to preprocess 2D commands into direct lower-level hardware operations.

    • @goyworldorder
      @goyworldorder 5 months ago

      @@MegaVidsMike It's never going to happen. Do you have any idea how expensive the tools to make a GPU chip are? Hence why the semiconductor hegemony has control over chip/graphics-card manufacturing. Good luck finding 7 billion dollars to build a photolithography plant lol

  • @InnaciKorushka
    @InnaciKorushka 6 months ago +49

    So I'm an electrical engineer, and what exactly this man has accomplished feels understated. You can't simply read a book or watch some YouTube vids and tinker to make a GPU from scratch like he did. This is an incredible outcome for a daunting task. Extraordinary intuition and a computational mindset. It's really incredible.

    • @2DarkHorizon
      @2DarkHorizon 5 months ago

      Why is it so difficult, though? From the software side, the rendering engine's requirements and features are well known; you just need to build hardware around it.

    • @InnaciKorushka
      @InnaciKorushka 5 months ago +3

      @2DarkHorizon From an analog perspective: getting the physical components to relay information in a specific pattern, limiting the voltage and current, routing traces, designing various limiters, repeaters, etc. There is a LOT that goes into a "simple" circuit, let alone an actual graphics card. Not to mention, he made a custom engine. It's significantly simpler than the modern ones used in most games; however, it's not necessarily the software side that is difficult. I'd say programming is much easier to learn and understand than physical-component logic and signaling, which is much less intuitive.

    • @2DarkHorizon
      @2DarkHorizon 5 months ago +1

      @@InnaciKorushka A lot of people get into software these days, but I figure the hardware and electrical side isn't as hard as people think once you're versed in it. Not saying electrical is easy; it's probably harder than software. But you know what I mean: there are standards for hardware circuits that you follow to get things done.

    • @xxsemb
      @xxsemb 4 months ago

      @@InnaciKorushka He used an FPGA; it greatly simplified things.

  • @UA10i12
    @UA10i12 7 months ago +463

    But can it run Crysis?

    • @0thewings
      @0thewings 7 months ago +49

      But can it run DOOM?

    • @Radulf666
      @Radulf666 7 months ago +60

      @@0thewings Doom could run on a potato, so probably yes.

    • @hammerth1421
      @hammerth1421 7 months ago +33

      Crysis uses DirectX 10, so no, it can't.

    • @ArtemGms
      @ArtemGms 7 months ago +4

      @@hammerth1421 🤓

    • @bricktasticanimations4834
      @bricktasticanimations4834 7 months ago +1

      What about Return To Castle Wolfenstein?

  • @matthewlozy1140
    @matthewlozy1140 7 months ago +340

    Dude has a big career at AMD, Nvidia, or Intel in the near future. Smart dude.
    Edit: looks like he works at Respawn

    • @smalltime0
      @smalltime0 7 months ago +30

      lol he isn't a kid

    • @w4hid
      @w4hid 7 months ago +32

      Or he can start his very own GPU company.

    • @matthewlozy1140
      @matthewlozy1140 7 months ago +16

      @@smalltime0 who cares how old he is.

    • @demp11
      @demp11 7 months ago

      @@w4hid That's extremely tough; I mean, even Intel is struggling to make it.

    • @mrnorthz9373
      @mrnorthz9373 7 months ago +29

      @@matthewlozy1140 I think he has two decades of experience as a software engineer. It doesn't matter how old he is, but you wouldn't call him a kid.

  • @309electronics5
    @309electronics5 7 months ago +19

    Really cool that it's an FPGA! FPGAs are so useful; they are literally the playground for chip designers and people who like making their own chip designs, with the ability to erase/patch them without the design being burned into the silicon. FPGAs are used to "emulate" old game-console hardware, for example if it's dead or proprietary, and are in some game consoles. They are even used in oscilloscopes to do the processing and to hold most of the scope logic. Although it depends on which FPGA you pick, they can be pretty expensive.

  • @baomao7243
    @baomao7243 7 months ago +148

    After doing all the hardware and software design and development…
    How do you not register at a university … then write it up, “defend,” then get a “free” Ph.D. out of this?

    • @xBintu
      @xBintu 7 months ago +32

      because university certificates don't mean anything when you're independently...

    • @arthemis1039
      @arthemis1039 7 months ago +2

      He could be Doctor Honoris Causa in some places for his work.

    • @baomao7243
      @baomao7243 7 months ago +7

      @@xBintu I actually do not tend to think of university titles as having much meaning anyway. I generally view tech university titles as both a trophy and a certification of an “original contribution to knowledge;” in other words, a union card and perhaps a “license to learn.”

    • @jshowao
      @jshowao 7 months ago +2

      Because a PhD requires original research, and probably none of what this guy did would fall into that category. Maybe tricking the Windows driver would qualify, but making a 1990s graphics card with already-known rendering algorithms probably wouldn't.

    • @jshowao
      @jshowao 7 months ago +3

      @@baomao7243 Well, you'd be wrong, as my university education taught me a lot of things, including FPGA design.
      It bothers me a lot that people view university this way, especially when many of them probably didn't even go to university and just regurgitate what everyone else thinks.

  • @dankoga2
    @dankoga2 7 months ago +12

    Nice that you showcased a homebrew passion project of one hobbyist. They need all the exposure they can get.

  • @l3v1ckUK
    @l3v1ckUK 7 months ago +35

    I recognise that screenshot.
    Descent!
    A fantastic game from the mid-'90s. I spent a lot of time playing it (with the sound off) when I was supposed to be revising for my GCSEs.

    • @kstarler
      @kstarler 7 months ago

      Do you think that was Descent 2 or 3? It looked like 3 to me, but I do remember having Glide for Descent 2. Edit: Just double checked the watermark, and it is Descent 3. I always preferred 2.

    • @l3v1ckUK
      @l3v1ckUK 7 months ago

      @@kstarler
      I had descent 1 at home, then bought 3 when I was at university a few years later. I'm assuming that screenshot was 2 as I recognised the HUD, but not the actual level being played.

    • @TheRealSkeletor
      @TheRealSkeletor 7 months ago

      Ah, yes. The original "DooM killer".

  • @aetheralmeowstic2392
    @aetheralmeowstic2392 7 months ago +133

    He should add native support for DirectDraw; that'd make it great for RPG Maker games!

    • @RandMV
      @RandMV 7 months ago

      Modern GPUs don't support it?

    • @puzzle9648
      @puzzle9648 7 months ago +4

      @@RandMV FuryGPU doesn't, though.

  • @momomaz2516
    @momomaz2516 7 months ago +27

    Now we're talking about a "DIY" computer.

  • @SaiGuy_
    @SaiGuy_ 7 months ago +119

    The only GPU without spyware.

    • @danielforeman8934
      @danielforeman8934 7 months ago +25

      Ooh, good point. I'll duplicate the GitHub repo and make drivers with spyware. Can't be missing that feature!

  • @I2ed3ye
    @I2ed3ye 7 months ago +8

    The number of different skillsets and expertise this takes for one person to produce is just astounding and incredibly impressive. Dylan Barrie deserves so much recognition for such an accomplishment.

    • @2DarkHorizon
      @2DarkHorizon 5 months ago

      He could write a book on how he made it; it would be a massively important resource for teaching others.

  • @MrA6060
    @MrA6060 7 months ago +8

    YouTube glitched and all I heard was "he had to install over 4 capacitors", and I was like damn, that sounds about right.

  • @clebbington
    @clebbington 7 months ago +1

    Thank you guys for covering such a cool passion project! Would love to see more coverage like this. Would even like to see some interview footage, if the person is comfortable with being on screen.

  • @dogwithoutw
    @dogwithoutw 7 months ago +40

    Finally, this guy is getting recognized by bigger media. I hope someday competition gets so high that we come back to 1080 Ti-era price-to-performance.

    • @doclangaming4076
      @doclangaming4076 7 months ago +7

      900 and 1000 series cards really were special in terms of value for money

    • @weil46
      @weil46 7 months ago +1

      This will never happen, as greedy as these companies are about future AI. Prices will increase, since the people who have money will buy.

    • @GangnamStyle33
      @GangnamStyle33 7 months ago

      @@weil46 A.I. is a joke.

    • @QuackZack
      @QuackZack 7 months ago +6

      If anything, the crypto boom of 2020/2021 assured Nvidia and even AMD that they can spit any inflated number in your face and you'd eat it and happily buy the product.

    • @weil46
      @weil46 7 months ago +2

      @@QuackZack As I said, the problem is the consumers; there will always be those who act on ego and buy whatever shit these companies make, at any price.

  • @michaelhuss0
    @michaelhuss0 7 months ago +28

    I'm a little surprised he didn't take the Intel Larrabee approach and use a software rasterizer without the FPGA... he really went above and beyond for this project. Very cool.

  • @bader51500
    @bader51500 7 months ago +7

    I wish he had open-sourced it, so that any company carrying on his work would be obligated to open-source theirs too.
    That way, it could be an entry point for companies that wish to compete in the GPU industry.

    • @smorrow
      @smorrow 7 months ago +4

      I think you're confusing open source and GPL.

  • @seanplace8192
    @seanplace8192 7 months ago +4

    I wonder if this could be a good solution for classic PC gaming. It's pretty easy to find old CPUs, but old video cards are sometimes hard to come by if you're after a specific kind. With an FPGA, it could just be a matter of loading the correct configuration file and having the card re-program itself!

  • @evilmasterskywalker
    @evilmasterskywalker 7 months ago +1

    I like this format; it would be cool to see more of these. Just deep-diving into what other people are doing and making a short story of it.

  • @Blinkerd00d
    @Blinkerd00d 7 months ago +1

    I'm an EE and do circuit design, and gotta say... bruh, ur killin' it.

  • @neoncyber2001
    @neoncyber2001 7 months ago +2

    I love this so much! You can learn to do anything if you are determined enough!

  • @Ericobab
    @Ericobab 7 months ago +1

    It would be an amazing case study for engineering students, with the A+ given to the best FPS-to-power-consumption ratio on a specific benchmark for the whole class.

  • @ErrorName001
    @ErrorName001 6 months ago +1

    0:42 Oh my god, I finally saw Ed Sheeran 😢😢

  • @thelaughingmanofficial
    @thelaughingmanofficial 7 months ago

    A lot of the tedium could be taken out of the soldering by using a Pick and Place machine.

  • @ScalebMF
    @ScalebMF 7 months ago +3

    Field-programmable Gatorade? 0:34

    • @bread1778
      @bread1778 3 months ago +1

      Heard the exact same thing too

  • @JasonFowler
    @JasonFowler 7 months ago

    This dude deserves all the attention. It's so awesome to see this. Mind blown.

  • @TreacherousFennec
    @TreacherousFennec 5 months ago +1

    This must be the parallel-universe equivalent of the shovel AK guy.

  • @dreamcrafter888
    @dreamcrafter888 7 months ago +4

    5:01 "Acutal footage" 😂

    • @RetroDotTube
      @RetroDotTube 7 months ago +3

      But it is

  • @DepressedMusicEnjoyer
    @DepressedMusicEnjoyer 7 months ago +1

    So cool to see a project I watched this guy developing in a small Discord make it here.

  • @BloodiTearz
    @BloodiTearz 7 months ago +1

    The ZZ9000 is a home-built FPGA-based graphics card for the big-box Amiga ^^

  • @z1g
    @z1g 7 months ago

    This is pretty cool. There is a guy (MNT Research) that did this for the Amiga a while back, since graphics cards for the Amiga platform are almost impossible to find, and when you do find one, it's over 1K US in most cases. He uses the same FPGA. I admire the brain power that goes into something like that. I also admire the brain power it takes to tie shoes, though. Where did my Skechers slip-ins go?

  • @TomsLife9
    @TomsLife9 7 months ago

    More broadly, the Zynq is an MPSoC (multiprocessor system on a chip): FPGA fabric, ARM processors (usually several cores), and a few RTOS (real-time operating system) cores. FPGAs are not persistent and need to be flashed/programmed every time they are powered up. This is where the ARM cores running Linux come into play: as part of its boot sequence, the system can run the software to flash the FPGA.
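The boot-time reflash described above can be sketched concretely. On Linux systems that expose the kernel's FPGA Manager framework, programming the fabric from the ARM side looks roughly like this (a minimal sketch; the bitstream file name is hypothetical, and many Zynq boards instead load the bitstream via U-Boot or device-tree overlays):

```shell
# Copy the bitstream where the kernel firmware loader can find it
# ("fury_top.bit.bin" is an illustrative name, not from the video).
cp fury_top.bit.bin /lib/firmware/

# 0 = full reconfiguration (no partial-reconfiguration flags set)
echo 0 > /sys/class/fpga_manager/fpga0/flags

# Writing the file name is what triggers the actual programming
echo fury_top.bit.bin > /sys/class/fpga_manager/fpga0/firmware

# Check the result: the state should read "operating" once programmed
cat /sys/class/fpga_manager/fpga0/state
```

This is a board-specific configuration sequence, so treat the paths as examples rather than something portable across every Zynq BSP.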

  • @pkt1213
    @pkt1213 7 months ago

    Absolute mad lad! Also, seeing Descent reminded me I need to see if I can install it on the kids' computer.

  • @NickB_864
    @NickB_864 7 months ago +7

    This is so impressive!

  • @ant9610
    @ant9610 7 months ago +2

    Maybe there's a future full of FOSS and libre hardware :)

  • @dekoomers
    @dekoomers 7 months ago +1

    This is cool! We've been led to believe that only big companies can produce products like these, so it's good to see you can DIY a graphics card!

    • @RippedSocket
      @RippedSocket 7 months ago +1

      Honestly, it's not far off from the GeForce 256 from 1999. They didn't have the term GPU yet, so they called it a "single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second".

  • @Rmm1722
    @Rmm1722 7 months ago +3

    Awesome work by him 👏💯🎉

  • @ratedRblazin420
    @ratedRblazin420 7 months ago +1

    Watch out letting LTT borrow this to review it, they might 'accidentally' auction it off for charity

  • @SlavTiger
    @SlavTiger 7 months ago +3

    It's been a while since I've seen someone design a video circuit outside of 8- or 16-bit computing.

  • @feederbrian9457
    @feederbrian9457 7 months ago +1

    Field Programmable Gatorade? Didn’t realize how far they’ve come in hydration beverage technology.

  • @himawariuzumaki1320
    @himawariuzumaki1320 5 months ago +2

    6:07 And MONEY, since you are not paid to do this.

  • @smorrow
    @smorrow 7 months ago

    2:45 It doesn't have to, you can rebuild a 3D printer into a pick-and-place

  • @StephenButlerOne
    @StephenButlerOne 7 months ago

    I've had one of these paper-like films on my iPad Pro (2nd gen) since day one, so 4 years. They are very good.

  • @rohansampat1995
    @rohansampat1995 7 months ago +1

    Great video; you also mentioned the limits of FPGAs, which is a real step up for this channel. Do you know if the Verilog is available? With that you could order an ASIC, right? Or would that require silicon manufacturing that we don't have?

  • @Harry_Bl44346
    @Harry_Bl44346 7 months ago +1

    Well done and great work!!

  • @ComikelZero
    @ComikelZero 3 days ago

    I wish he open sourced it because this would've been a big help to a project I'm going to start working on.

  • @Videoman2000
    @Videoman2000 7 months ago +1

    You are describing my day job; I have 20 years of experience in FPGA programming. It's a nice hobby project, and with it he will certainly get hired as an FPGA developer.
    (Not as a GPU designer for Nvidia or AMD; that's ASIC design, and a totally different ball game in terms of strictness and procedure.)

    • @ThatJay283
      @ThatJay283 7 months ago

      Although it makes sense that he chose an FPGA instead of an ASIC here: FPGAs can be reprogrammed, while ASICs can only be replaced, and this is a prototype.

    • @Videoman2000
      @Videoman2000 7 months ago +2

      @@ThatJay283 Yes, totally. But the video makes it sound like it is totally revolutionary and special. Everybody on my team should be able to pull off the FPGA part. (The HW and SW parts are handled by other teams.)

  • @moltres42
    @moltres42 6 months ago +1

    He should document and distribute this info... imagine what would happen if he had help:
    coders might clean up and optimise the information and add more support;
    hardware people could help it get built better, maybe even going back and designing better traces and such.

  • @yensteel
    @yensteel 7 months ago

    This is such a blessing for tinkerers, and a surprise! Many years ago there was a separate attempt, in which the guy gave up. He documented everything; it didn't reach 3D gaming.
    There are a lot of lessons to be learned, and a potential uni project to graduate with!

  • @AlexeyFilippenkoPlummet
    @AlexeyFilippenkoPlummet 7 months ago

    There was so much hate towards Linus and his projects (for only one real frak-up), but the Techquickie format and content is some of the best I've ever seen in this field: grabbing interesting and useful stuff, explaining it for toddlers (let's be honest, guys, that's the only way we would understand anything), and making it short and fun too. Keep up the good work!

  • @GraveUypo
    @GraveUypo 7 months ago +2

    Sometimes I find myself thinking, "what if every tech company went bankrupt and lost their tech?" Well, I guess we'd be okay. While we wouldn't recover immediately, I don't mind going back to '90s-level tech, to be honest.

  • @johnmartin1024
    @johnmartin1024 7 months ago

    Dear Techquickie Team, Thank you for this awesome roll-your-own GPU video. I thoroughly enjoyed all the technical details that were included. John M.

  • @Umski
    @Umski 7 months ago

    Wow, props to him for going the whole hog with this. The closest I can get to hands-on electronics as an EE these days is knocking up the odd 555 timer circuit PCB 😂

  • @159tony
    @159tony 7 months ago

    It's an impressive feat, honestly, especially writing custom drivers.
    Personally, if I don't need massive performance, I'd just use my phone and a Windows container wrapper and emulate older games that way.
    Every smartphone today trounces even some of the most high-end PCs from 2010.

  • @Henry14arsenal2007
    @Henry14arsenal2007 7 months ago

    Imagine the knowledge required to make even such a basic GPU alone. Now imagine how much more complex actual modern GPUs are.

  • @xeschire706
    @xeschire706 7 months ago

    I've been wanting to do something like this, but with either a modified RISC-V-based architecture, or something crazier like a custom implementation of none other than the 6502 architecture, modified and optimized for graphics processing, to create my own custom GPU and its respective architecture. Of course, I'm only going to aim for low-powered devices such as microcontrollers, custom retro-style game consoles and handhelds, and old or retro computers in general as a starting point before my homebrew GPUs can be taken any further.

  • @Enlelgaming
    @Enlelgaming 7 months ago +2

    At this point, someone is going to build GTA 6 before its official release.

  • @sharveshasreerajsreemurugan
    @sharveshasreerajsreemurugan 7 months ago

    He took "Fine, I'll do it myself" to the next level 🔥

  • @AFRONY791
    @AFRONY791 6 days ago

    Good luck to Barrie in improving this GPU!

  • @LimbaZero
    @LimbaZero 7 months ago

    Some FPGA dev boards have HDMI output, proper cooling, and almost look like a GPU, but they also have DIMM slots etc. Those usually cost 3-10 k€, though.

  • @BPBomber
    @BPBomber 7 months ago

    “I mean, you can, you’re just not gonna have a good time.” 😂 lmao Riley cracks me up

  • @SteveNetting
    @SteveNetting 7 months ago

    It would be really great to see the ZZ9000 covered too, as I guess that was a similar engineering process, albeit with additional ARM cores and RAM.

  • @Mailmartinviljoen
    @Mailmartinviljoen 7 months ago

    "Since you can't just insert a bare FPGA into a bare motherboard": when I was a kid, I stuck a Famicom cartridge into an ISA slot in my dad's PC. I don't know what I expected the outcome to be. The PC didn't turn on anymore afterwards.

  • @thanatosor
    @thanatosor 7 months ago

    New guy into FPGAs: "I'm playing god right here."
    FuryGPU guy: "No, it's just my experiment to learn more about hardware."

  • @dignes3446
    @dignes3446 2 months ago

    The performance is more like ~1999-2000 GPU level (if I am not mistaken); 60 fps for the Quake 1 demo was pretty amazing, and I don't think there was ANY graphics card in the "mid 90s" that could pull that off...
    Most "I built my own GPU/VDP from scratch" projects are more like late-'70s to mid-'80s level tech.

  • @DDracee
    @DDracee 7 months ago

    FPGAs are mostly for prototyping; if he really wanted to, he could get a chip made from the architecture he designed on the FPGA.
    Nvidia/Intel/AMD have the same workflow: they prototype architectures on FPGAs, and when they bench well, they get the CPU/GPU made.

  • @prosperomiponle7645
    @prosperomiponle7645 7 months ago

    I'm a computer engineering major, and I remember reading about whether it was possible to literally make a CPU on your own; the overwhelming consensus was that those sorts of projects are always industry-level investments, so I shouldn't bother. I had assumed that's also the case for GPUs, but this is telling me there's hope!

    • @Raletia
      @Raletia 7 months ago

      He's using an FPGA, so it's mostly board design plus designing and coding the circuits in the FPGA. However, the retro community is making lots of efforts to replace and recreate older custom hardware, like CPUs and special chips; I've seen people working on custom 8088 and 286 CPUs.
      I also saw a video the other day of a guy trying to make an actual die. It was for a 16-pixel monochrome camera, but the process is close enough to other types.
      It seems like a lot is possible now with enough drive and some luck, and maybe some creative investment in resources and equipment.

    • @RetroDotTube
      @RetroDotTube 7 months ago

      CPUs can be created now!

  • @vladislavkaras491
    @vladislavkaras491 7 months ago

    Darn! He did (almost?) everything from zero!
    It is really impressive what one man can do!
    Thanks for the video!

  • @sergshutk2757
    @sergshutk2757 6 months ago

    "Give me a place to stand and I will move the world." (c) Archimedes.
    With sponsorship and access to equipment, building a video card would be no problem.
    Step one: design it.
    Step two: build a prototype.
    Step three: write the drivers.
    The hardest part is the first stage, since it's the very beginning and you often don't know what needs to be done. Once you start to understand how a video adapter can be designed, everything becomes much simpler.

  • @SheddysGaming
    @SheddysGaming 7 months ago +44

    Insert Jurassic Park 🏞️ meme about scientists thinking "could" instead of "should".

    • @mrnorthz9373
      @mrnorthz9373 7 months ago +4

      Except in this scenario the "should" is as certain as reality, and the "could" is still unknown.

    • @milasudril
      @milasudril 7 months ago

      Can it run SGI Fusion?

    • @SheddysGaming
      @SheddysGaming 7 months ago

      Fair. Not every exercise in futility is a waste of time.

    • @GoldenBeans
      @GoldenBeans 7 months ago +2

      Picture this: an oppressive medieval kingdom. The kingdom has many elites oppressing peasants, who are defenseless; the elites own the weapons, after all, so they can do whatever they want. However, one day the peasants come up with the idea of making pitchforks. The peasants, while much weaker, with crude farming tools as weapons, can defend themselves to some degree.
      This is an analogy for why making open-source technology is not a waste of time. As companies slowly but surely get stingier and buying hardware becomes less viable for the regular consumer, an alternative on the way is always a good thing. "Fine, I'll do it myself."

    • @SheddysGaming
      @SheddysGaming 7 months ago +1

      @@GoldenBeans I will admit that, while I'm not certain this is the white-knight solution you're portraying it as, the effort is less of a waste of time than I originally considered it to be. Touché.

  • @MatthewSuffidy
    @MatthewSuffidy 7 months ago

    It is an interesting idea; I was just searching for FPGAs and GPUs. The problem, though, is that he has spent several years making an early-'90s GPU instead of just getting a 4090 or something. Realistically, this would become useful if someone wanted to start optimizing a memory bus or something. If you really want to start competing with the giants, it would need more effort. In a way, Arc was doing that.

  • @JsemPO12
    @JsemPO12 7 months ago +2

    He could make it open-source.

  • @epickh64
    @epickh64 7 months ago

    "You are only truly 'epic' if your name is written lowercase"
    becomes
    "You are only truly 'epic' if Techquickie made a video about you"

  • @Tonba1
    @Tonba1 7 months ago

    You can actually commission TSMC to make a wafer of your own design. Mind you, it will not be cheap; if I remember right, it's like 1300 per wafer (depending on process node), and I think the minimum order is like 100 wafers.

  • @MaverickBlue42
    @MaverickBlue42 7 months ago +1

    I thought trace lengths were more about signal timing rather than signal integrity...

    • @blahorgaslisk7763
      @blahorgaslisk7763 7 หลายเดือนก่อน +1

      It can be said that timing is part of signal integrity, since there are several signals and they have to be processed in parallel. That's a bit frustrating, as PCIe uses serial data precisely to avoid the problems parallel signals have. But any way you look at it, the signals have to be available at the same time. Signal traces can't be routed immediately next to each other or noise will be induced in the neighboring traces. Same with the zigzag of traces used to make them longer: those can't be too tightly packed or there will be coupling generating ghosts of the signal in the same trace. As frequencies go up, this just gets worse and worse. Even at pretty low frequencies it can cause problems if the traces are long enough. Once, a long time ago, I had to lengthen the ribbon cable between a POS terminal and its keypad. Not particularly high frequencies there, and decent voltages in the signals, and yet it was enough to occasionally corrupt the keypresses. I split the cable into individual strands, bunched them up a bit with zip ties, and it worked every time. This was a temporary fix, as the shop was going to replace its desks, but at the time I had to make the keypad fit into an old pocket on the desk. It was an ugly hack, but the customers couldn't see any of it.
      At the time I worked in component-level repair at a huge multinational computer company. I spent a lot of time studying signal integrity, and I had loads of oscilloscopes almost always wired to something that didn't do exactly what it was supposed to do.
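      To put a rough number on the timing side of this, here is a small sketch of how a trace-length mismatch turns into arrival-time skew. The ~15 cm/ns propagation speed (typical for FR-4 PCB material) and the example lengths are illustrative assumptions, not figures from the comment:

      ```python
      # Convert a PCB trace-length mismatch into timing skew.
      # Assumption: signals on FR-4 traces propagate at roughly 15 cm/ns
      # (about half the speed of light); the exact speed depends on the stackup.
      PROPAGATION_CM_PER_NS = 15.0

      def skew_ps(length_a_cm: float, length_b_cm: float) -> float:
          """Arrival-time difference in picoseconds between two traces."""
          delta_cm = abs(length_a_cm - length_b_cm)
          return delta_cm / PROPAGATION_CM_PER_NS * 1000.0  # ns -> ps

      # A 3 mm length mismatch between two 10 cm traces:
      print(f"{skew_ps(10.0, 10.3):.0f} ps of skew")  # prints: 20 ps of skew
      ```

      A few millimeters of mismatch is already tens of picoseconds of skew, which is why high-speed layouts use those serpentine length-matching zigzags the comment describes.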

  • @arisakathedappergoose4796
    @arisakathedappergoose4796 7 months ago +1

    so, new code to emulate old hardware on old rigs for older games?

  • @rnelson1415
    @rnelson1415 7 months ago

    Back in my day, FPGA meant flip-chip pin grid array.

  • @JoeCensored
    @JoeCensored 7 months ago +1

    Calling it the Fury GPU is confusing for a GPU that runs 90s games, seeing that one of the most popular GPUs of that era was the ATI Rage Fury, commonly just called the Fury at the time. It took me almost half the video to realize this has nothing to do with the old ATI product.

  • @wesleyfilips7052
    @wesleyfilips7052 7 months ago

    1:03 within cells interlinked

  • @sativagirl1885
    @sativagirl1885 7 months ago

    Linus would never expect his staff to prank him or optimize his disc drive.

  • @richieqs7789
    @richieqs7789 7 months ago +4

    Does it run on TempleOS?

  • @KeritechElectronics
    @KeritechElectronics 6 months ago

    Fury GPU? A nice homage to the ATI Rage Fury... back in the time of the very first GeForce launch, when Radeon was not even a thing!

  • @Jokerwolf666
    @Jokerwolf666 7 months ago

    The thing that makes me sad is that there's enough computing power in a standard modern GPU for it to run its own operating system and be its own computer, but the chip manufacturers don't seem to want that.

  • @Keys_9914
    @Keys_9914 7 months ago

    I want a longer video about this

  • @Maku98_
    @Maku98_ 7 months ago +1

    5:00 minor spelling mistake

  • @londonquares
    @londonquares 7 months ago

    This needs more attention!!!

  • @Aeturnalis
    @Aeturnalis 7 months ago +6

    4:04 skip ad

  • @chrismazur6148
    @chrismazur6148 7 months ago

    My respect to him. Not everybody has the mind to do this.

  • @SimoAtlas
    @SimoAtlas 7 months ago

    He should build a commercial version and compete with the big boys with a fully open-design, free and open-source GPU.

  • @alexmihai22
    @alexmihai22 7 months ago

    I may be interested in one with a plain PCI interface, to use in a Pentium 3 that doesn't have AGP. With 128 MB, drivers for Windows 2000, XP, maybe 98, and performance similar to an FX 5600. I may be interested in buying one if it really does the job in games of that period.

    • @DoubleMonoLR
      @DoubleMonoLR 7 months ago +1

      They reportedly made standard PCI (i.e. not PCI Express, though those were also available) FX 5600 cards at the time, along with other fast (for the time) PCI cards.
      You may get CPU-limited anyway with these faster cards, though it may depend on your specific CPU.

  • @KonuralpBalcik
    @KonuralpBalcik 7 months ago

    There are many states that want to develop such a thing domestically. China comes first, and Turkey is definitely looking for something like this too; the Turkish former 3DLabs developers (Yavuz Ahıska and Osman Kent) are still in the country.

  • @Xiph1980
    @Xiph1980 7 months ago +1

    The board took him one month?? What the!
    That's frikkin insanely fast!

    • @spookycode
      @spookycode 7 months ago +1

      He is an absolute 100x guy

  • @Kappi1997
    @Kappi1997 7 months ago

    At university I built a very simple GPU on an FPGA eval board as well. It was able to run my own version of Pac-Man :D

  • @bluenexus1212
    @bluenexus1212 7 months ago

    That’s wild, I was wondering just a couple of days ago whether anyone could do this!

  • @maxmouse3
    @maxmouse3 7 months ago

    If someone told me they were going to build their own GPU, I'd say they were crazy, but my man actually did it!
    Amazing! Great to see! Thanks for sharing it.
    I know FPGAs are incredible, but I thought a GPU was impossible. I'm glad I was wrong.

    • @Einar979
      @Einar979 7 months ago +3

      Graphics processing with FPGAs isn't really a new thing. There already exist several designs that emulate different Nintendo systems, which you can find online, and within specialized equipment FPGAs are used for graphics processing all the time.

    • @milasudril
      @milasudril 7 months ago

      Well, an FPGA could be quite feasible as a base for implementing a GPU. All you need is tons of ALUs, not many flip-flops. I guess a performant CPU would be much more challenging, since it needs to keep much more state between instructions.

  • @danielforeman8934
    @danielforeman8934 7 months ago +1

    That's an SMD device, so less soldering iron and more reflow station.