0:15 Dang. Nvidia is now #11, right next to TSMC, at $545B as of right now. I don't see any reason for it except the rumors that Nvidia is slowing down production to maintain high GPU prices.
it's a BS rumour that makes no sense lol, as nvidia sells the GPUs at the same price regardless of final price to consumers. Nvidia's stock price has everything to do with their dominance in data center and AI, very little to do with gaming sales even though these represent a good chunk of their revenue still.
@@elon6131 While I didn't follow that rumor, Best Buy did increase their prices on Nvidia cards, which is shocking new info. Nvidia received flak from some of their investors when crypto crashed in 2018 and dragged the stock down with it, the argument being that Nvidia had downplayed how much of its 2018 revenue came from graphics cards sold to miners.
@@zodiacfml And then Nvidia got mad at them and they had to roll it back. The FE editions are the only cards Nvidia sells directly; the final retail price is not in their hands, nor do they benefit from any markup the retailer puts on the product. The Best Buy increase was a Best Buy move (consistent with their pricing on the other cards, until Nvidia had them roll it back). Nvidia's FE MSRP has not changed. I am aware they got flak after the crypto downturn, though it was most certainly not malicious, as Nvidia cannot know who buys their cards once they are in retailers' hands (and lying on reports would be very, very illegal. You do not mess with the SEC).
Very informative! What a cool piece of history the GPU wars were! I remember playing Morrowind on the GeForce MX420 card back in 2002. I remember being dazzled by the water shaders. Give it 20 years from today and everything will be photorealistic. We're in for wild times coming into middle age.
1:15 - in 1991, triangle-database graphics engines were $M machines that needed a $M computer to run their databases and were used to provide displays for simulation flight trainers. I worked for a company that provided the simulation computer that requested a set of displays at 15/20/30/60Hz. Five years later those graphics machines were obsolete...
Very informative. In 1986 I joined a company in British Columbia, Gemini Technology, that had great ambitions to dominate the graphics card market. We designed an EGA chipset using Daisy workstations. I taped out the design at LSI Logic in Sept. 1987, a few months after IBM introduced the VGA. Earlier that year we had spent a month in Japan, at Seiko Epson's design center. I don't know why that effort failed, since there was little transparency there. After completing the design at LSI, I moved to Video-7, an up-and-coming graphics card company that had just completed a VGA chipset and was selling graphics boards. My LSI Logic EGA chip, surprisingly, found some customers. At V-7 we examined the IBM PGA, trying to reverse engineer the chipset, and I also helped integrate my previous design with a V-7 product. Gemini eventually failed and was absorbed by Seiko Epson. Video 7 merged with an LSI Logic division, but had some legal problems that required the CEO to wear an ankle bracelet. I continued designing custom chips and even ran into my former Gemini Tech co-workers at a Silicon Valley design center. Most of all I enjoyed the time I spent in Japan.
The picture of the motherboard at 20:24 - does it actually have onboard graphics, or is it what appears to me to be an 'APU', as AMD referred to them...?
For those interested: The IBM Professional Graphics Controller is called 'PGA' because the A can stand for either Array or Adapter. In modern English the letters 'A' and 'C' are commonly used interchangeably, much like the letters 'ß' and '§'. This can often be confusing for those who speak British English because IBM chose to use the uncommon 'Controller', instead of the classic spelling 'Aontroller'.
I worked at one of those 3rd party board companies in the mid nineties and so far this is the only presentation that gets it right. Great job. I was even offered a job at Nvidia, who directly recruited me in their drive to vertically integrate graphics drivers. I am sad I said no. But I could tell that Nvidia was going to win it all; on my visits, they shared with me their secret inner sanctum where their computers were situated, running that vaunted design simulation. Their record of getting chips out on first turn was unmatched. It's almost like your observation about TSMC - they would win with cycle times.
Excellent video. One thing you missed out was Matrox. I believe they were the 4th market player back in the early 2000s. Though Nvidia GPUs usually cost more than their rivals, stability is something I always look out for, and both Nvidia and Intel provide it. The red team has time and time again disappointed me in its quality. The introduction of the Radeon Vega APU, though, changed the whole game for budget PCs again. I hope Intel's newest foray into the GPU space will bring a new level of benefits for consumers. Some of the Nvidia GPUs I have owned and used:
- Riva TNT
- GeForce 2 MX
- GeForce 6600 GT
- GeForce GTX 660 Ti
- GeForce GTX 770
- GeForce GTX 1060
Unfortunately, bloody crypto miners are ruining GPU prices for gamers now.
Great video! There's also the issue where 3dfx kept suing nVidia, who just patiently bore the brunt of the lawsuits until they finally bought them. It's a good business lesson: don't worry about doing things your competitor might sue you for if you can plan to eventually buy them out.
This is an exceptionally objective and accurate description of the circumstances at the time. I was there, man! 😂 Of course I loved 3dfx. Doom sold more hardware than anything!
What a great video. I’ve just arrived in Taipei and hopefully after government quarantine I’ll get out to do some tech-related sightseeing. If anyone has any recommendations please let me know.
I can only speak to my experience as a PC builder since the late 90's. It was the drivers. At least in my experience, Nvidia drivers were generally more stable than their peers'.
Uhh, NO. My Voodoo worked just fine out of the box while my friends with Nvidia TnT cards were always complaining about constantly patching. To this day Nvidia's rapid patch schedule is annoying, because every time I patch I need to reset my resolution scaling. I just stopped patching as a result.
@@cpt_bill366 You're going way back there, brother, when we had to mess with IRQs and such. I think my first was a GeForce 6600 on AGP. Never messed with ISA cards so can't speak to them. God, IRQs were a pain in the arse, along with 9600 baud modems.
I worked as Director of Development at a major SGI integrator, and we later added Sun Microsystems. I became CTO of a telecom manufacturer who partnered with one that had licensed equipment manufactured in Taiwan.
17:05 - 2008 - The company I worked for had a choice between ATI and nVidia for a general-purpose computing solution using graphics processors for a defense platform. Our team spoke ATI as a first language, which was hieroglyphics; the customer wanted performance and nVidia, but couldn't get both. We settled on ATI. That product lasted one generation. CUDA led the way. And the customer did not need our specialized hieroglyphics programmers...
I owned every one of these early graphics cards and monitored Carmack’s .plan file eagerly in these years. The timeline is not entirely clear and some events and declarations end up seeming intermingled, albeit not intentionally.
It’s a shame how hard it is presently to get hold of the new 3000 series GPUs from Nvidia. Really hope this changes. For a long time I built my PCs with AMD, but with the new GPUs they came out with, gaming is more affordable.
This is such great stuff! I’m completely gobsmacked by the anecdote about how 3dFX burned its customers due to an ill-conceived business idea - and then lost the entire game! 🤯
Nice video, and it was incredible to see that little graphics card that Intel made in the '90s that went nowhere... same with Arc now. I don't understand why the number one CPU maker in the world can't just make some competitive, good-performing GPUs. They have experience designing chips and the financial resources.
I remember when I first had a go with Linux, it was Debian. I couldn't figure out how to install drivers. Right up until the point you realise they just get detected and loaded as kernel modules at boot or, at most, pulled in with an apt-get command. Certainly was an eye opening moment in my journey as a software engineer and architect.
Interesting video but I am confused by your request we sign up to your channel for videos about Asia. What exactly do graphics cards have to do with Asia?
Somehow Nvidia got the devs of machine learning frameworks to use their proprietary CUDA API instead of OpenCL or GLSL. This made a huge impact making their GPUs the only ones capable of AI. It would be interesting to have a video of why/how all developers chose CUDA. It still seems a strange choice locking them to a vendor.
I'm not an AI guy, but... GLSL is OpenGL's hack for opening up the graphics pipeline to software - a C-like language to process vertex and pixel data. While it can be used to process data, it needs at least another API layer above it, because it presents entirely the wrong abstractions for general computing. And the language compilers aren't anything to write home about either. OpenCL is that API layer above GLSL, and it's not a great API from what I've read. So CUDA being more popular is not at all surprising. Vulkan seems to be a lot better though - it presents GPU and CPU cores (or any other type of processors) the same way, with the same memory model. This was the whole AMD homogeneous compute idea, and they've been executing it on the hardware level for more than a decade; now the software is there to provide a decent programming model for it. TLDR: there's no use making brilliant hardware if nobody can write software for it. Which ABI, API and software tools you present to a programmer matters a lot.
@@PaulSpades CUDA wasn't so good at the beginning - just a proprietary way to write shaders. But I agree that AMD and the rest (Intel?) neglected what impact this would have a few years later.
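For anyone who hasn't seen what the thread above is describing, here is a minimal sketch of general-purpose work expressed directly as a CUDA kernel rather than squeezed through a graphics shader API. It is only an illustration, not code from the video or the commenters; the kernel, names and sizes are made up, and error checking is omitted.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One GPU thread per output element - no vertices, pixels or render targets involved.
__global__ void vector_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Plain host arrays.
    float *h_a = new float[n], *h_b = new float[n], *h_out = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Separate device allocations plus explicit copies across the bus.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(d_a, d_b, d_out, n);
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

    printf("out[0] = %f\n", h_out[0]);  // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    delete[] h_a; delete[] h_b; delete[] h_out;
    return 0;
}
```

Compiled with nvcc, that is the whole program; the abstractions are arrays, threads and blocks rather than textures and fragments, which is arguably why ML framework authors gravitated to it.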
@@William-Morey-Baker I should also add that that kind of "gotcha" thinking can be very toxic and is often used to reinforce anti-intellectualism. Be careful to avoid bias.
@@javaguru7141 i kinda had the same thoughts as william morey-baker. this wasn't just a slip of the tongue - this is basics; and if he can make a mistake in this, it makes me question his knowledge about computers. i wonder if he's just someone simply reading a script without understanding what he's reading, which i've seen happen too many times in other youtube videos. p.s. this is a very simple point that william is making. big phrases like "toxic thinking" or "reinforce anti-intellectualism" are completely out of place here. sounds more like you saw this whole sentence somewhere else and decided to paste it here to appear smart.
@@epposh *sigh*... I won't bother opening a conversation about the latter. But regarding the former, I watched the video again and I'm pretty convinced it really was just that - a slip of the tongue. NVIDIA really did start pushing GPGPU around that time. The acronym for that is General-Purpose GPU. You could easily backronym that as "General-Purpose Unit". I wouldn't be surprised if someone at NVIDIA actually uttered those words at some point. The important information he was trying to convey was accurate.
Great document about GPU history! I'm old enough to have actually bought a Voodoo graphics card with my own money when it was new. I feel that the drivers of NVIDIA cards are still the most important part of the technology. It appears that AMD (ex-ATI) hardware may actually have more computing power, but the drivers fail to surface it to applications. In addition, OpenCL took too many years to create, and NVIDIA's proprietary CUDA took the market; many apps (including the open source Blender 3D) have proper hardware support for CUDA only, even today. AMD is supposed to have something cooking for Blender 3D version 3.5, but the actual performance remains to be seen.
I was running a computer shop between 1997 and 2000, and I saw brands come and go. I sold a lot of 3dfx Voodoo1, Voodoo2 and Banshee cards. But in 2000, the Riva 128 and Riva TNT were arriving and changing the face of the market. NVidia's winning move was to handle the drivers by themselves. Even with ATI there's always a wheelbarrow full of compatibility issues, even today.
I recall getting these graphics boards as hand-me-downs from our stock trading floor. Traders were upgrading their desktops in the 1990s at will, so why let last year's model go to e-waste. PS - are you going to do a part 2 on the impact of digital currency on the graphics market?
Hope you enjoyed the video. Like and subscribe, etc. etc.
For other company profiles, check out the playlist: th-cam.com/play/PLKtxx9TnH76Qod2z94xcDNV95_ItzIM-S.html
Excellent as always, thank you!
Wait, why was "Going Fabless" written when switching from SGT to TSMC? The SGT deal was also fabless because Nvidia didn't own any fabs then either
So good.
Excellent review of the history. Seeing Doom and Quake reminded me of the simple days of getting Doom running and just wondering what that noise was behind me. :)
How did you comment before the video started
My dad worked at Zeng Labs starting in 1996 before it was bought by ATI. Stayed on until Ati was bought by AMD, and stayed on at AMD during the GPU wars. He left AMD in 2005 or 2006. His team built the first Radeon chip.
You mean Tseng Labs?
Lies !
I had a Vérité 2200 powered by Tseng Labs. It was a disappointment.
@@truthsocialmedia I think you might be confusing something. V2200 is entirely Tseng-free. And Tseng never finished its 3d chip, at least not before going belly up.
But it was indeed fairly useless.
So?
One correction:
NVIDIA did coin the term GPU in 1999.
GPU is short for Graphics Processing Unit (not General Processing Unit).
GPGPU is the term for General Purpose computing on GPUs. (as far as I know it was not coined by NVIDIA though)
General purpose computing on GPUs started to become more common after programmable shaders were introduced in 2001, with the NV20.
Great video. Loved hearing again about the early days of NVIDIA. There's a lot more to the story for sure, but this hit all the right notes, thank you!
I've worked for NVIDIA from 1999 through today. I lived through a good portion of this, it was (and is) exciting.
What was your university degree in?
@@mingdianli7802 I dropped out of high school when I was 13
I thought it was 3dfx
Can you get me a GPU?
Maybe you could collaborate with Asianometry - a new concept video, an interview video.
I used to work in the video card industry in the valley. I helped design video cards and all chip companies like Nvidia, Tseng labs, Rendition and 3D labs would come and show us their latest wares. We actually made a 3DFX Voodoo card that sold very well. At one point I put together a video card that was both PCI and AGP on a single card. You would just flip the card over and change the bracket to use the card with the other bus. Wow such memories.
The first idea that led to USB-C and the Apple charger.
You mention the IBM PGA, but your presentation seems to ignore other 2D graphics adapters (with and without acceleration) that rose to prominence with Windows before 3D graphics. Also there were actually a number of graphics chip design houses that were already on the scene creating chips with 3D acceleration (or deceleration depending on who you ask) when nVidia and 3dfx arrived on the scene. Admittedly none were particularly good by comparison.
Also every last IHV on the market was writing device drivers for their graphics accelerator chips. The system OEMs didn't write squat. OEMs contracted out all the productization and customization to board manufacturers which hired their own device driver development teams for that purpose. At least that was true in the '90s.
Admittedly nVidia always had a substantial device driver development team practically from the get-go. But that was actually for self-serving reasons that probably weren't apparent to the public. The early nVidia designs had something unusual - its own integrated command execution pipeline tied to its own DMA channels. However all of the less common commands in the pipeline were actually virtualized and farmed back out to the host CPU for simulation. To accomplish this virtualization, nVidia needed a much larger driver team and that team needed to be more involved with the silicon design team. That's actually what initially drove their vertical integration - they just couldn't rely on board manufacturers to address issues in this chip virtualization system - though at first they tried.
Also about the demise of 3dfx: Before 3dfx approached it, STB Systems was probably the largest or second largest contract graphics board manufacturer in the world. All the major OEMs bought cards from STB Systems. But unlike companies like Diamond and Matrox, STB Systems did not sell self-branded retail products to the public pre-merger. Instead its business model was to take sample silicon from graphics chip makers, spin a board and customize the driver with its own library of graphics optimizations. (The optimizations were the secret sauce it used to sell to OEMs because they impacted benchmark numbers.) It would then offer these sample boards up to the OEMs and each OEM would order SKUs, usually from the two best performing chipsets. This model kept STB Systems' Mexico PCB factory line near 100% capacity for several years.
Before 3dfx made its offer, STB Systems had seen huge success with both the nVidia Riva 128 and the TNT. At the time of the merger announcement, about 90% of STB Systems' sales were nVidia TNT-based boards and every major system OEM was buying them. Post-merger announcement, nVidia obviously refused to offer any new chip designs to STB Systems. What's worse, 3dfx had never been forced to meet an OEM development cycle and even if it had, their new Banshee was at best competing with the nVidia TNT (and not even beating that) and not the current generation silicon.
When 3dfx and STB Systems merged they were flush with something like $100M in cash. However STB Systems had done a capital lease financing arrangement on its Mexico production line and those multi-million dollar lease payments had to be made each month whether the production lines were producing products 3dfx/STB could sell or not. It didn't take very long before the fabs in Mexico were idle and the company was staring bankruptcy in the face, because the few wholly 3dfx-based boards they produced sold only a tiny fraction of what the Mexico fabs could spit out. Also STB Systems had just built a new headquarters that it had to pay for and all those engineers STB and 3dfx had on staff didn't work for free.
So it wasn't too long after the merger that they went looking for suitors. nVidia cut them a deal to buy out their intellectual property and hire their device driver and firmware developers. The remains of 3dfx entered bankruptcy and the support staff were shown the door.
Regards,
Joel Corley,
Windows Device Driver Developer,
Recently Retired from Microsoft,
Formerly a Developer for STB Systems and 3dfx...
RIP who read this all along
@@dbcooper. it is a nice read
@@dbcooper. It's a very interesting read.
😱 that was one hell of a long comment
Thank you for your post, it's a very interesting read.
My uncle used to work for SGI back in the early 90s and he said that at some stage he remembers some people pitching the idea of a consumer graphics card and they basically got laughed out of the room.
He was also telling me they basically invented video on demand but at the time the bandwidths available were too low to be practical and the project was eventually abandoned.
I can confirm the first point. It happened during the worldwide sales conference around 1992 or 1993. Audience of about 500 sales and engineers.
Top management dismissed the idea.
Some engineers mumbled their disagreement in the back of the room.
The rest is history...
Regarding VoD, that is right too.
But it was not abandoned, it was spun off into Kasenna and had many commercial successes, including at Tier 1 operators.
Probably laughing out loud while filing for the patent.
@@TheUtuber999 Not quite, but a few SGI engineers were listening and agreed. Those are the guys who left shortly after and founded 3dfx, using their experience to design that first Voodoo chip.
B2B and B2C distribution are very different beasts. SGI may well have been capable of solving the technical challenge but then there is the go-to-market strategy that has to match the customer. Jon gave us right in this video the example of how you can fail with 3dfx trying to go from B2B (selling chips to gfx card makers) to B2C (selling finished cards to consumers). Fail you will if not prepared for differences in sales and marketing approach.
From my experience in the industry, it was how Nvidia released driver updates for their cards - sometimes 5 years after they stopped selling them, which was practically forever compared to the other players, who released graphics cards with buggy drivers, never fixed them, and stopped releasing updates after 1 year. In the early days of Windows, a graphics driver problem would result in a BSOD and loss of work, with no real clue what the actual problem with the PC was. I saved the company I worked for thousands in warranty repairs by only using reliable and supported graphics cards and ditching all the other brands. Just about all the other PC components would clearly show themselves if there was a fault. Graphics card problems cost a fortune to sort out: customers bringing back machines that worked all day in the workshop but would BSOD when the client was doing some unusual operation. I would return the cards to the OEM, they would test them and send them back. A nightmare of epic proportions.
Such a trip down memory lane ! I remember living through it all, but of course back then we didn't have all this information about how it all came to pass. Thanks, man, I enjoyed that.
I had Nvidia's Riva 128 in my first computer, but immediately bought a Voodoo 2, because the Riva was such trash. It only sold as an OEM graphics card. I don't even know if any game actually worked well with it. Not that it was NVidia's fault, because DirectX was such a bad joke at that time:
"Plug and play" -> "Plug and pray"
"3D accelerator" -> "3D decelerator"
At least DirectX had built-in software rendering support, so you could play games with it as a last resort.
@@playnochat Indeed, I also remember the Riva as quite trash. In fact, I even had to play FIFA 99 in software mode! Why did I pay for that card??
What a ride it was, man!
There's clearly a follow up to this as NVidia GPUs begin to get used in things like neural networks, crypto mining and cutting edge supercomputers... You could argue that without video gaming we wouldn't have had 'the third wave' of AI we now see is so transformative.
Agreed. The never ending quest by gamers for realism (physics, compute shaders, etc) helped pay for the R&D that later opened up all those other needs.
There's also a follow-up to this, as Nvidia is now not so friendly with its board partners (AIBs) anymore. One of the bigger names in North America, EVGA, just dropped out.
They're clearly trying to move to a point where they manufacture their own products and don't have any board partners at all.
The latest boondoggle shows the lack of coordination between Nvidia and the AIBs: the move to a high-power 12V power connection from the PSU to the GPU. Nvidia, I'm sure, did a lot of research on this connector, and the way it's mounted on their own branded boards is different from the way probably all the AIBs do it - and now AIB GPUs are creating a fire hazard on the brand new RTX 4090, while Nvidia's own GPUs don't have the problem. Nvidia is probably the company that wanted the AIBs to move to this high-power connector (AMD doesn't use it) because Nvidia developed the standard for it, but most likely failed to give the AIBs ALL the research data that went along with the development of this connector.
Nvidia going for General Purpose GPU computing with CUDA was a genius move. As a GPU computing expert doing physics simulations, I can say it is now a significant performance and cost benefit in the high performance computing sphere.
Didn't AMD try to do the same thing with OpenCL? I'm a total noob compared to you, but I remember OpenCL being praised for being open-source, unlike CUDA. Did something go wrong? Or is AMD competitive in that area?
@@minespeed2009 I've recently started learning deep-learning and yeah, I'm a bit scared of being trapped in the Nvidia ecosystem the way I got trapped into the Microsoft ecosystem.
@@TheNefastor Just wait for the XILINX acquisition to be complete. OpenCL and FPGA processing will compete favorably against CUDA. Also, CUDA uses an API-oriented architecture, while OpenCL is written in machine code and executed directly.
So far the NVIDIA ecosystem has gained traction for sure. BUT AMD is on a roll, and I wouldn't be surprised if they somehow developed new synergies with their x86+GPU+FPGA ecosystem all under a single roof.
@@franciscoanconia2334 that would be great since I speak VHDL too 😄
how does it compare to openCL? why did you choose to use cuda instead of openCL?
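As a rough illustration of the kind of physics workload described earlier in this thread (my own sketch, not the commenter's code - the struct, constants and names are invented): every particle is updated independently, one GPU thread per particle, which is exactly the data-parallel pattern CUDA targets.

```cuda
#include <cuda_runtime.h>

struct Particle {
    float3 pos;
    float3 vel;
};

// Explicit Euler step under constant gravity along -y; dt is the timestep.
__global__ void integrate(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    p[i].vel.y += -9.81f * dt;     // accelerate
    p[i].pos.x += p[i].vel.x * dt; // then advance the position
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

// Host-side launch helper: cover all n particles with 256-thread blocks.
void step(Particle* d_particles, int n, float dt) {
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    integrate<<<blocks, threads>>>(d_particles, n, dt);
}
```

An OpenCL kernel for the same update would look almost identical; the difference the thread is debating lies mostly in the surrounding tooling, libraries and driver quality rather than in the kernel language itself.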
This reminds me of another game changer back in the day: Sound Blaster. Would be nice to have a recap of the rise and fall of Creative.
AWE32... The first time I played with a true MIDI synthesizer. It blew our socks off back then. I remember playing too much Rise of the Triad just because the music was so good.
Excellent suggestion given the legacy of Creative Technologies. A technology hardware company founded in Singapore and sufficiently 'global' to achieve a NASDAQ listing. This was truly rare 20 years ago.
Agreed! I still have an Audigy2 card - which was (and probably still is) superb - but no good driver support for the latest OS's. It did things that no sound driver/card I've seen in ten years has been able to do. It seems Realtek do things /just/ well enough that it's not worth shelling out for anything better unless you are a real enthusiast.
@@TheNefastor I was a little kid back then and everyone would crowd in front of the family computer to marvel at the interactive game play with sound effects better than the arcade. Somehow I wish I could mod my PC and play those games again.
@@daviidgray1681 Yep. Very rare indeed. I had a glimpse of the founder himself when he was a humble computer repairman in a rickety old shopping mall. A hardworking and honest man who repaired my brother’s computer. Little did we know he would go on to much bigger things.
Also important was how Nvidia were able to recruit whole teams from SGI when Microsoft announced they were doing Fahrenheit with SGI - this was in the days when Microsoft also bought SoftImage and wanted to take over the workstation space with Win NT, causing SGI difficulties apparent to all their staff. It wasn't even seen as a big deal for SGI staff to defect. So they did, taking their knowledge of the fundamentals with them.
Now I understand why Windows NT was also built for the MIPS architecture, the one used by SGI.
Nvidia has always played dirty.
I hope this video makes you a lot of money over the years. Simply one of the best lectures about the history of graphics cards on the net.
22:00 pretty sure they mean Graphics Processing Unit, not general processing unit.
Great video. Personally, I would've liked a section that goes more in depth into the recent years, as well as the ARM deal. Also, I'd love a video similar to this one for AMD, as well as one on the merger between AMD and Xilinx. Hopefully some of these topics will make it into future videos!
3:31 SGI was so far ahead of everyone at this point that they could easily have been the size of Nvidia today. SGI created the graphics hardware for the N64, and the N64 dev kits were SGI workstations. What an incredible blunder they made not getting into the consumer space.
I remember the graphics card wars.
I remember when graphics cards cost less than a complete desktop PC set.
Superb video with great explanations! Keep these videos coming!
Insanely interesting, having grown up using computers in the whole period covered by this video, it's super cool to now understand all these brands, technologies, terminologies, strategies and decision making by the companies. Thank you so much for the information - I thoroughly enjoyed it. Please keep up the good work.
Didn't some of the top engineers at Nvidia come from Silicon Graphics Inc? I think it's fascinating how people moved around between various companies. From the 70's to the 90's, I think fewer than 100 people really shaped the whole home computer market. Probably fewer than 50 in the chip and motherboard design space.
Your videos are all very well researched and very interesting. Thanks to you and your team.
Oh dude! I remember those old Voodoos - they had a cable in the back of the PC that went from the video card into the 3D card. My first was the Banshee, 16MB of power! It was awesome at the time.
Actually, Wolfenstein 3D was the first first-person shooter. Its wide distribution in the BBS scene paved the way for Doom's success.
One could argue that Ultima Underworld was the first one as it was released two months before Wolfenstein.
3Dfx - I haven't heard that company name in a long, long time. The first graphics card I ever bought was a 3Dfx. I remember buying it like it was yesterday. I hadn't played games on a computer since the Commodore days in the 80s, but I had very fond memories of playing games all weekend and all summer with the other boys from the neighborhood, so when I was finally on my own I went and bought a PC and a game. I took it home and tried to play it. After getting practically zero framerate, I read the box and realized I needed a GPU. I knew what they were, but had no clue how to shop for one, so I started researching, and over 20 years later I still haven't stopped. That 3Dfx card was what got me interested in how PCs work and ultimately had me going back to school to take as many classes as I could on the subject.
I love this video - it's like what I lived through in the 90s, trying to build and constantly improve my frankenstein machines. All the churn with the video cards takes me down memory lane (3Dfx, Intel i740, D3D, OpenGL, AGP vs PCI, Real3D, Voodoo 1, 2, Banshee, etc). Thanks!
damn, i have never once made the connection between Nvidia and the Latin Invidia. this is the second time this has happened to me. first time being the connection between Asus and Pegasus. so much for the creativity part of my brain.
If you put it this way, I think Nvidia's name is saying that everyone else who isn't them is jealous - 'envidia' meaning envy in Spanish.
I have read another theory saying the name came from the similarity to the word "video"
Lol I feel the same way
Odd that Matrox didn't get a mention. The Matrox Millennium was a top-selling card for a while. Intel's MMX extensions were also a resource Quake utilized that helped Nvidia piggyback. But otherwise this doco was very on point.
Matrox was only able to compete for a while because the 2D performance of their chips and their VGA DACs were among the best out there, especially at higher resolutions. But when more and more games shifted to 3D and DVI became a thing, they were left in the dust.
Quake does not use MMX.
How about S3?
@@andycristea agree
I had a Matrox card specifically for their good 2D performance (at the time I didn't care about 3D games).
I love your channel! I'm a mechanical engineer who switched from heavy machinery to semiconductor fab equipment engineering (CMP). Your videos have been so helpful to me learning how all this crazy wafer stuff works.
Great video. I wish you'd do one on Matrox as well.
Just one small addition: When Intel pushed onboard graphics, where the graphics memory was part of the CPU's main memory, it was thought that this video solution would actually be faster, since the CPU would have direct access to the frame buffer, as well as having all of the resources to access it (cache, DMA, memory management, etc). The reason they lost that advantage in the long run was the dual advantages of VRAM, or dual-ported video RAM - a RAM that could be read and written by the CPU at the same time as being serially read out to scan the video raster device - and the rise of the GPU, meaning that most of the low-level video memory access was handled by a GPU on the video card that did the grunt work of drawing bits to the video RAM. Thus Intel instead ran down the onboard video rabbit hole. Not only did they not win the speed race with external video cards, but people began to notice that the onboard video solutions were sucking considerable CPU resources away from compute tasks. Thus the writing was on the wall. Later, gamers only knew onboard video as that thing they had to flip a motherboard switch to disable when putting a graphics card in, and nowadays not even that. It's automatic.
Shared CPU-GPU memory is still a good idea; sadly, the graphics APIs didn't take advantage of it one bit, and I think Intel's graphics drivers were always a bit crap. As such, only game consoles and some mobile devices properly implemented a memory pool with both CPU and GPU access. The Vulkan API seems to treat GPU and CPU cores symmetrically, and shared memory on the Zen platform should finally fix this on mainstream PCs.
At this point, modern CPU cores support SIMD with the AVX extensions, and I don't know if there's any point having separate graphics-oriented APIs and architectures for GPUs... the architectures seem to have merged toward each other (GPUs getting more general instruction support and CPUs getting more parallel data instructions).
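A small sketch of the "one memory pool visible to both CPU and GPU" idea described above, using CUDA managed memory as a stand-in. This is illustrative only; on discrete cards the driver still migrates pages behind the scenes, whereas consoles, APUs and many mobile SoCs genuinely share the physical memory.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Multiply each element in place; one GPU thread per element.
__global__ void scale(float* data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float* data = nullptr;

    // One allocation, addressable from both host and device code.
    cudaMallocManaged(&data, n * sizeof(float));
    for (int i = 0; i < n; ++i) data[i] = float(i);    // written by the CPU

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);    // modified by the GPU
    cudaDeviceSynchronize();                           // wait before the CPU reads again

    printf("data[10] = %f\n", data[10]);               // expect 20.0
    cudaFree(data);
    return 0;
}
```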
3Dfx was a typical cash-burning startup of the Dotcom era of the late 90s. They captured the PC gaming market at the right time with a simple, working solution, but they always struggled to evolve, with unusually long R&D cycles, while relying too much on recycling their old and proven tech. But alienating the OEM partners would be their biggest undoing in a long string of missteps... and they had no chance competing with Nvidia's 6-month release cadence.
The greatest move NVIDIA did was acquiring the marketing genius Brandon “Atrioc” "glizzy hands" Ewing. He pitched and developed the iconic Greenout campaign. He also pitched "Frames Win Games" Esports campaign, which is currently the flagship Esports campaign for NVIDIA and its hardware partners. Truly the GOAT of marketing
Agree
A follow-up on the GPU wars would be awesome.
This was one of those videos that I've really enjoyed watching and learned something new... Good job SIR!
Thank you for covering Nivea. Love their lavender infused moisturizing cream.
LOL. Nivea is pissing off a lot of people. They are going to stop producing their 30 series video chips this month. At least Intel is going to produce new video cards. There seems to be a never-ending insatiable appetite for video cards. I'm wondering when this shortage will end, if ever. They're saying the shortage will remain for 18 more months. At this point, I don't think people will care who makes the card or what the performance is as long as they get SOMETHING which is stable.
@@Cypherdude1 Nvidia isn't going to stop producing 30 series cards this month
@@Cypherdude1 Look again at the spelling of the above comment. Nivea makes cosmetics.
😂
@@mattmexor2882 This makes my purchase of an already overpriced 3080 Ti worth it. They're doing this to keep prices high, I hear.
This is an excellent channel. I'm definitely becoming a Patreon supporter; this content is so well curated, and it's just technically excellent.
The Cromemco "Dazzler" was the first graphics card in 1976. (and you're about 10 years too late for the graphics revolution - the Amiga was doing all of this in 1985 as well as the GUI. The screenshot of GLQuake you show is actually the software renderer - it doesn't use bilinear filtering. If you go to the place you sourced the image you'll see you took the wrong screenshot - this is a software shot next to the GL shot for comparison)
Jeez - I remember all these graphics card revolutions in the 90s: ATI, RIVA TNT, 3dfx Voodoo. It was the time when the AGP standard was introduced, before being replaced by PCI Express.
- Excellent.
- Thx for all your effort, and talent.
- I got into computer graphics in the early 90s, so I experienced all the history you covered, but did not know any of the back story... until now. Thx, again.
Don't forget that Nvidia never stopped selling their GPUs to third-party companies, so today there's a myriad of brands using Nvidia chips to choose from.
great video! 👍🏻 could you please make a video about Matrox, 3dfx, Number Nine? thank you
Nvidia did a lot of dirty work to win market dominance~ the CEO is cunning. That's what makes a successful company.
Yeah most companies have been stabbed by Nvidia but they have so much money that they don’t care.
I still remember their Voodoo Graphics Card
18:37 I remember STB. They made a good Tseng Labs ET4000/W32 board that ran well with OS/2.
Now this is the video I was looking for! Good stuff dude 😎
I was young during the days of games like Doom 2, Quake 1 and 2, Red Alert, etc., and I remember being able to run these games perfectly on our 486 machine without any discrete graphics installed. I know my dad, and he hated the idea of paying extra for discrete cards, so I'm sure there wasn't one in our system.
We did have a Sound Blaster though, and the games looked and sounded awesome!!
Great video, and I loved learning the history behind the giant. If I've read correctly, a GPU is a Graphics Processing Unit and not a General Processing Unit as you mention towards the end of the video - en.wikipedia.org/wiki/Graphics_processing_unit. The rise of AI and ML made these chips popular because they are faster by orders of magnitude than a regular CPU at more or less the same power consumption. Nvidia is leveraging its know-how in developing graphics chips to now capture this market.
7:25
A video on Malta and the fabs it has managed to lure in might be interesting.
Is that Malta the country, or Malta in NY?
@@RandomByte89 the country of course
@@RandomByte89 If Malta in NY, THEN Global Foundries
Great show, thanks Asianometry.
My whole gaming world improved when I installed the Voodoo card, it was as different as night and day.
0:15 Dang. Nvidia is now #11, right next to TSMC, at $545B right now. I don't see any reason for it except the rumors that Nvidia is slowing down production to maintain high GPU prices.
It's a BS rumour that makes no sense lol, as Nvidia sells the GPUs at the same price regardless of the final price to consumers. Nvidia's stock price has everything to do with their dominance in data center and AI, and very little to do with gaming sales, even though those still represent a good chunk of their revenue.
@@elon6131 While I haven't followed that rumor, Best Buy did increase their prices on Nvidia cards.
That is shocking new info. Nvidia received flak from some of their investors when crypto crashed in 2018 and took the stock down with it; they argued that Nvidia had downplayed how much of its 2018 revenue came from graphics cards/mining.
@@zodiacfml And then Nvidia got mad at them and they had to roll it back. The FE editions are the only cards Nvidia sells directly; the final retail price is not in their hands, nor do they benefit from any markup the retailer puts on the product. The Best Buy increase was a Best Buy move (consistent with their pricing on the other cards, until Nvidia had them roll it back). Nvidia's FE MSRP has not changed.
I am aware they got flak after the crypto downturn, though it was most certainly not malicious, as Nvidia cannot know who buys their cards once they are in retailers' hands (and lying on reports would be very, very illegal. You do not mess with the SEC).
@@elon6131 good one.👍
crypto mining
Very informative! What a cool piece of history the GPU wars were! I remember playing Morrowind on the GeForce MX420 card back in 2002. I remember being dazzled by the water shaders. Give it 20 years from today and everything will be photorealistic. We're in for wild times coming into middle age.
So nostalgic! All these brand names take me back to my childhood, asking Santa for Voodoo cards, MMX processors, and AGP slots...
this took a lot of effort and it turned out great! well done
Man, I just love your voice and calm explanation style.
1:15 - In 1991, triangle-database graphics engines were $M machines that needed a $M computer to run their databases, and were used to provide displays for simulation flight trainers. I worked for a company that provided the simulation computer that requested a set of displays at 15/20/30/60 Hz. Five years later those graphics machines were obsolete...
GPU always stood for Graphics Processing Unit; GPGPU is the term for general-purpose computing on GPUs.
Very informative. In 1986 I joined a company in British Columbia, Gemini Technology, that had great ambitions to dominate the graphics card market. We designed an EGA chipset using Daisy workstations. I taped out the design at LSI Logic in September 1987, a few months after IBM introduced the VGA. Earlier that year we had spent a month in Japan, at Seiko Epson's design center. I don't know why that effort failed, since there was little transparency there. After completing the design at LSI, I moved to Video-7, an up-and-coming graphics card company that had just completed a VGA chipset and was selling graphics boards. My LSI Logic EGA chip, surprisingly, found some customers. At V-7 we examined the IBM PGA, trying to reverse engineer the chipset, and I also helped integrate my previous design with a V-7 product. Gemini eventually failed and was absorbed by Seiko Epson. Video 7 merged with an LSI Logic division, but had some legal problems that required the CEO to wear an ankle bracelet. I continued designing custom chips and even ran into my former Gemini Tech co-workers at a Silicon Valley design center. Most of all I enjoyed the time I spent in Japan.
Cheers for the vid. Concise and accurate. Good work
Excellent overview and history! Thanks, well researched.
Great video as always! Keep in mind that without ASML nothing would be possible. Thanks!
The picture of the motherboard at 20:24: does it actually have onboard graphics, or is it what appears to me to be an 'APU', as AMD referred to them...?
That would be an ASRock A790GX board with an AMD AM2 CPU socket, so no APU support. It has a Radeon HD 3300 integrated into the board's chipset.
@ 15:15 Morris Chang!! Not Zhang!!
For those interested: The IBM Professional Graphics Controller is called 'PGA' because the A can stand for either Array or Adapter. In modern English the letters 'A' and 'C' are commonly used interchangeably, much like the letters 'ß' and '§'. This can often be confusing for those who speak British English because IBM chose to use the uncommon 'Controller', instead of the classic spelling 'Aontroller'.
I worked at one of those 3rd party board companies in the mid nineties and so far this is the only presentation that gets it right. Great job.
I was even offered a job at Nvidia, who directly recruited me in their drive to vertically integrate graphics drivers. I am sad I said no.
But I could tell that Nvidia was going to win it all. On my visits, they shared with me their secret inner sanctum where their computers sat running that vaunted design simulation. Their record of getting chips out on the first turn was unmatched. It's almost like your observation about TSMC: they would win on cycle times.
What an awesome channel! Liked, commented, subbed! Thank you!
I love your nVidia videos, thank you 🙏 🖤💚
Excellent video. One thing you missed out was Matrox. I believe they were the 4th market player back in the early 2000s.
Though Nvidia GPUs usually cost more than their rivals, stability is something I always look for, and both Nvidia and Intel provide it. The red team has time and time again disappointed me with its quality. The introduction of the Radeon Vega APU, though, changed the whole game for budget PCs again. I hope Intel's newest foray into the GPU space will bring a new level of benefits for consumers.
Here are some of the Nvidia GPUs that I have owned and used:
- Riva TNT
- GeForce 2 MX
- GeForce 6600 GT
- GeForce GTX 660 Ti
- GeForce GTX 770
- GeForce GTX 1060
Unfortunately bloody crypto miners are ruining GPU prices for gamers now.
Great video! There's also the issue where 3dfx kept suing Nvidia, who just patiently bore the brunt of the lawsuits until they finally bought them. It's a good business lesson... don't worry about doing things your competitor might sue you for if you plan to just eventually buy them out.
Good to acknowledge the legend Mr Carmack. Blessings from the yoruba-Oduduwa.
This is an exceptionally objective and accurate description of the circumstance at the time. I was there, man! 😂
Of course I loved 3dfx. Doom sold more hardware than anything!
What a great video. I've just arrived in Taipei and hopefully after government quarantine I'll get out to do some tech-related sightseeing. If anyone has any recommendations, please let me know.
Very well done overview.
I can only speak to my experience as a PC builder since the late 90s.
It was the Drivers.
At least in my experience, Nvidia drivers were generally more stable than their peers.
Uhh, NO. My Voodoo worked just fine out of the box while my friends with Nvidia TNT cards were always complaining about constant patching. To this day Nvidia's rapid patch schedule is annoying, because every time I patch I need to reset my resolution scaling. I just stopped patching as a result.
@@cpt_bill366 You're going way back there, brother, to when we had to mess with IRQs and such. I think my first was a GeForce 6600 on AGP. Never messed with ISA cards, so I can't speak to them. God, IRQs were a pain in the arse, along with 9600 baud modems.
Love these videos! Thanks!
I worked as Director of Development at a major SGI integrator, and we later added Sun Microsystems. I then became CTO of a telecom manufacturer that partnered with one that had licensed equipment manufactured in Taiwan.
17:05 - 2008 - The company I worked for had a choice between ATI and nVidia for a general-purpose computing solution using graphics processors on a defense platform. Our team spoke ATI as a first language, which was hieroglyphics; the customer wanted performance and nVidia, but couldn't get both. We settled on ATI. That product lasted one generation. CUDA led the way. And the customer no longer needed our specialized hieroglyphics programmers...
I owned every one of these early graphics cards and monitored Carmack’s .plan file eagerly in these years. The timeline is not entirely clear and some events and declarations end up seeming intermingled, albeit not intentionally.
Why don't consoles use Nvidia chips?
It's a shame how hard it is right now to get hold of the new 3000 series GPUs from Nvidia. I really hope this changes. For a long time I built my PCs with AMD, but with the new GPUs they came out with, it makes gaming more affordable.
Just wait for the crypto craze to die (in pain) and you'll be good 🙂
This is such great stuff! I’m completely gobsmacked by the anecdote about how 3dFX burned its customers due to an ill-conceived business idea - and then lost the entire game! 🤯
20:42 that Unreal Tournament screen shot is some good nostalgia :D
Québec's Matrox, the Parhelia, the origin of the ring bus. Canadian ATI picked it up. Matrox is still alive today.
Parhelia, aka too little, too late, too expensive.
What's the situation of Matrox today?
We bought the latest Matrox hardware to get the upper hand in Descent II. Fun times in online gaming.
Could you do a video on Windows accelerator cards please?
Nice video, and it was incredible to see that little graphics card Intel made in the '90s that went nowhere... same with Arc now. I don't understand why the number one CPU maker in the world can't just make some competitive, good-performing GPUs. They have experience designing chips and the financial resources.
I wish this video had another 10 minutes explaining the graphics card market shifts up to the present day.
Wait, does “GPU” not stand for “graphics processing unit”?
More info on the detailed architecture of these systems would be interesting.
Excellent historical summary!
I remember when I first had a go with Linux, it was Debian.
I couldn't figure out how to install drivers.
Right up until the point you realise they just get detected and loaded as kernel modules at boot or, at most, pulled in with an apt-get command.
Certainly was an eye opening moment in my journey as a software engineer and architect.
Interesting video but I am confused by your request we sign up to your channel for videos about Asia. What exactly do graphics cards have to do with Asia?
What about the fine OpenGL PC workstation class video card manufacturer 3D Labs ??
Somehow Nvidia got the devs of machine learning frameworks to use their proprietary CUDA API instead of OpenCL or GLSL. This had a huge impact, making their GPUs effectively the only ones usable for AI. It would be interesting to have a video on why/how all those developers chose CUDA. It still seems a strange choice, locking them to a single vendor.
I'm not an AI guy, but... GLSL is OpenGL's hack for opening up the graphics pipeline to software: a C-like language for processing vertex and pixel data. While it can be used to process general data, it needs at least one more API layer above it, because it presents entirely the wrong abstractions for general computing. And the language compilers aren't anything to write home about either. OpenCL is meant to be that general-compute layer, and it's not a great API from what I've read. So CUDA being more popular is not at all surprising.
Vulkan seems to be a lot better though: it presents GPU and CPU cores (or any other type of processor) the same way, with the same memory model. This was the whole AMD heterogeneous-compute idea, and they've been executing it at the hardware level for more than a decade; now the software is there to provide a decent programming model for it.
TL;DR: there's no use making brilliant hardware if nobody can write software for it. The ABI, API, and software tools you present to a programmer matter a lot.
@@PaulSpades CUDA wasn't that good at the beginning either - just a proprietary way to write shaders. But I agree that AMD and the rest (Intel?) neglected what impact this would have a few years later.
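As a rough illustration of the question in this thread, here is a minimal sketch (my own, with placeholder names and sizes, not anything shown in the video) of the kind of single call that made CUDA attractive to framework developers: a matrix multiply handed off to cuBLAS, NVIDIA's BLAS library shipped with the CUDA toolkit. It assumes a machine with the CUDA toolkit installed and a build along the lines of nvcc cublas_sketch.cu -lcublas.

// cublas_sketch.cu - hand a 512x512 single-precision matrix multiply to cuBLAS.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>
#include <cublas_v2.h>

int main() {
    const int n = 512;                               // square matrices for simplicity
    std::vector<float> hA(n * n, 1.0f), hB(n * n, 2.0f), hC(n * n, 0.0f);

    float *dA, *dB, *dC;                             // device buffers
    cudaMalloc(&dA, n * n * sizeof(float));
    cudaMalloc(&dB, n * n * sizeof(float));
    cudaMalloc(&dC, n * n * sizeof(float));
    cudaMemcpy(dA, hA.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;
    // C = alpha * A * B + beta * C, column-major, no transposes.
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                &alpha, dA, n, dB, n, &beta, dC, n);

    cudaMemcpy(hC.data(), dC, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0] = %f\n", hC[0]);                    // each element = 2 * 512 = 1024
    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}

Frameworks could build on tuned libraries like this (and later cuDNN) instead of writing their own kernels, which is arguably a big part of why CUDA, rather than OpenCL or GLSL, became the default.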
i've always thought that GPU stood for graphics processing unit... perhaps you mean to say GPGPU?
It does mean graphics processing unit... he was wrong... at least it was a relatively small mistake, but it makes me doubt a lot of the other stuff he says.
@@William-Morey-Baker All people make mistakes, even really braindead ones. It doesn't necessarily have any deeper implications.
@@William-Morey-Baker I should also add that that kind of "gotcha" thinking can be very toxic and is often used to reinforce anti-intellectualism. Be careful to avoid bias.
@@javaguru7141 I kinda had the same thoughts as william morey-baker. This wasn't just a slip of the tongue - this is basics; and if he can make a mistake on this, it makes me question his knowledge about computers. I wonder if he's just someone simply reading a script without understanding what he's reading, which I've seen happen too many times in other youtube videos.
p.s. this is a very simple point that william is making. big phrases like "toxic thinking" or "reinforce anti-intellectualism" are completely out of place here. sounds more like you saw this whole sentence somewhere else and decided to paste it here to appear smart.
@@epposh *sigh*... I won't bother opening a conversation about the latter. But regarding the former, I watched the video again and I'm pretty convinced it really was just that - a slip of the tongue. NVIDIA really did start pushing GPGPU around that time. The acronym for that is General-Purpose GPU. You could easily backronym that as "General-Purpose Unit". I wouldn't be surprised if someone at NVIDIA actually uttered those words at some point. The important information he was trying to convey was accurate.
Very good video explanation
Great documentary about GPU history! I'm old enough to have actually bought a Voodoo graphics card with my own money when it was new.
I feel that the drivers of NVIDIA cards are still the most important part of the technology. It appears that AMD (ex-ATI) hardware may actually have more computing power, but the drivers fail to surface it to applications.
In addition, OpenCL took too many years to get off the ground, and NVIDIA's proprietary CUDA took the market; many apps (including the open source Blender 3D) have proper hardware support for CUDA only, even today. AMD is supposed to have something cooking for Blender 3D version 3.5, but the actual performance remains to be seen.
I wish you'd add Turkish subtitles. Good work.
One bit of feedback I would add to otherwise great content: please increase the microphone volume.
I was running a computer shop between 1997 and 2000, and I saw brands come and go. I sold a lot of 3dfx Voodoo1, Voodoo2, and Banshee cards. But then the Riva 128 and Riva TNT arrived and changed the face of the market. Nvidia's king move was to handle the drivers themselves. Even with ATI there's always a wheelbarrow full of compatibility issues, even today.
I recall getting these graphics boards as hand-me-downs from our stock trading floor. Traders were upgrading their desktops at will in the 1990s, so why let last year's model go to e-waste?
PS - are you going to do a part 2 on impact of digital currency on the graphics market?
I would love to hear you talk more about tech, crypto, and visual effects.