Seems like the graphics demos go from the early GPU era (back when Nvidia was a new company and 3dfx still existed) all the way to the Windows Vista/PS3/Xbox 360 era around 2006-2008, which was more focused on the fact that GPUs could hardware-accelerate 3D rendering, real-time reflections, and lighting. Past 2008 they focused more on showing that GPUs could get uncannily close to real life, with subsurface scattering on skin and the like. Around 2018 the GPU market seemed to shift its focus to refining the 3D pipeline and introduced hardware ray tracing, while from 2020 till today the market has moved to AI and complicated math.
God I love the old tech demos. One of my first memories of crazy graphics was back in the later 90s as a kid and my sister brought me to one of her friend's houses and he showed me a graphical test of liquid physics that was blood red. I was truly amazed as I've never seen that before and it's been stuck in my mind ever since.
I still love my 1080Ti(s). Still very efficient for the time and outside of RT, they still hold their own in 1440P well enough for most games. Really wanted a 3080FE but couldn't get a chance to get one until a couple months before the 40 series launch.
I honestly wanted to cover that card too, but I couldn’t find any demos and I wanted to keep this video from being too long, so I mentioned it as the better option when the 2080 was out
My first card was the Gainward GeForce 3 Ti 200 128MB GS. This was also my very first online purchase (through eBay) back in October of 2003. I have kept it to this day, it's a very special piece of personal computer history.
GeForce 3 was bad ass. It was the basis of the original Xbox GPU with the first Halo, which pushed Microsoft into the console market (which was considered nearly impossible at the time vs. Sony, Nintendo, and to some degree Sega). I'm really, really surprised that wasn't mentioned in this video.
What up GeForce 3 Ti200 buddy! I remember when I finally got my hands on one to replace a well past its prime TNT2. It was revolutionary. I feel like those moments in PC hardware are getting few and far between.
I remember that nobody ever managed 60 FPS at 4K with the 900 series on demanding games. When the 1070 and 1080 came out, people finally managed it with a SINGLE GPU. NVIDIA really changed the game with their 1000 lineup. I still own a 1060 and it's still kicking.
I’ve still got a “retro” PC with dual 8800 Ultras in SLI. Each GPU has an aftermarket Thermaltake air cooler that used to be a relatively popular upgrade back in 2006-2007. The whole build is in a limited edition Fatal1ty case that’s made completely out of thick-gauge sheet aluminum. Even though I haven’t turned it on in years, I’ll always keep it around!
ATI/AMD has had some really good products, like the 9800 Pro; Nvidia didn't have anything sensible to compete with it. I've never been a fanboy and always disliked that mindset. I think my stack of cards over all my years is about 50/50 between the two. _Always_ buy at a good performance-to-price ratio.
Looking at the past feels so weird, especially considering how recent this was. Whatever that guy's name was who coined the doubling in processing power… man… I can feel it. This vid made me try to get my 360 to not red-ring. And it ran for a couple minutes (thank the heavens), and the technically impressive games felt like the kind of indie game a first-timer would make.
Times have changed. Expected a 40-second meme, but watched the whole 20 minutes of knowledge about something I never got and am never gonna get. All hail the Gamers without a GPU🔥
Without a GPU? That would be my first PC, which was a Commodore 64 back in 1984. You used a cassette tape to load the games, which took forever. Good times. The first game I played on it was The Hunchback of Notre Dame. Mesmerising at the time.
AMD's GPU history starts with them buying ATI. There was a lot of drama there. AMD and Nvidia were very close before that. ATI was always jumping ahead of Nvidia in tech. They gambled everything on using the Xbox 360 to jump ahead of Nvidia with universal shaders. The Xbox 360 even had tessellation. The Xbox 360 was almost an entire generation ahead of ATI's PC offerings, and many PC gamers jumped to console gaming. That nearly bankrupted ATI, as they didn't get much from Xbox sales, and AMD bought them. Nvidia's 8000 series was their first series with universal shaders and when they introduced CUDA. It was their first PhysX-capable card too. Their base architecture remained the same until the 400 series and never fully supported DX10.1; rather, they jumped to DX11 with the 400 series. AMD, however, stayed on the same architecture and never fully supported DX11. They pushed conspiracy theories rather than innovating. Nvidia went so far as to enable multi-threaded rendering in games not coded for it in a driver update. Devs weren't coding for it because AMD didn't support it. DX12 was just Mantle, an API designed for AMD hardware, with some name changes. They still had the same slides in the documentation. DX12 was basically Glide. It didn't improve graphics, but slowed down Nvidia cards. Nvidia had to change their cards to work better with an older way of doing things, as they were doing universal pipes rather than a single hardware scheduler.
The most notable change between the 7 and 8 series was not the video memory at all. It was the change to a unified shader architecture from the previously fixed vertex and rasterisation pipelines. It was an enormous shift, one ATI was first to market with before Nvidia, and it was used in the Xbox 360 hardware.
It can't be overstated how big the 8800 GTX was for its time. Crysis was released shortly after its launch, and the card was an absolute monster for anything thrown at it at the time. It was legendary, and so many people had them.
@@ballstothewall38 Ahh the GeForce 256. I remember my brothers PC had one when he was first getting me into PCs. Back when pulling 60 FPS was akin to 120 nowadays.
My dad bought me a GeForce2 and I used it for years. As a student I didn't really have much cash, so I went for budget AMD GPUs after that for many years. Eventually I got my first job and bought a GTX 580, then another GTX 580 for SLI, and then another GTX 580 for tri-SLI. Then I got married and bought a property. With a mortgage on my back I couldn't afford to upgrade for a long time, until that marriage broke off and we sold/split all our assets. I quit my job, worked casual for 3 years before going back into full-time work, and started earning double what I used to make. With that additional cash I finally upgraded to an RTX 3090 in 2020.
Marriage is a scam man. ALWAYS keep them in the girlfriend category. That marriage contract is nothing more than a will... with the difference being that a regular will gives away your assets when you die, and a marriage contract gives away your hard earned assets when you divorce.
See how life can get great again when you lose 180 pounds of dead weight. Remember kids: never get married, work on yourself and your hobbies, and don't look back. Enjoy your 3090, you deserve it; she can serve you for many years.
You can clearly see they weren't really art-directed and often made by companies that didn't specialize in these types of visuals so they came out very quirky and unusual and directed towards certain effects they wanted to showcase.
@@LNCRFT Yes, it was so bad that I had to go to my friends house to play Crysis after school. I remember he had a 7600GTS, and we had to play on low settings. Still, I was jealous.
@@TacticalPhoenixYT I'm aware of this. I was able to get my MSI 1080 Ti Sea Hawk (Corsair edition) to clock 2.1 GHz max, 1.987 GHz average, with the RAM maxed out, then went into games that supported it and got 60 to 100 fps, just like a 2080 Ti. The 1080 Ti could have shipped with 2x the RAM based on its design. Relax, I give all my findings to Nvidia or Intel, whoever's parts I'm using.
One thing that's not conveyed in this video is the increase in _resolution_ that each generation was capable of (as in, a card from the next generation being able to run the same demos and games as the previous generation but at a significantly higher resolution). This is quite an important factor, besides the visuals.
And I'm here still using a 1440x900 monitor from 2008, but it's very special to me even if it isn't 1080p; it's always been there through the years, first for my PS3. Plus, for some reason it's 75Hz, something I never would've known if I hadn't stopped using it for my PS3 and used it for a PC for once. And yes, I still game with it.
I remember my first video card was a GeForce4 MX 440 PCI (not AGP) back in 2003-ish. I was blown away by how much more powerful it was versus integrated graphics.
I remember going from a Voodoo 2 12 MB card to the GeForce 256. I cannot even explain how awesome that was. My first computer, however, was an AST 486 at 33 MHz, and that had a 512 KB graphics chip installed; I found that out after buying a game that required a total of 1 MB of video RAM. Oh, glory days.
I remember purchasing two Voodoo 2 cards and running them in SLI. What an eye-opener at the time. It's a pity Nvidia killed off 3dfx after their buy-out, then again, they do reign supreme in anti-competitive practices - right up to today.
@@ChrisM541 This isn't arguable, 3dfx did kill itself through anti-competitive practices. They bought out a board partner and tried to bring all manufacturing in-house. They ran into a manufacturing bottleneck and as a result couldn't get their chips to market. As a result, you can still buy new VSA-100 chips for not much money, because chip production vastly over-ran card production. Nvidia isn't the reason 3dfx doesn't exist, 3dfx is the reason 3dfx doesn't exist. And I say that as a 3dfx fanboy.
I remember getting a new PC with the 8800GTX. For that time it was so expensive and huge, me and my friends couldn't believe it. We even used the exact tech demos shown here to test it out. Well and the Tech-Demo called Crysis. Good times.
Remember playing on machines with core 2 Q6600 and two 8800GTXs in SLi with 4GB of RAM in 2007/08. That was a beast - ran maxed out Crysis all day, everyday like it was calc.exe
The FX 5800 Ultra, also called "The Hairdryer". I remember it for its "Ultra"-loud fans and high temperatures. That was all the rage about it and drove many people to ATI Radeons.
Yep. I had a GeForce 2, 3 and 4. Went Radeon around that time, also because the price and performance of Radeon cards were getting very good around that time.
>Nvidia might price these GPUs a little bit higher. But in my opinion, I think it'll start at the same price of $699 because of AMD's competitiveness. Nvidia bros got too cocky.
At this point, I would love for Nvidia to show pretty much the same types of videos as their original to showcase just how much their technology has improved and its capabilities.
The increase of video memory on 8800GTX wasn't that important at the time. Unified shader architecture, the raw number of shaders (128 vs 24+8 for 7800GTX), the DirectX 10 support, and the 384bit bus width were all more impressive.
Great video mate, it was at the end of the video that I recognized the logo. Could not believe you are capable of producing amazing memes and super well-designed and commented long videos too.
Thanks a lot, I’m glad to hear the positive feedback from all of you! I’m known for being a tech memer, but topics like these have always intrigued me, so I made this video
Very interesting video. It's funny how you fail to mention that the GeForce 256 was the first card to use hardware T&L (transform and lighting)… at the time this was its big selling point and the feature that was supposed to make it different from previous cards.
As someone who has owned 75 percent of these cards, the 280, 580, 780, 980, 1060, 1080 Ti, 2060, and 3090 Ti all hold a special place in my heart. This doesn't change the fact that the GeForce3 will forever live in my heart as the first real GPU that was presented to me. Now I'm just waiting for that 4090 Ti to come out.
@@oozly9291 I've been building computers for 20 years personally and for different reasons. For instance, I ran the 1080ti for years till the 3090ti came out. I had the 2060 in my wife's build. I built my daughters computer with a 1060 that I got from a pawn shop for almost nothing. I now use the 1080ti for a home NAS. The only reason I want to buy the 4090TI coming out is because I plan on buying the Pimax Vision 8K to get both eyes in 4k up to 90hz in games like DCS. Ehh I don't need it but its a hobby along with being an IT person at work.
I am completely disgusted by the price tags. And people bought it. And people will buy the 4080 and the fake 4080. It's gross. If people just told Nvidia how they felt with their wallets, by not buying, then Nvidia would be forced to cut prices to a fair margin.
I remember back in 2016 I built a mid to high-end PC for $1,500 The 980 Ti was like $600 or something. Now we're talking $2,000 and more for the same bracket, with the RTX xx80, non Ti cards alone costing around $1,000. Although it's insane what even mid range PCs are capable of now.
Actually the price makes sense; people back then would have to pay around $600 for a card that is 10 times slower than the 4090, so it’s actually a good deal if you think about it
@@high-octane-stunts86 Bruh. It's like saying the world's first computer should cost 5 cents because it's thousands of times weaker than your phone (spoiler: it wasn't cheaper). That GPU was cutting edge at the time, and pretty much the only GPU back then. It would fall in range with the 70-class cards right now: a premium product, but the demand just wasn't there yet. For the 4000-series cards, only the 4080 and above are overpriced. The 4070 is slightly overpriced, but the rest is normal. The bigger problem is that Nvidia made them worse for their tier; all cards from the 4070 down should have been moved back one tier with their specs. They leveraged better technology and software to cut corners with the 4000 mid and low range, while overpricing the high range.
I just barely got a retro pc setup! An original GeForce 256 except the DDR version! Installing it and realizing windows 64 bit was not used until 2003 made me realize just how much stuff has changed. Really was an interesting time to be alive!
@@angrysocialjusticewarrior depends what you do, the 1080 can run FH5 at like 50FPS, but it doesn't do RTX. Personally, I don't really like "RTX ON" games, it doesn't look quite right.
Built my first PC in 2010. I swear games back then, even the ones running on Flash engines, were just so good; I never imagined games would be as good as they are today. I didn't even know what a graphics card was, I just ran any game I had.
Cards have changed a lot in the past years in so many ways. My first card (a 3D one) was a Voodoo 2, and it was a massive upgrade in all technical ways, the likes of which we just haven't seen since those days. Power consumption is an interesting one too; I still have a 6990 lying around somewhere, and boy, did that create a lot of heat.
Nice video, but from what I know there are a few mistakes (or info that was missing). The 8800 was the most expensive card Nvidia ever made until the Titan cards were released. The GTX 480 had a “normal” operating temperature of 105 degrees, so the card shown in this video was still running at a lower temperature than most of these cards did. Also, the 8800 is to my knowledge the first card to support PhysX, meaning you would be able to use an 8800 as a dedicated PPU card. I really liked your video and loved seeing those old Nvidia demos again. It’s just that I noticed those little mistakes and the missing information.
@@jimmydandy9364 I have to agree. I used to have an 8800 GT from EVGA that wasn’t working correctly, and they replaced it with an 8800 GTS that lasted pretty long but eventually also failed. That got replaced with a GTX 460 that also failed, which got replaced by a GTX 760 that also failed, and then I got a GTX 950 that is currently in an old system and, as far as I know, still works… That reminds me: does EVGA still provide a 10-year warranty on their cards? Because that was the good part about it.
@@xxgamergirlxx7917 yeah understandable. The price was as much as I pay for my apartment monthly. If I haven't saved up that much I couldn't afford this GPU. I had a GTX 960 previously and it is definitely a huge upgrade.
@@wayn3h I had a 6800 Ultra, and then they brought out PCIe. After that, mobos stopped supporting AGP, and PCIe was kicking AGP's ass. I felt abandoned. I remember having ISA and PCI in the same board back in the day; ISA was not too slow for a 56k modem, everything else was PCI.
I remember getting into PC gaming around 2007 and wishing I could afford two 8800's in SLI. Ended up with just a single 8600GTS, but it still ran CoD4 very well. Ah, to be young and poor. Now I'm old and poor.
Great video! It's great seeing these cards again, even if it makes me feel old 😂One thing that would have been cool is showing the MSRP adjusted for inflation to give an idea of the progress we have seen.
@@theredpanda00 Nice work! Thanks! This helps put the current progress into even more perspective 😂 Prices have sharply risen! I miss the value my 1080 Ti brought once upon a time.
@@theredpanda00 Now that we've seen this, it seems the prices stay fairly on the same line, well, except for the 30 series with that COVID, miners-galore, and scalpers combo we had there.
Amazing innovation in such a short time. Such an era we live in. I love my rtx 3060 and finally can stream quad hd and play with ultra at the same time. See you on my next live campaign
I remember watching the human-head launch demo for the 8800 GTX and wondering to myself if gfx could get any more real than that… I was sooo amazed! It’s safe to say that after almost 2 decades, we still keep improving on gfx 😅😅
My very first high-end graphics card was a GeForce3. The absolute best on the market, at the time. I definitely remember those demos too, but I also could've sworn I only paid $300 for it.
Back in the day NVIDIA didn't focus solely on money, but now they do. And he thought the 4080 would be priced at $699; I don't think he was right. Also, are these the only GeForce cards? Because it feels like there are so many more.
I started collecting GPUs since they’re one of the few parts you can use in almost any newer system no matter the PCIe generation and it’s been awesome to try old 6600s and x300s with more modern games and software
That was interesting ! What got you into pc hardware ? Me personally, it's Crysis in 2007. You needed some knowledge at the time if you wanted to build a pc that "can run Crysis".
@@zwartekatten At the time I could, with an intel core 2 duo 6700 (2.66Ghz X2) 4 gig of ram and the nvidia 7950 GT 512Mb. Average 25 fps at high settings 1024x768. Good old days 😉
@@yan3066 I had to use two Radeon 580s, 512 MB each, just to get decent frames in 2007; it wasn't until like 2010 with my GTX 590 that I was able to run Crysis at 720p 60.
@@yan3066 Puny I tell you! I had a 6600 Quad, 3.25 Gig of ram (32 bits was something) and a 8800 GTS! My brother had the GTX, and made fun of my puny build too : (
Started with a GT 440, then an R9 270X, then a GTX 970, and now finally my lifelong dream, a GPU to max out every setting in every game at 1080p, and some even at 1440p and 4K: the AMD RX 6700 XT. Always wanted a card like this, my entire life ♥️😍
It would be so cool to see performance uplift in 1 game across every gpu (within reason - likely 9800gtx and up) to show truly how far we have come. I think the 4090 is roughly 30x more powerful than the 9800gt
@@noelpatata3948 No dude, not that much. It may have 3000x more transistors, but it's definitely not 3000x more powerful. Not even close in real-world scenarios (fps).
@@noelpatata3948 Dude, that's still way off base. A 9800 GT was capable of 60 fps in Half-Life. If what you were saying were true, we would be able to run Half-Life at 180,000 fps. Think about what you are saying; it's total nonsense. Even if a CPU that powerful existed, we would not be anywhere near that. Nonsense.
10:55 So true. I remember this card getting so hot you could literally feel the heat with your hand not even touching it. GrillForce - The way it's meant to be grilled 😂🤣😂
I remember the first GPU I ever bought was a GTX 470 in 2014, 2 years later I bought a GTX 1050ti. That thing served me very well until I built a new computer in 2020.
The very first GPU I actually owned is an RTX 3070. I was able to get it before its price skyrocketed, all thanks to my brother, who was better at PC stuff than I am.
I have fond memories of the GeForce 256 DDR. Coming from many (oh so many) 3D cards with small improvements, then 3dfx and PowerVR (I preferred the PVR over the muddy 3dfx), which were a great upgrade, the GF 256 was an absolute beast with both great visual quality and speed. Although I have bought several ATI/AMD GFX cards since then, I always end up with Nvidia (currently rocking a 3080 Ti).
Just an FYI, Nvidia didn't invent PhysX. It was originally from a hardware accelerated Physics card. Nvidia bought the technology and then integrated it into their GPU. I still have a PhysX card in my garage.
Only thing I wish this would have had is the prices at launch, as well as adjusted for inflation. That $249 for the first card in 1999 would have been a $450 card today, for example.
Major correction but tessellation was in retail games using ATi cards 6 years before Fermi. All the way back to Quake 1. It looked great in Morrowind with the 9c patch and UT as well.
The 10xx series cards were the best bang for the buck. The value they provided was astounding and was fairly priced too. Even in 2022 in the most demanding titles they are able to sustain in the 60 fps bracket with proper optimised settings.
Integrated graphics can achieve 60 fps with the shittiest "optimised" settings in the most "demanding" 2022 titles as well. That's not an achievement, and it's a horrible argument. If you never leave 1080p for any game, you can play every single game in existence from here to eternity. 1080p will not be the standard for much longer, for a reason.
@@Chris-ey8zf Wtf are you talking about LOL. Try launching even GTA 5 from 2013 on ultra settings at 60 with an "integrated graphics card"; this is fucking nonsense.
I remember when the 980 was the hottest thing on the block. I knew one guy who was fucking rich, man. Got an SLI setup and we were all jealous of him. Man… those were the days.
It's amazing to see how far we've come, and how I was already in school when the first consumer GPU was released in 1999. My first ever desktop PC was from CyberpowerPC. Got it from BestBuy. It was the first time I'd ever taken an interest in building and customizing PCs, especially after I switched its old AMD R200 for a GT 730. Been building and modifying ever since. I still remember playing Battlefield during the days leading up to the GTX 900 series being released. GTX 970, 980 and 980 Ti were all anyone in the servers would ever talk about. Fast forward some years and I just replaced my GTX 1060 6GB with a brand new RTX 4070 Super. I'm blown away at what an absolute menace this thing is. Very happy with it.
It would have been nice if you included whether each card was AGP 4x, 8x, PCIe, etc. Otherwise a neat video. Many memories. I bought my first GPU in 2001 and my most recent in 2021.
My Nvidia graphics road started in 2001 on the house's shared computer with a GF2 (AGP), then a GF4 (AGP), then a GF 6600 (AGP). Then our home computer was upgraded and I had my first PCIe card, a GF 8800. When I was 14/15 I spent tens of hours researching many forum sites for the 'best' hardware combo and built my own 'gaming PC' from money I'd saved up, with a GTX 280. It held the crown very well; I only replaced it when I upgraded to a new platform and a GTX 660 Ti, then a GTX 1070 Ti, and now I have a Z690 platform with an RTX 3070 Ti.
P.S. 14:03 The original PhysX engine was actually developed by Ageia; you needed a separate expansion card and special drivers to be able to run this realistic physics engine. Then Nvidia just bought the SDK rights in 2008 and integrated it into their system, so if you had an Nvidia GPU you could just 'enable' PhysX and render the realistic physics in demos and the few games that implemented it. So it was already doable in GTX 280 times (though back then it was still preferred to slot in a separate, older Nvidia card with at least 256 MB of video memory, which you dedicated to PhysX work in the Nvidia driver hub). Then around 2011 there were the first public rumors that Nvidia was working on a new physics engine, and with the 2013 lineup Nvidia released the overhauled PhysX engine: even more realistic, and dedicated PhysX cards were gone, as it all could now be done with one powerful GPU, the GTX 780. 😉
19:58 Well this aged like fine Milk XD
Lol
$1200
Frr
real
Yep... Can't believe Nvidia made it such a high price! $2500 is totally insane. You can buy a car for that. It will definitely result in less profit for Nvidia. Graphics cards like the RTX 3090 are like 4x cheaper but can still get you 60+ FPS on almost any game, even with high settings. Only the hardcore Nvidia fans will buy this until the price goes down.
The GTX 10 series cards were and still are pretty great. They just refuse to die.
so is the 900 series
@@CN-mo4uq definitely agree with you there. Wife's pc has a gtx 970 that still runs games like Elden Ring and Borderlands 3 with no issues at respectable framerates and detail settings.
The 1070, 1080, and even 1080 Ti, plus the 6600 XT/6600, are currently the best value GPUs by quite a lot
@@ValexNihilist with potential FSR 2.0 performance, it'll give these 900 series gpus another life boost.
@@LUL69214 I guess, but those GPUs should go down in price too if the 20 series, 30 series, and 6700 XT+ GPUs go down too
I came here expecting memes and left with some knowledge
*What we expected:* Memes with a little knowledge
*What we got:* Knowledge with a few memes
Agreed
Exactly
So true !!
Why memes!
I love how optimistic that $699 price tag for the 4080 was lmao
@enrique amaya hail satan
@enriqueamaya3883 amen
@enrique amaya jesus didnt make the gpu prices lower so no
@enrique amaya following a nonexistent fictional "god" will be a regret
so is following what nvidia says in their 4070 ti demos
that would have been too nice
Back in 1999 and 2000, my high school AV class had a very expensive render station donated to it that would take minutes or even hours to render what a GeForce1 could do in real time. That era was amazing to witness.
Im gay for god @enriqueamaya3883
@enriqueamaya3883 can Jesus render faster than a GeForce1 can?
@an2thea514 unless you ripped out his brain and used it for some form of bio horror human computer then no
@@daddylocs6253 yeah, my joke was funnier when the religious spam reply still existed.
I gotta say, the evolution in the starting price is also astounding.
Yeah, came down here to comment the same. That's almost 10 times the price too.
To be fair, inflation plays a big chunk into that price, plus there are much more premium components in the more modern GPUs, and finally, they're using absolute bleeding-edge tech to make these GPUs.
Still probably overpriced though.
@@arnox4554 Yeah, our wallets are bleeding too :P The use of premium components is debatable - NVIDIA has had a history of shoddy QA and quality issues with prematurely failing GPUs (the 8800 and others). Companies in the modern world are cutting corners - cost-cutting while increasing prices. There is more to a graphics card than premium caps. I've been had by a company called BFG; lifetime warranty my arse. Now that company is bankrupt, no surprise. Were they making quality stuff? NO, all hype; none of the cards I purchased from them lasted more than a few months. I never had issues with other brands like EVGA or ASUS, but I know a lot of casual gamers that had premature failures on their 3000 series GPUs, it's not uncommon - so the argument of paying more for quality is complete and utter bullshit. We should be paying $1k-$1.5k less on the latest cards. And now with Intel's desktop ARC GPUs a total flop and epic fail (just like I predicted), NVIDIA is in the right position to gouge prices even higher.
@@arnox4554 I hate the argument of "new technology". the first gpu nvidia ever made was 250 bucks and it was the newest technology back then. not only was it infinitely harder to make the first model than everything after that. even if they would be using more premium materials by now, all the extra cost is diminished by all the logistics that are now in place to support the tech industry and which were not even thought of 20 years ago.
Just like people nowadays think games cost more because they are harder to make, when in reality, making a Game Boy game like Pokemon meant overcoming so many limitations; it was much harder than making a shitty CoD/FIFA clone for the hundredth time.
@@Noqtis As I said, I agree that they're still overpriced, but I do still think that something like the RIVA TNT was definitely more cheaply made than even the 1080 Tis of today. I mean, just look at that damn cooler alone. Those old ass GPUs only needed those dinky little fans with a little heatsink to cool those chips. (Although with that said, Nvidia's chips are starting to get pretty fucking absurd in terms of TDP, especially with these Lovelace chips if the rumors are anything to go by. I WISH we only needed a tiny fan for today's GPUs.)
The demos they had earlier in the 2000s is still very impressive to see what they could do
proto RT 😅
@@dutchvanderlinde4969 How ya figger, Dutch?
Yep, 3Dmark06 firefly forest
Demos are what? Images that took two weeks to render? Layouts that are known ahead of time so they can be rendered easily? Or is it a random image done in real time? Not impressed with demos. Apparently in the last few months some research papers have shown you can do real-time ray tracing on a random scene that actually works quickly, using some new hardware and software, but I doubt it's in production yet.
and back then seeing those the first time was just a whole other level of awe!
"Computer graphics today are the best they've ever been". They are always the best they've ever been.
That's a damn good point!🤡
But this time they're the best they've ever BEAN
They're getting better overtime 💀
That we know of. Aliens be like wtf is this crap.
If you think about it they’re also outdated at the same time because you know something better will be available in the future.
Those old demo showcases are so creative and so nice to look at, even today they're still impressive; my favourite has to be that green golem at 11:06
Seems like graphics demos from the early stage of GPUs (back when Nvidia was a new company and 3dfx still existed) all the way to the Windows Vista/PS3/Xbox 360 era around 2006-2008 were more focused on the fact that a GPU could hardware-accelerate 3D rendering, real-time reflections and lighting, while past 2008 they focused more on showing that GPUs could create visuals uncannily close to real life, with subsurface scattering on skin and the like. Around 2018 the GPU market seems to have shifted its focus to refining the 3D pipeline and introduced hardware ray tracing,
while from 2020 till today the GPU market has moved to AI and complicated math
Some were quite gratuitous towards the female body, but generally they are very creative.
Oh wow
A serious and informative NIKTEK video
yes a serious video
there are some memes mixed in this though XD
@@NikTek I missed the part where that's my problem
@@Shubbhaaam *Spider-Man 3 theme starts playing*
@@NikTek I am gonna put some like in your eyes
God I love the old tech demos. One of my first memories of crazy graphics was back in the later 90s as a kid and my sister brought me to one of her friend's houses and he showed me a graphical test of liquid physics that was blood red. I was truly amazed as I've never seen that before and it's been stuck in my mind ever since.
Minus Nalu. They only show a painted version.
Experiencing that in the 90s sounds magical
sameeeeeeeeeee
I still love my 1080Ti(s). Still very efficient for the time and outside of RT, they still hold their own in 1440P well enough for most games.
Really wanted a 3080FE but couldn't get a chance to get one until a couple months before the 40 series launch.
I'm gonna keep using mine until I actually need to upgrade. It's probably gonna come down to Nvidia ending driver support for them.
Aperture science1!1!1!1!1!1
Niktek should have covered the 1080 ti after the 1080, that card was REALLY something special
I honestly wanted to cover that card too, but I couldn't find any demos and I wanted to keep this video from being too long, so I mentioned it as the better option when the 2080 was out
Bro I got so lucky, I got a 1080 Ti for 150 dollars
@@jasperfianen3431 This year?
@@sirbughunter ye
@@sirbughunter no last year sry (december)
My first card was the Gainward GeForce 3 Ti 200 128MB GS. This was also my very first online purchase (through eBay) back in October of 2003. I have kept it to this day, it's a very special piece of personal computer history.
GeForce 3 was bad ass. It was the basis of the original Xbox GPU with the first Halo, which pushed Microsoft into the console market (which was considered nearly impossible at the time vs. Sony, Nintendo, and to some degree Sega). I'm really really surprised that wasn't mentioned in this video.
Gainward was big in those days. So were the boxes 😉
What up GeForce 3 Ti200 buddy! I remember when I finally got my hands on one to replace a well past its prime TNT2. It was revolutionary. I feel like those moments in PC hardware are getting few and far between.
My first one was a GeForce 4 MX
I remember that nobody ever managed 60 FPS at 4K with the 900 series in demanding games. When the 1070 and 1080 came out, people finally managed it with a SINGLE GPU. NVIDIA really changed the game with their 1000 series lineup. I still own a 1060 and it's still kicking.
but any 80 series card can do 4K, mileage varying, from the 780 or R9 480 onwards lol. I would say the 980 is the bare minimum for 4K now.
@@jenkims1953 a 3060 can do a bit of 4K, especially with DLSS
I have a gtx 1050 ti
Got a GTX 760 in my older system I use for classic gaming and games that can only run on Windows 7; still runs great to this day.
@@dylanbasstica8316 VRAM is the biggest limit here. But i recently upgraded to a 3060 😀
I've still got a "retro" PC with dual 8800 Ultras in SLI. Each GPU has an aftermarket Thermaltake air cooler that used to be a relatively popular upgrade back in 2006-2007. The whole build is in a limited edition Fatal1ty case that's made completely out of thick gauge sheet aluminum. Even though I haven't turned it on in years, I'll always keep it around!
I love these generational looks. Would be fun to see the history of Radeon now… 😁
Gone full circle coming back to the Radeon 7000 series.
Agreed. He should make another video!
E2 7110 be like : i can run 2D game and 3D polygonal shapes...but can't emulate or play games from 2010s
ATI/AMD has had some really good products, like the 9800 Pro; Nvidia didn't have anything sensible to compete with it. I've never been a fanboy and always disliked them. I think my stack of cards over all my years is about 50/50 between the two. _Always_ buy with a good performance-price ratio.
r9 280 at around 300 euros.
Never forgetting that beast.
Something oddly captivating about late 90's, early 2000's GPU benchmarks. Fantastic video.
It was how imaginative the benchmark demos were.
I remember these nvidia demos for each generation. I could not believe what graphics they can do, back in the days
Graphics for the consumer market were always a bit behind "the curve".
Looking at the past feels so weird, especially considering how recent this was. Whatever that guy's name was that coined the doubling in processing power... man... I can feel it. This vid made me try to get my 360 to not red ring. And it ran for a couple minutes (thank the heavens), and the technically impressive games felt like the kind of indie game a first-timer would make.
It's called Moore's law, FYI
Times have changed. Expected a 40-second meme, but watched the whole 20 minutes of knowledge about something I never got and am never gonna get.
All hail to Gamers without GPU🔥
GeForceNow is here to save me
Without a GPU? That would be my first PC, which was a Commodore 64 back in 1984. You used a cassette tape to load the games, which took forever. Good times. The first game I played on it was Hunchback of Notre Dame. Mesmerising at the time.
Oh I didn't realize this video is 20 minutes long😅. I really enjoy watching it🤩
What was the first GPU that you owned?
A laptop GTX 960M; it stuck with me until the end
@@tahatahiri3124 that's great, did it do well in games that you've played?
Family pc has gt 7200
And my first gpu is gtx 550ti from 2011
GTX 460 1GB
gt 210 and I played crysis
Id love to see an AMD history vid like this
me too 😀
i’ll start working on it very soon!
AMD's GPU history starts with them buying ATI. There was a lot of drama there. AMD and Nvidia were very close before that. ATI was always jumping ahead of Nvidia in tech. They gambled everything on using the Xbox 360 to jump ahead of Nvidia with unified shaders; the Xbox 360 even had tessellation. The Xbox 360 was almost an entire generation ahead of ATI's PC offerings, and many PC gamers jumped to console gaming. That bankrupted ATI, as they didn't get much from Xbox sales, and AMD bought them. Nvidia's 8000 series was their first series with unified shaders and when they introduced CUDA; it was their first PhysX-capable card too. Their base architecture remained the same until the 400 series and never fully supported DX10.1; rather, they jumped to DX11 with the 400 series. AMD, however, stayed on the same architecture and never fully supported DX11. They pushed conspiracy theories rather than innovating. Nvidia went so far as to enable multi-threaded rendering in games not coded for it via a driver update; devs weren't coding for it because AMD didn't support it. DX12 was just Mantle, an API designed for AMD hardware, with some name changes; they still had the same slides in the documentation. DX12 was basically Glide. It didn't improve graphics, but it slowed down Nvidia cards. Nvidia had to change their cards to work better with an older way of doing things, as they were using unified pipes rather than a single hardware scheduler.
@@siali85
@@NikTek thanks
The most notable change between the 7 and 8 series was not the video memory at all. It was the change to a unified shader architecture from the previously fixed vertex and rasterisation pipelines. It was an enormous shift, and one which ATI was first to market with before Nvidia; it was used in the Xbox 360 hardware.
It can't be overstated how big the 8800 GTX was for its time.
Crysis was released shortly after its launch, and the card was an absolute monster for anything at the time that was thrown at it. It was legendary and so many people had them.
Yeah he kind of just blew past that card. I would consider it one of the most significant in Nvidia's history.
It was a beast when it came out. A game changer indeed. I was so excited when went to pick mine up.
Yep, I remember a time when a top-tier graphics card was $500. Still have two of 'em.
*Also made great heaters in the Winter. 🤣
8800 gtx and 1080ti are easily the greatest nvidia gpus to ever exist in history. he probably overlooked 8800 gtx since it came out over a decade ago.
8800 gtx was peak gaming for 90s teenagers
Those old 90’s Benchmarks just hit so much different idk how to describe it.
I remember upgrading my graphics to the 256 in order to get decent frames in Battlefield.
@@ballstothewall38 Ahh the GeForce 256. I remember my brothers PC had one when he was first getting me into PCs. Back when pulling 60 FPS was akin to 120 nowadays.
@@shrekwes4957stable 60 is still impressive standards today imo
Interesting seeing these price predictions and how waaaaay off they are for the 40 series.
He didn't take into account that Biden would toss 3 trillion dollars into an already overheated economy and blow it up.
I laughed when I saw the $699-750 prediction now that it’s out and is $1200.
@@pillington1338 Looking on Amazon, I don't see any 4080s. They have 4090s, priced from $2200 to $2500. That's insane in my book.
@@pillington1338 yeah he was way off but didn't include the problems that have occurred since Biden took over.
@@Vector_Ze you aren't seeing any 4080s because they haven't been released yet
"Today I'll sleep early."
Me 2:22 AM:
My dad bought me a Geforce2 and I used it for years.
As a student I didn't really have much cash so I went for budget AMD GPU after that for many years.
Eventually I got my first job and bought a GTX 580, then another GTX 580 for SLI and then another GTX 580 for tri SLI.
Then I got married and bought a property. With a mortgage on my back I couldn't afford to upgrade for a long time. Until that marriage broke off and we sold/split all our assets. Quit my job, work casual for 3 years before going back into full time work and started earning double of what I used to make.
With those additional cash I finally upgraded to RTX 3090 in 2020.
Marriage is a scam man. ALWAYS keep them in the girlfriend category. That marriage contract is nothing more than a will... with the difference being that a regular will gives away your assets when you die, and a marriage contract gives away your hard earned assets when you divorce.
See how life can get great again when you lose 180 pounds of dead weight. Remember kids, never get married; work on yourself and your hobbies and don't look back. Enjoy your 3090, you deserve it; she can serve you for many years.
The thing that strikes me is how imaginative the demos were back in the day. Such unique art style we can't seem to replicate today.
It's what most early cgi looked like
You can clearly see they weren't really art-directed and were often made by companies that didn't specialize in these types of visuals, so they came out very quirky and unusual, geared towards the specific effects they wanted to showcase.
@@XpRnz they looked much better back then!
Some of it felt eerie but unique, it feels nostalgic like the ps1/2 Era
Luna, my all time favourite.
I remember asking my dad for a 6800 Ultra, but prices in my country were super high, so I ended up with a 6200 AGP instead lol. Damn, time flies
just like the current GPU shortage that has been going on these past 2 years
@@NikTek 2 years ?! Wow, time flies for sure.
I also had an AGP model of the 6200. That was quite a bad card but defined my childhood and powered my favorite games
@@LNCRFT Yes, it was so bad that I had to go to my friends house to play Crysis after school. I remember he had a 7600GTS, and we had to play on low settings. Still, I was jealous.
hahahaha quite a reality check, 6200 was the worst possible new card back then xd
I love repeatedly coming back to this video to see how far we've come with Graphics and Technology
It's crazy how far technology has come. Can imagine in 2030
The MU 4 F X-10 Tendy 🍌
@@adrianafamilymember6427 what
@@adrianafamilymember6427 3:57 "Be nice you need more friends" lol
Nvidia prolly got some crazy shit in the background that they dont want to release yet
MSRP 1999 dollars, TDP 1400W in 2030
256 / first gpu
2 / real time lighting, per pixel lighting, higher triangle counts
3 / shading effects, reflective textures, complex facial geometry
4 ti 4600 / real time volumetric fur, anti-aliasing
fx 5800 ultra / cinematic depth of field blur, lighting effects
6800 ultra / complex vertex and fragment shaders
7800 /
8800 / bouncy physics, realistic faces
9800 /
280 / 1 billion transistors
480 / tessellation
580 /
680 / hair works system
780 / physics, destruction, smoke effects
980 / real time global illumination
1080 / realistic and photographic
2080 / rtx real time ray tracing
3080 / double rtx
4080 / 75%
really
well 1080, not really. It was pretty much just faster and more efficient.
@@TacticalPhoenixYT meaning new software YAY!!!
@@EinSwitzer The real diff was with turing, due to its RT cores. But older cards can still RT, just at worse fps.
@@TacticalPhoenixYT I'm aware of this I Was able to get my 1080Ti Msi Seahawk corsair edition to clock 2.1 max 1.987ghz average max ram based on values then went into games that supported it 60 to 100fps just . like . 2080Ti . searching for missing ram chip.. and channels to more so tech the 1080ti could have 2x the ram based on thread to register... relax I give it all to nvidia or intel who ever's parts I use to get more iNPUT I give them the findings.
One thing that's not conveyed in this video is the increase in _resolution_ that each generation was capable of (as in, a card from the next generation being able to run the same demos and games as the previous generation but at a significantly higher resolution). This is quite an important factor, besides the visuals.
And I'm here still using a 1440x900 monitor from 2008, but it's very special to me even if it isn't 1080p. Plus for some reason it's 75Hz, something I never would've known if I stopped using it for my PS3 (why it's special, it's always been there through the years) and used it for a PC for once. And yes I still game with it
I remember my first video card was a GeForce4 MX440 PCI (not AGP) back in 2003ish; I was blown away by how much more powerful it was versus integrated graphics.
I remember going from a Voodoo 2 12 MB card to the GeForce 256. I cannot even explain how awesome that was. My first computer, however, was an AST 486 at 33 MHz, and that had a 512 KB graphics chip installed; I found that out after buying a game that required a total of 1 MB of video RAM. Oh, glory days.
I remember purchasing two Voodoo 2 cards and running them in SLI. What an eye-opener at the time. It's a pity Nvidia killed off 3dfx after their buy-out, then again, they do reign supreme in anti-competitive practices - right up to today.
I remember needing to buy a 4 MB Diamond Stealth GPU to run Rainbow Six back in '98 😂
@@ChrisM541 3dfx killed itself through anti-competitive practices, seeing them as a victim of Nvidia is the wrong take IMO.
@@phillycheesetake Bullsh#t, lol!
@@ChrisM541 This isn't arguable, 3dfx did kill itself through anti-competitive practices.
They bought out a board partner and tried to bring all manufacturing in-house. They ran into a manufacturing bottleneck and as a result couldn't get their chips to market. As a result, you can still buy new VSA-100 chips for not much money, because chip production vastly over-ran card production.
Nvidia isn't the reason 3dfx doesn't exist, 3dfx is the reason 3dfx doesn't exist. And I say that as a 3dfx fanboy.
I remember getting a new PC with the 8800GTX. For that time it was so expensive and huge, me and my friends couldn't believe it. We even used the exact tech demos shown here to test it out.
Well and the Tech-Demo called Crysis.
Good times.
That’s great, the gold old times :)
I had the plebian 8800 GTS, the 320mb model. I played through Crysis on it, but the poor thing was practically melting down by the end.
Remember playing on machines with core 2 Q6600 and two 8800GTXs in SLi with 4GB of RAM in 2007/08. That was a beast - ran maxed out Crysis all day, everyday like it was calc.exe
“Hey I got a gpu”
“Can it run crysis tho?”
I remember MW on a new P75.
The FX 5800 Ultra, also called "the hairdryer". I remember it for its "Ultra"-loud fans and high temperatures. That was all the rage about it and drove many people to ATI Radeons.
Yep. I had a GeForce 2, 3 and 4. Went Radeon around that time, also because the price and performance of Radeon cards were getting very good around that time.
@@Gatorade69 Radeon card is always best
>Nvidia might price these GPUs a little bit higher. But in my opinion, I think it'll start at the same price of $699 because of AMDs competitiveness.
Nvidia bros got too cocky.
At this point, I would love for Nvidia to show pretty much the same types of videos as their original to showcase just how much their technology has improved and its capabilities.
Trust me or not, i just wanted to check google the evolution of Nvidia GPU’s LMAO 😂
Lol XD
Wait 2hours ago?
Even the vid is uploaded 1 hour ago.
@@captainvenom7252 channel member ig
@@FrostyTheOne_ unlisted viewers ig
Talking about supply and demand :)
The Evolution of Nvidia GeForce Graphics....249USD-->>1999USD
The 4080 is almost double the price you figured. Wild
i love these old demos, they're so cool for some reason
yeah the nostalgia with them is so great
some of them were creepy as hell
@@bigboat8329 true true
try searching demoscene, those that started with 64kb files, they are pretty amazing
@@bigboat8329 you meant nniiiice... simps lol
The increase of video memory on 8800GTX wasn't that important at the time. Unified shader architecture, the raw number of shaders (128 vs 24+8 for 7800GTX), the DirectX 10 support, and the 384bit bus width were all more impressive.
That 384bit bandwidth blew everything else out of the water, even for a few years after
Great video mate, it was at the end of the video that I recognized the logo. Could not believe you are capable of producing amazing memes and super well-designed and commented long videos too.
Thanks a lot, I'm glad to hear the positive feedback from all of you! I'm known for being a tech memer, but topics like these have always intrigued me, so I made this video
Am I the only one who wants to see the rest of the demo videos? 12:15 I came for the graphics cards but I stayed for the videos.
Very interesting video. It's funny how you failed to mention that the GeForce 256 was the first card to use hardware T&L (transform and lighting)... at the time this was its big selling point and the feature that was supposed to set it apart from previous cards.
As someone who has owned 75 percent of these cards, the 280, 580, 780, 980, 1060, 1080ti, 2060 and 3090ti all hold a special place in my heart. This doesn't change the fact that the GeForce3 will forever live in my heart as the first real GPU that was presented to me. Now im just waiting for that 4090ti to come out.
Oh yes, getting my first real gaming card in 2000 was a massive leap in performance and it truly showed me how the hardware is progressing
Why do you waste that much money on sometimes worse cards🤦♂️?
@@oozly9291 I've been building computers for 20 years personally and for different reasons. For instance, I ran the 1080ti for years till the 3090ti came out. I had the 2060 in my wife's build. I built my daughters computer with a 1060 that I got from a pawn shop for almost nothing. I now use the 1080ti for a home NAS. The only reason I want to buy the 4090TI coming out is because I plan on buying the Pimax Vision 8K to get both eyes in 4k up to 90hz in games like DCS. Ehh I don't need it but its a hobby along with being an IT person at work.
Gimme one
@@learningthehardway Let's all be real about why we really buy new cards: bragging rights.
That MSRP for the 4090 is $1600 what an absolutely insane time to be alive.
I am completely disgusted by the price tags. And people bought it. And people will buy the 4080 and the fake 4080. It's gross. If people just told Nvidia how they felt with their wallets, by not buying, then Nvidia would be forced to cut prices to a fair margin.
I remember back in 2016 I built a mid to high-end PC for $1,500 The 980 Ti was like $600 or something. Now we're talking $2,000 and more for the same bracket, with the RTX xx80, non Ti cards alone costing around $1,000. Although it's insane what even mid range PCs are capable of now.
@@frtzkng No GPU should cost over £600. Period.
Actually the price makes sense; people back then would have to pay around $600 for a card that is 10 times slower than the 4090, so it's actually a good deal if you think about it
@@high-octane-stunts86 bruh. It's like saying the world's first computer should cost 5 cents because it's thousands of times weaker than your phone (spoiler: it wasn't cheaper). That GPU was cutting edge at the time, and pretty much the only GPU back then. It would fall in range with the 70-class cards right now; a premium product, but the demand just wasn't there yet. Of the 4000 series, only the 4080 and above are overpriced. The 4070 is slightly overpriced, but the rest is normal. The bigger problem is that Nvidia made them worse for their tier; every card from the 4070 down should have been moved back one tier spec-wise. They leveraged better technology and software to cut corners on the 4000 mid and low range, while overpricing the high range.
I just barely got a retro pc setup! An original GeForce 256 except the DDR version!
Installing it and realizing windows 64 bit was not used until 2003 made me realize just how much stuff has changed.
Really was an interesting time to be alive!
Never expected a long NikTek vid, but I'm not disappointed, this is really good
I’m so glad to hear that thanks!
The graphics in 2012 at 12:45 still look better than nearly all game characters today.
Thats awful alot so that much they used Nvidia characters went used.
Still using my 1080, 10 series was the last time I recall Nvidia ever being reasonable.
ehhhh, the 1660 SUPER is really good, it just doesn't love 1080p. it is really great for VR and 1440p though.
1080 is ancient bro. hopefully you finally catch up with modern times by upgrading to an RTX 4070 when it comes out.
@@angrysocialjusticewarrior depends what you do, the 1080 can run FH5 at like 50FPS, but it doesn't do RTX. Personally, I don't really like "RTX ON" games, it doesn't look quite right.
@@genesisgaming3756 Control looks great. That is the only one that is tho.
@@angrysocialjusticewarrior Lol, judging by your avatar and name I'm guessing you're joking, in which case thanks for the laugh!
My first graphics card was a Voodoo 2. Seeing GLQuake for the first time was absolutely mind-blowing. The step up from base Quake was unreal.
I've watched this evolution myself. Been building my own gaming rigs since 95. This a great trip down memory lane.
Built my first PC in 2010. I swear games back then that ran on the Flash engine were just so good; I never imagined games would be as good as they are today. I didn't even know what a graphics card was, I just ran any game I had
@@f0x106 my first pc had a GPU with 4MB of dedicated ram. 4MB 😆 Holy crap times have changed.
Also a great trip down the ol' wallet. 😅
Cards have changed a lot in the past years, in so many ways. My first (3D) card was a Voodoo 2 and it was a massive upgrade in all technical ways; we just haven't seen a leap like that since those days. Power consumption is one interesting thing too. I still have a 6990 lying around somewhere and boy, did that create a lot of heat.
Nice video but from what I know there are a few mistakes (or info that was missing)
The 8800 was the most expensive card Nvidia ever made until the Titan cards were released
The GTX 480 had a "normal" operating temperature of 105 degrees, so the card shown in this video was still running at a lower temperature than most of these cards did
Also, the 8800 is to my knowledge the first card to support PhysX, meaning that you would be able to use an 8800 as a dedicated PPU card
I really liked your video and loved seeing those old Nvidia demos again
It’s just that I noticed those little mistakes and missing information
The most expensive and ironically the ones that failed the most prematurely.
@@jimmydandy9364 I have to agree
I used to have an 8800 GT from EVGA that wasn't working correctly, and they replaced it with an 8800 GTS that lasted pretty long but eventually also failed; that got replaced with a GTX 460 that also failed and got replaced by a GTX 760 that also failed, and I got a GTX 950 that is currently in an old system and as far as I know still works... that reminds me, does EVGA still provide a 10-year warranty on their cards? Because that was the good part about it
The GeForce 6800 will always be the best in my heart; I had the best days of my life with it
I got into pc gaming in 2001 but never actually had a pc with a graphics card until 2008. I currently have a RTX 3070 in my pc.
Planning to upgrade my PC soon from a 1050 to a 3070
How's ur experience with the 3070?
@@xxgamergirlxx7917 I have an Omen 16 laptop with an RTX 3070 and it's overkill.
I've a RTX 3070 Ti. it's a monster when I play games at 1440p@144fps.
@@ferryry3358 wish I could get a Ti myself.
But the price is already getting out of my budget.
@@xxgamergirlxx7917 yeah understandable. The price was as much as I pay for my apartment monthly. If I haven't saved up that much I couldn't afford this GPU. I had a GTX 960 previously and it is definitely a huge upgrade.
AGP to PCI-E was a massive step too and often overlooked. Bandwidth increase was no joke.
Not really at first, it took a few gens before pci-e was any faster than agp 8x
@@wayn3h I had a 6800 ultra and then they brought out Pcie. After that mobos stopped supporting AGP and PCIE was kicking AGP ass. I felt abandoned.
I remember having ISA and PCI in the same board back in the day. Not too slow for 56k modem, everything else was PCI
I love the hardware from the 2000's when I started Some of the tech is still indestructible today
The GeForce4 Ti 4200 and the 970 have been the best bang for your buck in all of Nvidia's history.
I remember getting into PC gaming around 2007 and wishing I could afford two 8800's in SLI. Ended up with just a single 8600GTS, but it still ran CoD4 very well.
Ah, to be young and poor. Now I'm old and poor.
Your last few sentences made me lol
The GeForce 4 Ti was a beast. It really did very well for quite a long time
Great video! It's great seeing these cards again, even if it makes me feel old 😂One thing that would have been cool is showing the MSRP adjusted for inflation to give an idea of the progress we have seen.
I gotcha!
these are ROUGH calculations & USD only
geforce 256: 444.96
geforce 2: 603.37
geforce 3: 839.31
Ti 4600: 660.29
FX 5800 ultra: 645.58
6800 ultra: 786.44
7800 GTX: 913.10
8800 GTX: 884.57
9800 GTX: 413.44
GTX 280: 897.41
GTX 480: 681.28
GTX 580: 681.28 (same as GTX 480)
GTX 680: 647.05
GTX 780: 829.40
GTX 980: 690.40
GTX 1080: 743.02
RTX 2080: 828.73
RTX 3080: 804.06
I hope I did all this correctly 😅 feel free to correct any mistakes
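For anyone who wants to redo these numbers, here's a minimal sketch of the CPI-ratio method presumably behind the list above. The CPI figures are rough assumptions on my part, so results will differ slightly from the figures posted.

```python
# Inflation adjustment by CPI ratio: price_now = price_then * (CPI_now / CPI_then).
def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical USD price by the ratio of CPI index values."""
    return price * cpi_now / cpi_then

# GeForce 256: $249 at launch in 1999.
# CPI values below are approximate US CPI-U annual averages (assumptions).
cpi_1999 = 166.6
cpi_2022 = 292.7
print(round(adjust_for_inflation(249, cpi_1999, cpi_2022), 2))  # -> 437.47
```

Plugging in different CPI sources (monthly vs. annual averages) is likely why anyone's numbers land a few dollars apart.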
@@theredpanda00 Nice work! Thanks! This helps but the current progress in even more of a perspective 😂
Prices have sharply risen! I miss the value my 1080ti brought once upon a time.
@@theredpanda00 now that we've seen this, it seems the prices stay fairly on the same line, well, except for that 30 series COVID, miners-galore, and scalpers combo we had there
Some of those old Nvidia demos were straight NIGHTMARE FUEL
4:13 Imagine flexing u have a 4060ti early
Oh my god, NikTek got a sponsor, for the first time in his life
Amazing innovation in such a short time. Such an era we live in. I love my rtx 3060 and finally can stream quad hd and play with ultra at the same time. See you on my next live campaign
that’s great! it has been serving you very well
But it's nothing compared to when 3D was a new thing. I got a 3dfx Voodoo2 as my first 3D card and I haven't seen a leap of that kind ever since.
Voodoo Rush, Voodoo2, GeForce2 GTS, GeForce4 Ti, 8800 GTX, GTX 260, GTX 460, and still rocking my GTX 980.
Man, those early demos brought back some memories
8:49 a hard slap to the frog's face :)
poor froggy
:( 🐸
I remember watching the human-head launch demo for the 8800 GTX and wondering to myself if graphics could get any more real than that... I was sooo amazed! It's safe to say that after almost 2 decades, we still keep improving on graphics 😅😅
Looks pretty good even by today's standards
My very first high-end graphics card was a GeForce3. The absolute best on the market, at the time. I definitely remember those demos too, but I also could've sworn I only paid $300 for it.
@enriqueamaya3883 Bro this isn't Africa stop trying to convert people
Back in the day NVIDIA didn't focus solely on money, but now they do. And he thought the 4080 would be priced at $699; I don't think he was right.
Also, are these the only GeForces? Because it feels like there are so many more.
This video was amazing! Just make sure not to give up the memes. They are super funny
sure, they will keep coming too!
I started collecting GPUs since they’re one of the few parts you can use in almost any newer system no matter the PCIe generation and it’s been awesome to try old 6600s and x300s with more modern games and software
I have also been collecting them, mostly because I can't throw anything away. And I've built or upgraded quite a few PCs in the last nearly 25 years.
That was interesting! What got you into PC hardware?
Me personally, it's Crysis in 2007. You needed some knowledge at the time if you wanted to build a pc that "can run Crysis".
but still, can it run crysis ?
@@zwartekatten At the time I could, with an Intel Core 2 Duo 6700 (2.66 GHz x2), 4 gigs of RAM, and an Nvidia 7950 GT 512MB. Averaged 25 fps at high settings, 1024x768. Good old days 😉
@@yan3066 I had to use two Radeon 580s, 512MB each, just to get decent frames in 2007. It wasn't until around 2010, with my GTX 590, that I was able to run Crysis at 720p 60.
Elden Ring
@@yan3066 Puny, I tell you! I had a 6600 quad, 3.25 gigs of RAM (32-bit was something) and an 8800 GTS! My brother had the GTX, and made fun of my puny build too :(
Started with a GT 440, then an R9 270X, a GTX 970, and now finally my lifelong dream: a GPU that maxes out every setting in every game at 1080p, and some even at 1440p and 4K, the AMD RX 6700 XT. Always wanted a card like this, my entire life ♥️😍
Started with a Canopus Pure3D 6MB
@@fredriksvard2603 technology has come so far, I love it :)
It would be so cool to see performance uplift in 1 game across every gpu (within reason - likely 9800gtx and up) to show truly how far we have come. I think the 4090 is roughly 30x more powerful than the 9800gt
More than 30x, probably around 3,000x or more.
@@noelpatata3948 No dude, not that much. It may have 3,000x more transistors, but it's definitely not 3,000x more powerful. Not even close in real-world scenarios (fps).
@@Vartazian360 Maybe a bit out of reality, but it isn't less than 1,000x more powerful.
@@noelpatata3948 Dude, that's still way off base. A 9800 GT was capable of 60 fps in Half-Life. If what you were saying were true, we would be able to run Half-Life at 60,000 fps. Think about what you are saying; it's total nonsense. Even if a GPU that powerful existed, we would not be anywhere near 60,000 fps. Nonsense.
10:55 So true. I remember this card getting so hot you could literally feel the heat with your hand without even touching it.
Grill Force - The way its meant to be grilled 😂🤣😂
Nice, NikTek, you're becoming an information channel. Good job, my friend, keep making videos like this.
i’m glad to hear that, thank you and of course I won’t give up the memes :)
@@NikTek never give up on your memes
I remember the first GPU I ever bought was a GTX 470 in 2014; two years later I bought a GTX 1050 Ti. That thing served me very well until I built a new computer in 2020.
i still have a 1050 ti lol
3:34 My Intel HD 620 be like: Finally found the opponent 😁
The very first GPU I actually owned is an RTX 3070. I was able to get it before its price skyrocketed, all thanks to my brother, who was better at PC stuff than I am.
Stop making us feel old. I actually had the first GPU. That 256 was a very big deal and it was shocking how quickly nVidia presented a replacement.
@@SIPEROTH I might be as old as u are, 😁.. but hey old means gold,cheers
It's also my first but I got it used so it was a good price
Unfortunately the MSRP on the 4080 16GB was not equal to your predictions. :(
What do you think about the 4080 12GB? It should be a 4070.
Your estimated price of the 4080 was hilariously optimistic.
I have fond memories of the GeForce 256 DDR. Coming from many (oh so many) 3D cards with small improvements, then 3dfx and PowerVR (I preferred the PVR over the muddy 3dfx), which were a great upgrade, the GF 256 was an absolute beast with both great visual quality and speed. Although I have bought several ATI/AMD graphics cards since then, I always end up with Nvidia (currently rocking a 3080 Ti).
6:39 origin of "🤓"
Just an FYI, Nvidia didn't invent PhysX. It originally ran on a dedicated hardware-accelerated physics card. Nvidia bought the technology and then integrated it into their GPUs. I still have a PhysX card in my garage.
Yeah, the Ageia PhysX; I have a secondhand one.
The only thing I wish this had included is the prices at launch, as well as adjusted for inflation. That $249 for the first card in 1999 would be about a $450 card today, for example.
Major correction, but tessellation was in retail games using ATI cards six years before Fermi, all the way back to Quake 1. It looked great in Morrowind with the 9c patch, and in UT as well.
The 10xx-series cards were the best bang for the buck. The value they provided was astounding, and they were fairly priced too. Even in 2022, in the most demanding titles, they can sustain 60 fps with properly optimised settings.
They also run pretty cool which can help extend their life span.
Integrated graphics can achieve 60 fps with the shittiest "optimised" settings in the most "demanding" 2022 titles as well. That's not an achievement and a horrible argument. If you never leave 1080p for any game, you can play every single game in existence from here to eternity. 1080p will not be the standard for much longer for a reason.
@@Chris-ey8zf Wtf are you talking about LOL. Try launching even GTA 5, from back in 2013, on ultra settings at 60 with "integrated graphics". That's complete nonsense.
@@Chris-ey8zf It's amazing how wrong you are 😂. Comparing a 1080 Ti to integrated graphics is room-temp IQ.
I remember when the 980 was the hottest thing on the block. I knew one guy who was a fucking rich man. He got an SLI setup and we were all jealous of him. Man… those were the days.
It's amazing to see how far we've come, and how I was already in school when the first consumer GPU was released in 1999. My first ever desktop PC was from CyberpowerPC. Got it from BestBuy. It was the first time I'd ever taken an interest in building and customizing PCs, especially after I switched its old AMD R200 for a GT 730. Been building and modifying ever since. I still remember playing Battlefield during the days leading up to the GTX 900 series being released. GTX 970, 980 and 980 Ti were all anyone in the servers would ever talk about.
Fast forward some years and I just replaced my GTX 1060 6GB with a brand new RTX 4070 Super. I'm blown away at what an absolute menace this thing is. Very happy with it.
3:45 - seriously sounds like you say "shitting effects"
3:57 "Be nice you need more friends" lol
It would have been nice if you'd included whether each card was AGP 4x, 8x, PCIe, etc. Otherwise, neat video. Many memories. I bought my first GPU in 2001 and my most recent in 2021.
What was your first and current GPU?
My Nvidia graphics road started in 2001 with a shared house computer: GF2 (AGP), GF4 (AGP), GF 6600 (AGP). Then our home computer was upgraded and I had my first PCIe card, a GF 8800. When I was 14/15 I spent tens of hours researching forum sites for the "best" hardware combo and built my own "gaming PC" with money I'd saved up, around a GTX 280. It held the crown very well; I only replaced it when I upgraded to a new platform with a GTX 660 Ti, then a GTX 1070 Ti, and now I have a Z690 platform with an RTX 3070 Ti.
P.S. 14:03 The original PhysX engine was actually developed by Ageia; you needed a separate expansion card and special drivers to run this realistic physics engine. Then Nvidia bought the SDK rights in 2008 and integrated it into their ecosystem, so if you had an Nvidia GPU you could just "enable" PhysX and render the realistic physics in demos and the few games that implemented it.
So it was already doable in GTX 280 times (though back then it was still preferred to have a separate, older Nvidia card with at least 256MB of video memory slotted in, which you dedicated to PhysX work in the Nvidia driver hub).
Then around 2011 the first public rumors appeared that Nvidia was working on a new physics engine, and with the 2013 lineup Nvidia released the overhauled PhysX engine. Now it was even more realistic, and dedicated PhysX cards were gone, as it could all be done with one powerful GPU like the GTX 780. 😉