Jedi Outcast shows how great the Quake 3 engine looks even today. Sure, texture resolution is lower and models have fewer polygons, but the aesthetics are just amazing. I would probably use 2x AA at 1024x768 or 1152x864. I notice jaggies and flickering a lot, so smoothing those edges is pretty much a must-have for me, and especially on a CRT, lowering resolution frees some GPU performance without impacting visual quality too much. For the 4:3 games of the time, I would absolutely prefer a CRT.
For NFS Underground I would probably just use a 60 fps limiter. Since you're already using Afterburner + RTSS, the software for that is already installed. That way the game won't speed up when running above 60 fps, but the stutter and input lag are gone. That's something I notice a lot in Source engine games: 60 fps with vsync and 60 fps with a frame limiter feel completely different.
The negligible performance difference from 8x AF shows that even cards 20 years ago could do it with basically no performance hit but a great visual improvement.
The results in Colin McRae 2005 show that even back in the day a midrange card could hit an almost constant 30 fps at a very high resolution, or 45 fps at medium settings at a lower, but common, resolution. By that logic an RTX 4060 should be able to run any 2023 game at a stable 40+ fps in 1080p.
It would be interesting to run that SM3 test at a lower resolution. There are some improvements at 1600x1200, but the difference might be larger at higher framerates. And considering Far Cry at 1600x1200 runs into a VRAM limit, would a 256 MB 6600 run better?
Totally agree about San Andreas. 30 fps feels perfectly at home, and as long as it manages to stay there when things are happening, I'd be perfectly fine with it, because on PS2 it can't even hold 30 fps with explosions.
Considering texture filtering runs on the TMUs and not on the PS, VS or ROPs, there are probably situations where it's possible to use AF pretty much for free.
Jedi Outcast looks great, but we also have the Star Trek: Elite Force games, and they look great too. To be honest I did not try a frame limiter for Underground; I might give it a try, thanks for the tip... and yeah, vsync sucks! Unfortunately I did very limited testing for SM 3.0; I'm planning to do a review of the GF 6800 and will go into more detail then.
Played Jedi Knight 2 on a 23" Diamond SuperBright CRT, at 1600x1200 on a Ti4800SE. Looked incredible back in the day, but the hit detection was atrocious. The engine held up, but the design of the game was lacking; I never finished it. Elite Force was a MUCH better game, despite not looking as good. The Quake III engine never let anyone down, from COD to Heavy Metal: F.A.K.K. 2 to Alice. Great engine.
Not a bad card. Got one last year for the first time and tried it out, and it actually did an okay job. Comparable to a Radeon 9700 perhaps, at least in some titles. Definitely usable back in the day. But nobody would ever have considered one, because the 6600GT was just insane for the price - maybe the best GPU Nvidia ever made. That thing almost matched a 6800 for significantly less money, and even beat a 6800XT (stock) at lower resolutions in some games. That made the 6600 a non-issue, unless you had "x" amount of money and in no way could stretch to a 6600GT. Still, the vanilla 6600 could game, and was probably a worthy upgrade for DX9 compatibility in games like Far Cry and Half-Life 2, and for the newer OpenGL games like Riddick + Doom 3. I tested mine with SiN Episodes and it was totally playable and looked good, but the card did choke a little in COD 2; I had to turn down a lot of settings and run in DX8 mode. That's the real issue with the 6600: too fast for DX7-era stuff where a Ti 4200 / Ti 4400 is easily enough, and too slow for the DX9 transition where you needed a 6600GT / Radeon X800 series to turn on all that eye candy.
If I had to make a list of the best nVidia GPUs ever, I would probably pick:
- Geforce 4 Ti 4200: just a great value card that performed well enough to compete even with budget cards 2 generations later
- Geforce 6600 GT: priced right, with great performance
- Geforce 7300 GT: a bit of personal nostalgia here. A card with an entry-level name, but a cut-down midrange chip that, when overclocked, could come very close to actual midrange cards
- Geforce 8800 GT: a high-end card for a midrange price. It basically destroyed any reason to buy something like an 8600 GTS, and easily demolished the HD 3870
- Geforce GTX 460 1 GB: a great card right where it needed to be
- Geforce GTX 660 Ti: midrange price with performance incredibly close to the GTX 670
- Geforce GTX 1080 Ti: similar to the 8800 GT, a card that aged incredibly well and is even viable (with managed expectations) today in 2024
Now for the worst cards:
- Geforce 4 MX: basically all of them. Outdated API support. But the MX 460 at least had decent performance
- Geforce FX 5200: especially the 64-bit version was just too slow for any use. The FX 5200 Ultra is a much better card, and Radeon alternatives were around that were faster and cheaper
- Geforce 8400 GS: the name suggests a midrange card, but performance was entry level. The updated later revisions were a bit better, but the performance gap between the 8400/8500 cards and the 8600 cards was just too big
- Geforce GT 710: why does this card even exist? It's basically the same chip they used 3 generations earlier, just with a new name. Clearly a rebranding so that OEMs can claim to have new hardware. The GT 730 gets a pass, especially the GDDR5 version, which did adequately for its price
- Geforce GT 1030 DDR4: the performance difference to the GDDR5 model is absolutely insane. A card that shouldn't exist, considering cards like the GT 710 and 730 were available at the same time. The GDDR5 model is a fine budget card
Both lists are sorted by age, not by ranking.
@@HappyBeezerStudios You're basically listing the "bad" cards as all the bottom-end media centre type cards. Not really a fair list. Of course they sucked at gaming - they cost from £50-70. Even going back to Fermi days, entry-level GPUs for gaming started around £140, cards like the GTX460 or the HD 4670, for example. Nobody in their right mind buys a GT430 or HD 4350 to play GAMES, do they? Your top list makes more sense, but the 6600GT blasts the others out of the water if you consider price to performance. The Ti 4200 came close, as it usually overclocked to Ti 4400 levels easily, but out of the box it cost 40% more than a 6600GT did just a year later, while the 6600GT beat it soundly.
@@TheVanillatech the 8800 GT was probably the best price-to-performance card they ever put out. Released for $200, it came within single-digit % of a then $400+ 8800 GTX, and had about 3 times the performance of a $150 8600 GTS. A high-end card of the actual generation for the price of a midrange card. Where is the $300 RTX 4090 SE? Because that would be a fitting equivalent. And obviously the bad cards are all lower end, and they all have a good reason to be bad besides just being slow. Some were just rebadged 3-generation-old low-end cards, others had misleading names that suggested a much better placement, or simply had much better cards available for practically the same price. Their price was pretty much dictated by manufacturing, packing and shipping cost. For higher-placed bad cards, the RTX 4060 probably takes a place, as does the 2080; both were slower than the cheaper card of the previous generation. And the 8600 GT didn't perform like a midrange card; the 9600 GT was much better. The FX 5500 was also an odd one: it was not a midrange card, it was entry level, slower than the 5200 Ultra.
@@HappyBeezerStudios You can't say the 8800GT was great because it smashed the 8600GT - the 8600GT was TERRIBLE, in every way. The 8800GT was a stellar card: it played Crysis for £230 and lasted a good 2.5-3 years as a capable card, so you could skip the 9xxx series and possibly even the GTX 2xx series (Nvidia were simply rebranding and sandbagging back then anyway). BUT it doesn't beat the 6600GT *at all* in terms of value in the midrange. The 6600GT was £140 pretty much from the outset, and was FASTER than the £200 6800XT at lower resolutions, only losing out at 1024x768 or higher. The 6600GT blasted everything else out of the water, from ATI and also from Nvidia's own cards. And although SLI wasn't reliable most of the time, two 6600GTs for £280 beat the 6800 Ultra by a serious margin in many games. The 8800GT was good, but it was no 6600GT. The HD3870 cost just a fraction more than an 8800GT, and pretty much matched it all the way (beat it at QHD resolutions too). The 6600GT was in a league of its own, bro. Trust me. I did the maths many times. (PS. The FX5500 was basically an office card. Great RAMDAC, so nice image quality, but I think DELL bought 80% of the entire SKU to stick in their office machines... not really good for gaming, and yeah, an oddly named card.)
The 8800 GT is a legend of a card. At the time it released for $200 and competed with the $400+ 8800 GTX. It also had about three times the performance of the then $150 8600 GTS. In modern terms it would be as if Nvidia released an RTX 4080 SE with the performance of a 4070 Ti Super for the current retail price of a 4060 Ti. The only real issue was the cooling: a tiny single-slot cooler with a tiny fan. I had one, and it ran at 103°C under regular gaming load. Later I took the metal shroud off the cooler and strapped a pair of 80 mm fans to it. Dropped temps by 30°C!
Ha ha, great vase stand :) Midnight... I found an ABit BP6, the one that supports 2x Celeron Mendocino CPUs, if you know it. It does "start", in the sense that the fans spin and the CPU and NB heat up, but it doesn't beep and no video output appears on either PCI or AGP... it's pretty much gone, unfortunately. I suspected as much the moment I saw the NB cooler was missing. Do you happen to know someone who could repair it? :) It's too precious to hang on the wall as a picture :)
I feel like the RTCW differences might be from PCIe. It has double the bandwidth of AGP 8x, so maybe the game streams lots of textures to and from the card. If you manage to get the same card on both interfaces, it would be worth making a comparison. If both cards are otherwise identical (pipes, clocks, memory, etc.) and the PCIe card is noticeably faster, that would be a good indicator. There is an older test on guru3d with 6800 GT cards where performance is closer together, but that is a 256 MB card. So it might be a situation where a Geforce with 128 MB can benefit from shared memory, while Radeons and 256 MB cards don't.
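The "double the bandwidth" figure in the comment above checks out with back-of-the-envelope arithmetic. A quick sketch using nominal peak rates (illustrative numbers from the bus specs, not measurements from the video):

```python
# Nominal peak bandwidth: AGP 8x vs. first-generation PCIe x16.
# AGP: 32-bit (4-byte) bus, 66 MHz base clock, 8 transfers per clock in 8x mode.
agp_8x_gbps = 66e6 * 8 * 4 / 1e9        # ~2.1 GB/s
# PCIe 1.0: 250 MB/s usable per lane per direction, 16 lanes.
pcie_x16_gbps = 16 * 250 / 1000         # 4.0 GB/s per direction
print(f"AGP 8x:   ~{agp_8x_gbps:.1f} GB/s")
print(f"PCIe x16: ~{pcie_x16_gbps:.1f} GB/s")   # roughly double AGP 8x
```

PCIe is also full duplex (it can transfer in both directions at once), which AGP is not, so the gap for texture streaming to and from the card can be even larger in practice.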
I had a "silent" Gigabyte 6600 back in the day. I was stupid and trusting, and ended up with one of the worst GPUs I ever had. Not because the 6600 itself was bad (it was decent), but because the "silent cooling" resulted in temperatures over 100 degrees C and heavy artifacting.
I had a 6600GT AGP, think it was a WinFast, with a socket 478 2.6GHz Celeron and 512MB RAM. It ran Counter-Strike: Source, Day of Defeat: Source and Gmod at playable framerates. You should add a Source engine game to the test lineup.
Pixel Shader 3 was mostly relevant for late high-end GPUs. For example, my ATI X800 XT aged poorly compared to its competitor, the Geforce 6800 Ultra: the X800 XT was clearly powerful enough to run 2007-2009 games, but commonly refused to launch them due to its Pixel Shader 2 limitation.
The X800 series was FINE compared to the 6800 series. What are you spitting? The X850 XT PE was the DADDY, easily more attractive than the 6800 Ultra. And the 6600GT had enough grunt to run Pixel Shader 3 titles smoothly, such as Far Cry. There were just a handful of games that particular generation that supported (but didn't REQUIRE) Pixel Shader 3. By the time more games were released, Radeon had their X1900 series on shelves, the X1950XTX destroying the 7950GT too! Don't hate on Radeon.
@@TheVanillatech The X800 had amazing horsepower but an outdated pixel shader version, and the fact that a newer GPU was available doesn't excuse that. My X800 XT was great until it wasn't, and the reality is that the new-gen consoles made that GPU obsolete, again due to its pixel shader version. In hindsight, the Nvidia 6800 Ultra was the far better choice, unless you changed your GPUs like socks.
@@h1tzzYT Like I said, hardly ANY games at all used Pixel Shader 3.0 during the X800 series' shelf life! So it's a non-issue. It's like arguing that the 4060 Ti 16GB, a $500 card, is "better" than a $500 7800XT, despite the AMD card being 30-50% faster in games, just because RTX performance is better on the Geforce! When less than 2% of ALL games released on STEAM in the last 12 months even have RT implemented, it's a non-issue! I had a 6800XT, unlocked to a full 6800GT with RivaTuner. Amazing card, loved it. But I'd have traded it in a heartbeat for my friend's FAR BETTER, and FASTER, X850 XT! That card was the best around at the time; it dominated even the 6800 Ultra. Facts.
@@TheVanillatech Nope, many games from 2007-2008 started requiring Pixel Shader 3, and the X800 series / Nvidia 6000 series wasn't THAT old at that time. The higher-end models clearly had plenty of raw horsepower to run newer games at decent framerates, because games that still required only Pixel Shader 2 ran at max settings on my X800 XT no problem, but the shader model requirement completely flipped that on its head. I can name a few games for you: Mass Effect 1 (2007), GTA IV (2008), BioShock (2007), Assassin's Creed (2008), Spider-Man 3 (2007), DiRT (2007), Dead Space (2008) and many, many more.
@@MidnightGeek99 OK, then what version of Windows? I've tried all the versions of Afterburner I could find with XP, but none of them seem to work with XP. Do you use Win7?
Some non-GT 6600 cards had pretty good VRAM OC headroom, but the GT version was still much better. But seriously, in 2004 most of the people I knew had something much worse than a 6600, so yes, it was a pretty OK GPU in 2004 I guess. 🙂
And what about the 6200? Everyone complained about the FX 5200, but no one even mentions the 6200? :D I don't remember that one. But I remember that the GF 7300 GT was not bad!
Firstly, in 2004 ATI had the Radeon X700 Pro 256 for £150. That card was a direct competitor to the GeForce 6600 LE. Secondly, Nvidia's NV3x, NV4x and G7x are rubbish compared to ATI.
Generally OK, but next time you could also test the CPU bottleneck at higher resolutions, not only 800x600. In a case like this, maybe not 1600x1200, because that would be unrealistic for this card, but 1024x768 and 1280x1024 (or x960). You could also test it in more games than just one relatively new and CPU-intensive one. Of course, I understand that it would be more work, so I don't insist. Thanks in advance.
I had this thing. But I never used it because I was a dumbass and didn't know you had to feed it power. So I just put it in a drawer and got the 6200 which didn't need that. smh
@@Pidalin I had an AGP 7600 GS by Gainward and tested it out one day while I was bored. That thing overclocked like you wouldn't BELIEVE! I was in shock at how far I could push it, and the literally linear improvement in framerates from the monster OC turned it into an entirely different tier of card!
@@TheVanillatech I believe it's the same core as the 7600GT, just underclocked; if you got a better chip, you were able to overclock the core back to GT values.
You're such a hidden gem on here. So easy to watch and listen to. Full of info, highly entertaining, great editing, all while making it look easy even though it's a lot of work. Love your accent and humour as well. I'm from the UK and have no accent! Really appreciate your efforts and time 👍
Thank you very much :)
I love your accent, your witty remarks, and your humor, man!! I find all your videos interesting and I've been binge-watching them recently! Keep making more!!
Thank you very much, I appreciate it!
President Iliescu accent
Hey Sir, some months ago we talked on Discord about the GeForce 6600 PCIe (non-GT). Using a Win 7 driver, I got it running on Win 10 and played the current version of LOTR (Lord of the Rings) Online with it. Glad to see you making a video about it, but you did not mention its usability in Windows 7 and 10.
Hey! It's great, I never got to use such old video cards in Win 10, only CPUs, but maybe in the future :)
I did not mention usability in newer OSes, because...time.
I really admire this guy, still releasing videos regularly despite having quite low views and subscriptions.
Thanks?
Thanks so much for that video, as I have the Club GT version for my XP build, which I'll hopefully finish this weekend. I also took your advice & got a 7900GT to go with the E8600. I also bought an 8400GS for £5; I'm still not sure why I bought it, I think I have a problem. Thanks again for your efforts & your playlists are still my go-to sleep aid to stop my mind wandering. Thanks
I'm glad you found a 7900 GT, those were good cards!
Such a coincidence, I was watching your 6600 GT video earlier today.
I think I need to redo that video, I don't like it anymore!
The 6600 GT was an amazing step up from the normal 6600, with the only difference being clock speeds. Same number of pipelines, same memory bus, only clocks.
Don't forget DDR vs GDDR3, GDDR sounds cooler :)
Yep. I had an Inno3D 6600GT in late 2004, and with some OC it was almost equal to 6800 performance. I used it with an AMD Duron 600 running at 1000 MHz... I loved this setup back then. At 1280x1024 it was enough in every game.
Just me who loves his accent?
I don't!
❤
Yaz
@@MidnightGeek99 you have such an amazing accent dude!
I actually had this card, bought it around 2006 or so. My PC had 3MB VRAM, a 900 MHz CPU and about 256MB RAM, so going from 3MB VRAM to 128MB VRAM was... crazy... I played tons and tons and tons of games... but was still behind the graphics cards of that time... and still dreamt of playing the newest games... but at least I could play GTA Vice City and later on GTA San Andreas, though I had more RAM by that time, about 512MB... I fried that PC when I tried to install 1GB of RAM while the computer was on (huge mistake); I think I also had the wrong RAM.
After begging my parents for years and years they bought me another PC, but it was weak and once again I was behind the times... and could only play older games. But I loved and still love older games: Age of Empires, Rise of Nations and many others... Someone give me a time machine to relive those golden times!!!!
What an upgrade :) I bought one in 2005 myself, upgraded my GF2 MX 400.
I remember getting a custom Pentium 4 build with a 5 series NVidia card for free from a friend. It was so nice compared to my slow Celeron Dell office pc lol. Later got a 6600 and maxed out ram but still wanted a bit more as the 360 and ps3 were already out. I found a used C2D e6600 custom build and later bought an evga 9800gt sc. Wow, what an upgrade! This build served me well for a while. An upgrade to a q6600 would've given it a bit more life but I think the upgrade to x58 was more worth it. Barely upgraded from that build a year ago :D Cheers!
The 9800 GT cards were very good, fast and not as prone to breaking as the 8800 cards.
Love the test bench - looks mighty 💪
Hehe, thanks!
Hello, the 6xxx series cards are such great value for fast Athlon XP and Pentium 4 AGP systems. Quite amazed by that huge CPU bottleneck, so thanks for adding that test.
What I love about these passive cards is how easy it is to add an 80mm fan and end up with a bulletproof, cool and quiet solution, but without a lot of fresh air they get toasted. 😂
Yes, the AGP versions of this card are very good, and also compatible with 98.
Wish I could remember what PC I had 21 years ago, but I do remember playing Underground at around 20-30 fps. Back then, the fact that the game ran was all we needed. A framerate over 30 fps was just a bonus :)
You surely had an Athlon XP with GeForce FX 5200 :)
@@MidnightGeek99 :) Never had an FX5200, though I did have an Athlon later. But I think I was on some kind of Duron up until 2004-2005, when I got an Athlon. The GPU is a blank; wish I had kept the receipts from back then :D
I love this card... I have two AGP Geforce 6600GTs in my collection. Honestly, it is overpowered for most of the old platforms with an AGP slot. The PCIe version is a real wonder of value, and it supports SLI!!
An AGP version of this 6600 is amazing, because you can use it in Windows 98.
I have this card sitting on my desk! It still works, though if I wanted to use it I had to point some kind of fan at the cooler, otherwise it would just shut down under load. I did mess around with it recently, and it almost manages to run TF2 at 10 or 20 fps with the lowest possible settings (without modifying the game files) at 800x600. However, it does freeze every once in a while; that 256mb of VRAM is definitely not helping it ;)
With a cooler, and overclocked from 300/500 MHz up to 450/600 MHz, it would have had an excellent price/performance ratio for 2004 and 2005.
Yes :) The good part is that you can add a cooler, it's easy.
What a great idea it was to decouple the ROPs from the Shaders/TMU's. It changed the game forever. I didn't own one at the time, but we sold tons of them at the shop I worked at.
EDIT: Does your 775Dual-VSTA have the AGP bug? Does it list zero megabytes of AGP memory in 3DMark2001? I sold my 775Dual-VSTA because I was tired of compromises. If you don't want the bug, you have to run the earliest BIOS version and forgo any benefits of later BIOS updates. I'm done with VIA chipsets past the KT600.
You are correct, VIA chipsets past KT600 are not that great. I did not pay attention to 3DMark 2001, I'll be looking at it next time, thanks.
VIA were at the top of their game with the later revisions of the KT266, and with the KT333, KT400 and KT600. If you want to run Socket A with something other than nForce, those are great.
There are also some options for Socket 370 which support 1.5 GB RAM, unlike the i815, which is limited to 512 MB. A bit slower than the Intel chipsets, but a fine alternative if you want more memory or ISA slots.
Back in 2005, I built my first PC to play Counter-Strike 1.6 and Day of Defeat 1.3 on Windows XP, with an Athlon 64 3000+ s754 (stock cooler), 1 or 2 1GB DDR-400 sticks of RAM (3x1GB later on), an ASUS K8N motherboard and an XFX Geforce 6200 256MB DDR2 (64-bit memory bus) AGP 8x graphics card with passive cooling. Not long after, I added the cheapest aftermarket active cooler I could find for the Geforce 6200. It sucked ass, but it prevented overheating and throttling. All that in the cheapest case, with the cheapest power supply and HDD I could get. Do you think you have what it takes to endure that pain?
In 2005 I was still rocking an Athlon XP 1700+ and GeForce2 MX 400 :)
Man... these games are my childhood...
Mine also!
17:17 Nice trick there.
Yeah, this card gets pretty hot. It easily reaches 80°C for me in Thief: Deadly Shadows (you should add it to your benchmark list, since it's such a demanding game for its time), and when I tried Dark Messiah with HDR on, it went even higher 🔥
It got to 80 for me also, I don't like using it at those temperatures, I'm afraid it's going to die!
@@MidnightGeek99 Sooner or later, all GPUs die.
Speaking of dying GPUs, I had a GTX 460 die on me while playing STALKER SoC. The thing with the STALKER games is that their in-game vsync doesn't work, so the fps is uncapped. You can get something like 1000 fps in menus and a few hundred while playing, which causes stress and, most importantly, heat on the GPU.
Since then I always try to cap the fps to my monitor's refresh rate :)
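The fps cap mentioned above is worth spelling out: a frame limiter simply sleeps away the unused portion of each frame's time budget, so the GPU idles instead of rendering extra frames. A minimal illustrative sketch (the 1 ms sleep is a hypothetical stand-in for real rendering work, not anything from RTSS itself):

```python
import time

def run_capped(target_fps: float, frames: int) -> float:
    """Simulate `frames` frames without exceeding target_fps; return achieved fps."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        time.sleep(0.001)                  # stand-in for actual rendering work
        spent = time.perf_counter() - frame_start
        if spent < budget:                 # frame finished early:
            time.sleep(budget - spent)     # sleep off the rest instead of rendering more
    return frames / (time.perf_counter() - start)

print(f"achieved ~{run_capped(60, 30):.0f} fps")   # close to, but not above, 60
```

Real limiters like RTSS use more precise timing than `time.sleep`, but the principle is the same: every frame costs at least its full time budget, so the card never runs flat out in menus.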
Hi MNG,
Love your channel!
I have your same condition 😂
Maybe you could make some videos on great obscure games from the 2000s?
We all have our favorites, like Robot Arena 2, Age of Mythology, Neighbours from Hell, etc.
Much love from your Serbian neighbor! 😊
Hey, thanks!
Obscure 2000s games? Sure, why not, I have a plan to cover some lesser-known games.
There are certain aftermarket passive coolers that'd be beefy enough to properly handle the 6600. For example the Thermaltake Schooner...if modern and high quality thermal paste is used during assembly. FX 5700 temperatures were fine (I could withstand touching the copper).
I'd only get a 6600 now if it was by buying the original 6200 and unlocking it. The 6600's core clock is the same as the standard 2D clock of the (AGP) 6600GT, BTW.
Or just add a fan on top :)
@@MidnightGeek99 exactly. It already has a large passive cooler, all it needs is airflow to get the warm air away from it.
Oh, I see a Zalman CNPS 9500 and Corsair XMS2 memory on a Gigabyte P35 board. Good memories.
My RAM died at some point, but the board is still powering my midrange retro system and the Zalman had to make room for a better cooler at some point.
And a passive card? with a big cooler? That basically screams for a case fan and overclocking.
Oh, and I would probably switch the board/CPU combo. The P35 supports FSB1333 chips just fine, while the 775Dual-VSTA only has official support for FSB1066. So if you intend to oc, both boards should be no limiting factor for the other CPU.
A well running Gigabyte P35 should be able to do 500 MHz/2000 MT FSB just fine, which could bring the E6600 to 4.5 GHz. But 3.6 GHz on 9x 400 is probably a smoother option with less need for cooling.
I'm tamed in regards to overclocking, capacitors don't like that :)
Jedi Outcast shows how great the Quake 3 engine looks even today.
Sure, texture resolution is lower, models have fewer polygons. But the aesthetics are just amazing.
I would probably use 2x AA at 1024x768 or 1152x864. I notice jaggies and flickering a lot, so smoothing those edges is pretty much a must-have for me, but especially on a CRT, lowering the resolution frees some GPU performance without impacting visual quality too much. And for the 4:3-oriented games of the time, I would absolutely prefer a CRT.
For NFS Underground I would probably just use a 60 fps limiter. Since you're already using Afterburner + RTSS, the software for that is already installed. That way the game won't speed up when running >60 fps, but the stutter and input lag are gone. Something I notice very much in Source engine games. 60 fps with vsync and 60 fps with a framelimiter are a completely different feeling.
And the negligible performance difference from 8xAF shows that even cards 20 years ago could do that with basically no performance difference but great visual improvements.
The results in Colin McRae 2005 show that even back in the day a midrange card could hit an almost constant 30 fps at a very high resolution, and 45 fps at medium settings at a lower but common resolution. That means an RTX 4060 should be able to run any 2023 game at a stable 40+ fps at 1080p.
It would be interesting to run that SM3 test at a lower resolution. There are some improvements at 1600x1200, but the difference might be larger at higher framerates.
Considering Far Cry at 1600x1200 runs into a VRAM limit, would a 256 MB 6600 run better?
Totally agree with San Andreas. 30 fps feel perfectly at home. As long as it manages to stay there when things are happening, I'd be perfectly fine with it, because on PS2 it can't hold 30 fps with explosions.
Considering texture filtering runs on the TMUs and not on the PS, VS or ROPs, there are probably situations where it is possible to use AF pretty much for free.
Jedi Outcast looks great, but we also have the Star Trek: Elite Force games, they also look great.
To be honest I did not try a framelimiter for Underground, I might give it a try, 10x for the tip...and yeah, vsync sucks!
Unfortunately I did very limited testing for SM 3.0, I'm planning to do a review for the GF 6800, I will go into more details then.
Played Jedi Knight 2 on a 23" Diamond SuperBright CRT, at 1600x1200 on a Ti4800SE. Looked incredible back in the day, but atrocious hit detection. The engine held up, but the design of the game was lacking. I never finished it.
Elite Force was a MUCH better game, despite not looking as good.
Quake III engine never let anyone down, from COD to Heavy Metal FAKK to Alice. Great engine.
Not a bad card. Got one last year for the first time and tried it out, and it actually did an okay job. Comparable to a Radeon 9700 perhaps, at least in some titles. Definitely usable back in the day. But nobody would ever have considered one, because the 6600GT was just insane for the price - maybe the best GPU Nvidia ever made. That thing almost matched a 6800 and did it for significantly less money, even beat a 6800XT (stock) at lower resolutions in some games. That made the 6600 a non-issue, unless you had "x" amount of money and in no way could stretch to a 6600GT.
Still - the vanilla 6600 could game, and was probably a worthy upgrade to DX9 compatibility for games like Far Cry and Half-Life 2, and for the newer OpenGL games like Riddick + Doom 3.
I tested mine with SiN Episodes and it was totally playable and looked good, but the card did choke a little in COD 2; I had to turn down a lot of settings and run it in DX8 mode.
That's the issue with the 6600 really. Too fast for DX7 era stuff where a 4200Ti / 4400Ti is easily enough, and too slow for the DX9 transition where you needed a 6600GT / Radeon X800 series in order to turn on all that eyecandy.
That's right, the card was there, in the middle, but I had one, and it served me well.
If I had to make a list of the best nVidia GPUs ever, I would probably go with:
- Geforce 4 Ti 4200, just a great value card that performed well enough to compete even with budget cards 2 generations later
- Geforce 6600 GT, priced right, with great performance.
- Geforce 7300 GT, a bit of personal nostalgia here. A card with an entry level name, but a cut down midrange chip that, when overclocked, could come very close to actual midrange cards.
- Geforce 8800 GT. A high-end card for a midrange price. It basically destroyed any reason to buy something like an 8600 GTS, and easily demolished the HD 3870.
- Geforce GTX 460 1 GB, a great card right where it needed to be.
- Geforce GTX 660 Ti, midrange price with performance incredibly close to the GTX 670
- Geforce GTX 1080 TI, similar to the 8800 GT, a card that aged incredibly well and is even viable (with managed expectations) today in 2024
Now for the worst cards:
- Geforce 4 MX, basically all of them. Outdated API support. But the MX 460 at least had decent performance.
- Geforce FX 5200, especially the 64 bit version was just too slow for any use. The FX 5200 Ultra is a much better card, and Radeon alternatives were around that were faster and cheaper.
- Geforce 8400 GS, the name suggests a midrange card, but performance was entry level. The updated later revisions were a bit better, but the performance difference between the 8400/8500 cards and the 8600 cards was just too big.
- Geforce GT 710, why does this card even exist? It's basically the same chip they used 3 generations earlier, just with a new name. Clearly a rebranding so that OEMs can claim to have new hardware. The GT 730 gets a pass, especially the GDDR5 version, which did adequately for its price.
- Geforce GT 1030 DDR4, the performance difference to the GDDR5 model is absolutely insane. A card that shouldn't exist, considering cards like the GT 710 and 730 were available at the same time. The GDDR5 model is a fine budget card.
Both lists are sorted by age, not by ranking.
@@HappyBeezerStudios You're basically listing the "bad" cards as all the bottom-end media centre type cards. Not really a fair list. Of course they sucked at gaming, they cost from £50-70. Even going back to Fermi days, entry-level GPUs for gaming started around £140, cards like the GTX460 or the HD 4670, for example. Nobody in their right mind buys a GT430 or HD 4350 to play GAMES, do they?
Your top list makes more sense, but the 6600GT blasts the others out of the water, if you consider price to performance. The 4200Ti came close as it usually overclocked to 4400 levels easily, but out of the box it cost 40% more than a 6600GT did just a year later, while the 6600GT beat it soundly.
@@TheVanillatech the 8800 GT was probably the best price to performance card they ever put out.
Released for $200 but came within single digit % to a then $400+ 8800 GTX and about 3 times the performance of a $150 8600 GTS. A high-end card of the actual generation for the price of a midrange card. Where is the $300 RTX 4090 SE? Because that would be a fitting equivalent.
And obviously the bad cards are all lower end, and they all have a good reason to be bad besides just being slow. Some were just rebadged 3-generation-old low-end cards, others had misleading names that suggested a much better placement, or simply had much better cards available for practically the same price. Their price was pretty much dictated by manufacturing, packaging and shipping costs.
For higher placed bad cards, the RTX 4060 probably takes a place, as does the 2080, both were slower than the cheaper card of the previous generation.
And the 8600 GT, it didn't perform like a midrange card. The 9600 GT was much better.
The FX 5500 was also an odd one, it was not a midrange card, it was entry level, slower than the 5200 Ultra.
@@HappyBeezerStudios You can't say the 8800GT was great cos it smashed the 8600GT, the 8600GT was TERRIBLE, in every way.
8800GT was a stellar card, it played Crysis for £230 and lasted a good 2.5-3 years as a capable card, you could skip the 9XXX series and possibly even the GTX 2XX series (Nvidia were simply rebranding and sandbagging back then anyway). BUT it doesn't beat the 6600GT *at all* in terms of value in the mid range.
6600GT was £140 pretty much from the outset, and was FASTER than the £200 6800XT in lower resolutions, only losing out at 1024x768 or higher. The 6600GT blasted everything else out the water, from ATI and also from Nvidias own cards. And although SLI wasn't reliable most of the time, two 6600GT's for £280 was beating the 6800Ultra by a serious margin in many games.
8800GT was good, but it wasn't no 6600GT. The HD3870 cost just a fraction more than an 8800GT, and pretty much matched it all the way (beat it at QHD resolutions too). The 6600GT was in a league of its own, bro. Trust me. I did the maths many times.
(PS. FX5500 was basically an office card. Great RAMDAC so nice image quality, but I think DELL bought 80% of the entire SKU to stick in their office machines .... not really good for gaming, and yeah, an oddly named card.)
I hadn't even heard of this video card; now I'm seeing it on your YouTube channel.
I really didn't know this video card existed 😮😮😮
Well, now you know :)
Can you do a video next on the 8800 GT? (my first GPU)
I don't have an 8800 GT :) But I'm looking for one, and I will be reviewing it, sure.
I have one :)))) @@MidnightGeek99
The 8800 GT is a legend of a card.
At the time it released for $200 and competed with the $400+ 8800 GTX. It also had about three times the performance of the then $150 8600 GTS
In modern terms it would be as if nVidia released a RTX 4080 SE with the performance of a 4070 Ti super for the current retail price of a 4060 Ti
The only real issue was the cooling. A tiny single-slot cooler with a tiny fan. I had one, and it ran at 103°C under regular gaming load. Later I took the metal shroud off the cooler and strapped a pair of 80 mm fans to it. Dropped temps by 30°C!
Pulled up my benchmark spreadsheet for fun, in 2004 my 6800GT gave me 5490 in 3DMark05, on an Athlon 64 3200+
The 6800 GT is miles ahead of the 6600.
There were some custom 6600 versions with DDR3 memory. Those were a fantastic budget buy if you could find them.
:O I don't remember them...DDR3 means memory OC, for sure.
Ha ha, great vase stand :) Midnight... I found an ABit BP6, the one that supports 2x Celeron Mendocino CPUs, if you know it. Sure, it "starts" in the sense that the fans spin and the CPU and NB warm up, but it doesn't beep and there's no video output on either PCI or AGP... it's pretty much gone, unfortunately. I suspected as much the moment I saw the NB cooler was missing. Do you know anyone who could repair it? :) It's too precious to hang on the wall as a picture :)
Unfortunately I don't know anyone who does repairs; I myself have a pile of components that need fixing.
I feel like the RTCW differences might be from PCIe. It has double the bandwidth of AGP 8x, so maybe the game streams lots of textures to and from the card. If you manage to get the same card on both interfaces, it would be worth making a comparison. If both cards are otherwise identical (pipes, clocks, memory, etc.) and the PCIe card is noticeably faster, that would be a good indicator.
There is an older test on guru3d with 6800 GT cards where performance is closer together, but that is a 256 MB card. So it might be a situation where a GeForce with 128 MB can benefit from shared memory, while Radeons and 256 MB cards don't.
Yeah, some tests have weird results, but it takes time to discover what exactly went wrong.
I've got the 256MB AGP version of this card. It's cooled by Gigabyte's heatpipes and heatsinks. A silent one as well, that also gets hot lol.
The Gigabyte version should be better cooled than this, this Asus simply needs some air blown into it.
@@MidnightGeek99 It gets hot to the point you can't touch it lol. It's the AGP version with a PCIe bridge chip on the back side of it.
I had a "silent" gigabyte 6600 back in the day. I was stupid and trusty and ended up with one of the worst GPUs I ever had. Not because 6600 itself was bad (it was decent), but because "silent cooling" ended up resulting in temperatures over 100 degrees C and heavy artifacting.
Yeah, like I was saying :))
An SM3 and WDDM card was a fing spaceship back then.
The beginning of SM 3.0...
Finally you added GTA San Andreas to the testing series!
I've used SA in other videos, but yeah, not that often.
I had an AGP 6600GT, think it was a WinFast, with a Socket 478 2.6GHz Celeron and 512MB RAM. It ran Counter-Strike: Source, Day of Defeat: Source and Gmod playably. You should add a Source engine game to the test lineup.
I should, thanks for the tip!
I think people would be using this card to mostly play at 1024x768 at the time, and this would make it easier for it to keep the higher settings
Yes, maybe, but in 2004 you could push to 1280x1024, especially if you had an LCD.
Pixel Shader 3 was mostly relevant for late high-end GPUs. For example, my ATI X800 XT aged poorly compared to its competitor, the GeForce 6800 Ultra: the X800 XT was clearly powerful enough to run 2007-2009 games, but commonly refused to launch them due to its Pixel Shader 2 limitation.
That's right, if you were still using X-series cards later, in 2010, you might have issues with some games, for example BioShock.
X800 series was FINE, compared to the 6800 series. What you spitting? The X850 XT PE was the DADDY, easily more attractive than the 6800 Ultra.
And the 6600GT had enough grunt to run PS3 titles smoothly such as Far Cry.
There were just a handful of games out that particular generation that supported (but didn't REQUIRE) Pixel Shader 3. By the time more games were released, Radeon had their X1900 series on shelves. The X1950XTX destroying the 7950GT too!
Don't hate on Radeon
@@TheVanillatech The X800 had amazing horsepower but an outdated pixel shader; just because a new GPU was available doesn't excuse that. My X800 XT was great until it wasn't, and the reality was that the new-gen consoles made that GPU obsolete, again due to the pixel shader version. In hindsight, the Nvidia 6800 Ultra was the far better choice, unless you changed your GPUs like socks.
@@h1tzzYT Like I said, hardly ANY games at all used Pixel Shader 3.0 at the time of the X800 series shelf life! So it's a non issue.
It's like arguing that the 4060Ti 16GB, a $500 card, is "better" than a $500 7800XT, despite the AMD card being 30-50% faster in games, just because RTX performance is better on the Geforce! When less than 2% of ALL games released on STEAM in the last 12 months even have RT implementation, it's a non issue!
I had a 6800XT, unlocked to a full 6800GT with RivaTuner. Amazing card, loved it. But I'd have traded it in a heartbeat for my friends FAR BETTER, and FASTER, X850 XT! That card was the best around at the time, dominated even the 6800 Ultra. Facts.
@@TheVanillatech Nope, many games from 2007-2008 started requiring Pixel Shader 3, and the X800 series/Nvidia 6000 series weren't THAT old at that time. The higher-end models clearly had plenty of raw horsepower to run newer games at decent framerates, because games which still required only Pixel Shader 2 ran at max settings on my X800 XT no problem, but shader model requirements completely flipped it on its head.
I can name a few games for you: Mass Effect 1 (2007), GTA 4 (2008), BioShock (2007), Assassin's Creed (2008), Spider-Man 3 (2007), DiRT (2007), Dead Space (2008) and many many more.
I bought an AGP one to replace my Ti 4200 when Oblivion came out; the Ti 4200 was not able to display the game.
I've also played Oblivion on a 6600, it ran... ok :)
What version of Afterburner do u use with 6600?
4.63, with rivatuner statistics server 7.xx
@@MidnightGeek99 Ok, then what version of Windows? I've tried to use all versions of Afterburner, which I found, with XP, but it seems to be not working with XP. Do u use Win7?
@@mistersanya6066 What exactly is not working? The overlay display, or does Afterburner not start at all?
"more on that later" but you never get back to it xD
It's a trap!!!
is it?@@MidnightGeek99
Some non-GT 6600 cards had pretty good VRAM OC potential, but the GT version was still much better. But seriously, in 2004 most of the people I knew had something much worse than a 6600, so yes, it was a pretty OK GPU in 2004 I guess. 🙂
By itself it was OK; the problem was the 6600 GT, because it was a lot faster.
Can you do this with an Nvidia 6800 GT?
I don't have one, they are hard to find :(
And what about the 6200? Everyone complained about the FX 5200, but no one even mentions the 6200? :D I don't remember that one. But I remember that the GF 7300 GT was not bad!
The 128-bit 6200s were OK, decent.
Oh, the silent coolers.
Almost always it meant a dead GPU...
Hehe, yeah.
Firstly, in 2004 ATI had the Radeon X700 Pro 256 for £150. That card was a direct competitor to the GeForce 6600 LE. Secondly, Nvidia's NV3X, NV4X, and G7X are rubbish compared to ATI.
The X700 Pro was very good, but unfortunately they were rare.
Generally ok, but next time you could also test CPU bottleneck at higher resolutions, not only 800x600. In a case like that, maybe not 1600x1200, because it would be unrealistic for this card, but 1024x768 and 1280x1024 (or x960). You could also test it in more games than just one relatively new and CPU-intensive one. Of course, I understand that it would be more work, so I don't insist. Thanks in advance.
I did, but I forgot to add those tests as well, here you go:
Far Cry 640x480 minimum: 182 C2D / 107 P4
Far Cry 800x600 maximum: 64 C2D / 53 P4
Doom 3 640x480 low: 143 C2D / 81 P4
Doom 3 800x600 medium: 79 C2D / 72 P4
Doom 3 1600x1200 very high: 22 C2D / 23 P4
@@MidnightGeek99 ok, thanks
I had this thing. But I never used it because I was a dumbass and didn't know you had to feed it power. So I just put it in a drawer and got the 6200 which didn't need that. smh
:)))
😂 I owned this very card. I was poor at the time. It did run early wow
I owned a 6600 (AGP) from 2005 to 2007ish, it was very good for me :)
Like
Don't worry, the GeForce 7600 GS is also a hot card with passive cooling 😂😂
The 7600 GS has a much better chip in terms of energy efficiency; you can cool it with a wet finger. The 6600 ran much hotter.
@@Pidalin I had an AGP 7600 GS by Gainward and tested it out one day while I was bored. That thing overclocked like you wouldn't BELIEVE! I was in shock at how far I could push it, and the literally linear improvement in framerates from the monster OC turned it into an entirely different tier of card!
@@Pidalin The passively cooled Asus 7600 GS is in my collection, and the in-game temperature is 75°C without a case fan... 😊
@@Agoz8375 which is ok, with 6600GT and stock cooler, you can have even 95°C in game 😀
@@TheVanillatech I believe it, it's the same core as the 7600GT, just underclocked; if you got a better chip, you were able to overclock the core back to GT values.
I have a Gigabyte GeForce 6600 128MB AGP card with a 400MHz core and 500MHz VRAM. Definitely better than the stock 300/500.
Yes, the default 300 MHz is not great for a 2004 card!
0:45 that is not nice 😂
No but it's useful :)