Probably my favourite GPU generation. The Radeon X800 was faster in general, but the GF 6800 had newer features (SM3.0).
I remember how everyone was buying the GeForce 6800 LE and then unlocking processing units through BIOS mods. It had 8 pixel pipelines enabled by default, but some people managed to get 12 units, and some even got 16 and, through overclocking, made it practically a 6800 GT with slightly slower memory.
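Back-of-envelope, the appeal of those unlocks is easy to see: theoretical pixel fillrate scales with pipeline count times core clock. A quick sketch (the clock figures are approximate reference specs from memory, so treat them as illustrative):

```python
# Theoretical pixel fillrate = pixel pipelines x core clock.
# Clocks below are approximate reference specs, not measured values.
def fillrate_mpix(pipes, core_mhz):
    """Theoretical fillrate in megapixels per second."""
    return pipes * core_mhz

print(fillrate_mpix(8, 300))    # 2400 - 6800 LE as shipped (~300 MHz, 8 pipes)
print(fillrate_mpix(16, 300))   # 4800 - all four quads unlocked
print(fillrate_mpix(16, 350))   # 5600 - unlocked and overclocked to ~6800 GT clocks
```

The memory stays at the LE's slower clocks, which is exactly why an unlocked, overclocked LE lands at "practically a 6800 GT with slightly slower memory."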
This actually mattered a lot. My brother was able to game on one of these until upgrading to a GTX 260 in early 2009, thanks to SM3 support. An X800 would have been obsolete much earlier, since many games from as early as 2007 required SM3. That 260 lasted him until 2015, so the GeForce 6 could have set you up for a path of minimal upgrades. An X800 purchase would certainly have had him upgrading to an 8800 GT instead, over a year earlier, and that certainly wouldn't have lasted him as long.
Mine too
If he'd had the foresight to hold out until the 5870, he could realistically have gone 6800 -> 5870 -> 980 and still be gaming on his third GPU since 2004. Crazy to think about.
@@sgdude1337 Sometimes I wonder how well I could optimize my PC upgrade history, knowing what I know now about the hardware and what software/games I ended up actually playing.
omg, PixelPipes! It's been forever since your vids popped up in my feed - so glad to see them again!
Finally frequent uploads. I genuinely missed your videos and style
Holy crap! You're back. I love your channel man. Glad to see new videos :)
I have the 256MB X800 Pro (AGP 8x version). I've owned it since 2004 and finished both Doom 3 and HL2 on it. Love that card.
I own that card's AGP sibling, the X850 Pro, with a stinking huge Molex power connector.
My first GPU was a Sapphire Radeon X800 and I sort of wish I'd kept it. Got a Sapphire Radeon HD 5850 Toxic after that, which I also regret not keeping. Probably the sickest looking GPU ever.
Good to see you back. I like the way you edit your videos, and it's fun to watch these GPUs in retrospect.
Thanks for that. I really enjoy your channel; it brings us back, and we can get excited about little things, like running Doom 3 or Call of Duty.
Just bought a Club3D Radeon X800 RX/GT from a flea market last week. Good catch: it cost me only 5€ and works fine. Thanks for the video :)
Love that time. New generations of GPU, amazing games like Far Cry, HL2, Doom 3....
I once had a 6800 with 128MB; it served me well for quite some time and was a nice upgrade from my previous FX 5900 XT. Now I own a 6800 GT for one of my retro PCs. It has to serve as a replacement for my amazing Gainward 7800 GS 512MB AGP with unlocked pixel pipes, which sadly began to show artifacts.
Good idea! I like the format. Not everything has to be fully scripted and produced. Keep up the good work. I'm glad to see you're back.
I have the AGP versions of both these cards, but in a twist of fate the memory configurations are reversed (the 6800 has 128MB and the X800 has 256MB). I was able to unlock all the pipes on both cards for extra performance.
For me, all my X800 cards leave the 6800 AGP I have in the dust; they all have GDDR3 or 256MB+. The 6800 Ultra is a bit faster than some, but the two Platinum Editions leave everything in the dust. Great cards.
@jam8076 Yeah I upgraded from the 6800 AGP to the X800 256 and after unlocking the cores it was significantly faster than the 6800.
I was only able to afford the 6600 GT at the time, but it was a fantastic generation of cards, and games.
I had a 6600 GT before upgrading to a 6800 Ultra; the card was great bang for the buck. Nvidia's current 60-series cards don't compare.
Great card, that... overclocked, it was as fast as a 6800 GT in all but huge resolutions with AA and AF.
Love this segment, and I could recall everything as you said it, like it was yesterday. I loved this era; I actually bought a 6800 NU back then that was capable of unlocking to 16 pipes. The only bottleneck was the 128MB of VRAM, but it was fast. Nowadays I buy bucketloads of X800 cards because of one feature: ATi TruForm :)
Those are the most desirable from that era
Awesome! A new video. Thanks for making my day.
Good to see you back.
I just wanna say what a blessing it always is to see new content from you. I bought one of your "Maximum Rage" shirts - love it, dude. Keep doing what you love, and I, along with everyone else here, will keep enjoying it. :)
Hey thank you! Hope you've been enjoying the shirt!
I miss Ruby. AMD should bring her back.
Good video, Nathan. Around this time, I believe I had the X800 Pro Vivo, though looking back, I think my CPU was restricting its performance by a fair margin.
I had an AGP 6800 GS and flashed the 6800 GT BIOS to it to unlock the 4 locked pipelines (12 to 16). A huge performance boost for free.
you sure you don't mean the 6800 LE? I remember doing it with a few of them.
Ah yes, remember when $199 graphics cards still came with a 256-bit memory bus?
Boy that is long ago.
We're already at the point of "Back in my day" stories lol
@PixelPipes I'm soooooooooooo glad to see you making GPU videos again. Keep them coming, and know that your contribution to this hobby/pastime is greatly appreciated!
Good video. The 6800 wins because you can play another 5 years' worth of games thanks to Shader Model 3 support.
I'd have added a little overclocking to the comparison, to spice things up; sometimes these upper-midrange cards have quite a bit of room for improvement, especially if they use underclocked memory modules ;)
Related to the X800...I actually miss my X700 pro. That was the first non-crap ATI card I owned.
9800 pro says hello...
X700 Pro was really good
@@PixelPipes About 15% slower than the GeForce 6600 GT in my benchmark results with an Athlon 3700+ and 2GB RAM, but still a decent midrange card for the day. It beats the 6600 non-GT by far. The 6600 GT is the clear winner against the X700 Pro, but the Radeon X700 Pro was a bit cheaper. ATI had planned to release the X700 XT, a higher-clocked X700 Pro that would come closer to the 6600 GT (though the 6600 GT would still win), but production costs were too high, so they decided to skip that card and not bring it to market. Instead they focused on the Radeon X800 non-Pro, which beats the 6600 GT, and reduced its price so the X800 became the 6600 GT's competitor, even though technically it's on par with the GeForce 6800 (which isn't much faster than the 6600 GT anyway). The 6800 was much more expensive for only a little more performance than the 6600 GT. The 6600 GT had the best price/performance for most gamers, and it was fast enough to handle 2004 and even 2005 games at 1280x resolutions with maximum details and anti-aliasing.
Flashback memories... man, that's so long ago. I'm not 100% sure, but I think I bought the X800 GTO? It definitely had GTO in its title/marketing.
GTO came late to the game, but was slightly faster than the GT if I recall.
Had the 6800 Ultra; upgraded from a 6600 GT when I found one open-box, dirt cheap. Was a great card for the time.
I used to have an X800 AIW with a digital TV tuner and video capture. I only stopped using it as a second card when I moved to Vista, as there were no drivers for the tuner. I had a 6800 at one point and it was awful; it only had 128MB of DDR1, so it was painfully slow for a 6800. Ditched that for an X1900 XT AIW. I think this was back when I was playing UT2004.
2004 nostalgia
Back in the day, I bought a 6800XT AGP on a great deal, a freaky deal, just £25 more than a 6600GT. It also turned out to be an A0-revision PCB, and let me unlock all the masked vertex/pixel pipes with RivaTuner. Was a great card! Insane for what I paid for it! Had been set on a 6600GT but managed to get super lucky on eBuyer that day. Played Riddick Butcher Bay amazingly well.
Now I have a ton of X800 cards. X800XL, X800GT, X800GTO, X800Pro.... all great cards!
The era of unlockable NV40 chips was a great time
@@PixelPipes Yeah I mean, I hoped - but I never expected to get that lucky. I'd read an article online about how it was possible, but only on certain boards. When it arrived, I was overjoyed. It was a Leadtek model. Said "Rev. A0" on the corner of the board!
First pass of 3Dmark 03 netted me around what a 6600GT got. But I unlocked just half the masked pipes, and it went up to 9000. Then I unlocked all of them, and scored over 11,000. No artifacts! But out of caution, I settled on leaving half of them locked. Performance was still great, and I didn't wanna stress my new purchase. Still plenty of power! For a little more than the cost of a 6600GT, the card I was about to buy, I ended up with essentially a 6800GT.
Shoulda bought a lottery ticket that week! XD
@technooby220 Yeah the XT was the step down from the vanilla. Unlocking half the masked pipes brought me to, essentially, a vanilla 6800. But mine even let me unlock ALL the pipes. I did settle on just half though.
Interestingly, the 6600GT beat the 6800XT in lower resolutions. It wasn't until you went up to 1280x1024 and beyond that the 6800 pulled away. But once I'd unlocked it, totally different tier of card. For just £25 more than a 6600GT!
Saved a fortune really! At first I was reluctant to ask my Dad for the extra £25, things were tight back then, and I'd saved forever for a 6600GT (to replace my Geforce 4 Ti 4800SE). When I saw that insane offer on the 6800XT though, I had to visit the royal bank of Dad!
I'm trying to remember, and I think the 6600GT was £145, and they had this 6800XT in stock for £169. At the same time, top-end cards like the 6800GT were listed for £230+. Well out of my price range! But I got it in the end! Thanks to RivaTuner and my Dad! :D
@technooby220 Well, I originally bought a Geforce 4 Ti 4800SE because I'd been given a Dell Optiplex mid tower, with a Pentium 4 2.0GHz, at a LAN party one time. It replaced my aging Duron 700, but it lacked a GPU. See, the LAN party was inside a dry cleaner's business, and it was the office machine that had just been replaced. Everyone at the LAN was angry at how slow my PC was to load into games, so the guy running the LAN just gave that P4 2.0GHz to me. I was pretty much ONLY playing Quake III back then - I was a dueller, one of the best in the UK. But that's ALL I played. The Geforce 4 Ti served me so well, for years. Allowed me to play Q3 at 1024x768 @ 120Hz, 120fps. All I needed.
Then Doom 3 came out, and Half Life 2. I already knew Far Cry ran terrible on my PC, and I was starting to play around with other games besides Quake III. I remember doing a huge skirmish on Dawn Of War which had serious performance issues, and then trying to load Doom 3 and messing with the console settings to *try* to make it playable, but eventually I got fed up. So I went to eBuyer (the UK's biggest tech retailer back then - possibly still is?), and searched around. I checked dozens of reviews too - obviously - benchmarks etc.
Eventually settled on a 6600GT; that card easily offered the best value at the time, but was strong enough to play Doom 3. My motherboard only had AGP, so it had to be the AGP version, which was more expensive. THEN I saw that 6800XT, for just a few quid more. It made no sense: all the other 6800 cards cost a lot more, but this one was cheaper even than the more expensive 6600 GTs. So I read about them, figured out there was a possibility it could be unlocked (without being unlocked, it really wasn't that much better than a 6600GT), and I took the chance. It must have been around the spring of 2005, because I'd just bought "Escape From Butcher Bay" and that was the first game I tried with the 6800XT.
My friend gave me his old P4 3GHz CPU too, around the time I bought the card. Or maybe he charged me like £30 for it? Some stupidly low price, perhaps.
Great times back then. The jump from the GF 4 Ti to the 6800 was INSANE. Tech moved so fast! Staying with Quake III so long, I'd fallen behind the times! That new card was jaw dropping to me.
I also had a Geforce 2 MX, in that Duron. Originally it had onboard Savage 3000 graphics, *just* enough to run Q3 (barely) in 512x384. The Geforce 2 MX worked wonders when I upgraded, and it only cost me £80. I played on a friend's machine that had a Geforce 3 (he had rich parents) and always wanted one. But that GF2 MX kept me going forever! Up until the GF4 Ti 4800SE.
Great comparison! I love card battles and the results make sense. When you're a regular gamer you're not choosing your game based on the use of VRAM or shaders or anything, you're just looking at the game screenshots and things related to the content in that game. Anyway, I'm glad you're back.
These were the cards I always wanted as a kid but could never afford. Instead I was blessed with the affordable/infamous FX 5200.😂😂 Although it was the 128 MB version, and it ran Call of Duty significantly better than the X800... OK, the resolution was just 1024x768, but my frame rates weren't that much lower (around 60-65 fps, though AA wasn't set to 4x for sure)... luck, I guess.😅
I had it even worse some fifteen years ago. My first discrete GPU was a 7300 LE, which had the so-called TurboCache feature: a scant amount of built-in video RAM, with the rest taken from system RAM as needed. ATI also had their own take on it, called HyperMemory.
I always thought the TurboCache and Hyper Memory cards might make a good Halloween episode lol
The flipside of the 6600GT, where the AGP card had slower memory.
Yes...I'll never understand why that is
NV41/NV42 have the same quad-cluster render pipeline design as the NV40 -> 3 clusters = 12 TMUs & 12 ROPs.
They have decoupled ROP units, similar to NV43. Unfortunately TPU (and by extension GPU-Z) has it wrong, but fillrate tests accurately show 8 ROPs.
@@PixelPipes Sorry, but I think you're wrong on that. The results of synthetic fillrate tests obviously don't have to equal theoretical maximums. NV43's ROPs weren't decoupled; it was a different setup entirely: 2x2 ALUs / 4 TMUs / 2 ROPs per quad. The ROPs first moved outside the pipeline, with a crossbar interconnect, in NV47.
Edit: A 6800GS wouldn't run almost equally to a 6800GT if NV42 didn't have at least 12 raster units ;)
@ShadowsBehindU ROPs were situated the same for NV41/42, outside the pipeline. You can also see 8 ROPs in die shots of the NV42 posted by Martijn Boer. Evidence trumps assumptions on your part.
@@PixelPipes I know of Martijn Boer's die shots and I count twelve :)
Unfortunately I don't have a block diagram for 41/42 specifically.
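For anyone following this thread, the fillrate-test argument comes down to simple division: in a pure single-texture color fill, pixels retired per clock should approximate the ROP count. A sketch of the method with hypothetical numbers (the ~425 MHz figure is the PCIe 6800 GS reference clock as I recall it; the fill readings are made up purely to illustrate, not real measurements):

```python
def estimate_rops(measured_fill_mpix, core_mhz):
    """Pixels retired per clock in a pure color-fill test ~= ROP count."""
    return measured_fill_mpix / core_mhz

# Hypothetical readings, NOT real measurements:
print(estimate_rops(3400, 425))   # 8.0  -> what an 8-ROP result looks like
print(estimate_rops(5100, 425))   # 12.0 -> what a 12-ROP result would look like
```

Whether a given NV42 board lands nearer 8 or 12 in such a test is exactly what the two sides above are disputing.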
Saw the thumbnail.. PERSONAL INTEREST! lol... have an X700 AGP in a drawer.
For better or worse, I still have an X1650 Pro AGP lying around somewhere. It needs its fan repaired/replaced, but it does still work. I also need to figure out some sort of solution for the bridge chip, because it gets extremely hot and I don't want to kill the card.
@@AliceC993 I also had the X1950 Pro... so that's when I finally relinquished my hold on AGP and got the PCI Express version. In any case, good luck!
@@AliceC993 If the bridge chip is exposed, you can either 1) put a thermal pad on it so it touches the heatsink, or 2) put a small, thin RAM sink on it.
@@amdintelxsniperx It is, it's on the back of the card I believe. Not entirely sure as I haven't messed around with it in a few years.
Edit: Just found the card; it's an HIS 512 MB model, the bridge chip is indeed on the back. There's a thermal pad already on it but I'm not sure how much good that will do, so I think I will try to find a heatsink for it.
The X700 Pro is the one card I've never been able to find or acquire in ALL these years. I have around 6-7 X800-based cards, but not a single X700.
I've never seen one of those at the dump. Also, none of the X800 AGP cards I ever found worked.
I have a few GeForce 6800s on both AGP and PCIe and I'm pretty happy with all of them. Good enough performance to play most XP games from 2001-2006. It's a really nice card, plus most of them are single slot which is a nice bonus. If I ended up with one back in the day I don't think I would've been disappointed at all.
1600x1200 with 4x AA is going to be about 114MB just for the frame buffer, so I'm not surprised it caused the ATI card to dump on CoD, as it also uses larger textures than earlier Q3-engine games. The results at 1024 are interesting to me. It would be interesting to see this without AA to free up even more VRAM, but I suspect the nVidia card would still poke ahead. CoD seemed to favor nVidia cards a small bit.
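For anyone who wants to check that arithmetic, here's a rough estimate assuming 32-bit color, 32-bit depth/stencil, and double-buffered multisampled surfaces (real drivers add padding, and the exact total depends on the buffering scheme, so this is only a ballpark):

```python
# Rough frame buffer footprint at 1600x1200 with 4x MSAA.
# Assumes 32-bit (4-byte) color and depth/stencil; drivers add padding.
w, h, samples, bpp = 1600, 1200, 4, 4

msaa_color = w * h * samples * bpp   # one multisampled color buffer
msaa_depth = w * h * samples * bpp   # multisampled depth/stencil buffer
resolve    = w * h * bpp             # resolved front buffer

total = 2 * msaa_color + msaa_depth + resolve   # double-buffered color
print(f"{total / 1e6:.1f} MB")   # ~99.8 MB; triple buffering pushes it to ~130 MB
```

Either way it's the same ballpark as the ~114MB figure above, and if the card in question is a 128MB model, that leaves almost nothing for textures.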
Yeah, I don't know why the framerate was lower at 1024x768. Without AA I'd expect both cards to max out the engine at 90fps (though at this point I've already been surprised). I know ATI had a reputation for worse OpenGL performance, particularly in idTech3, but it was still peculiar.
@@PixelPipes Driver overhead may also still be a slight culprit, and perhaps the significant differences in how the cards handle AA may play a role. It was sometimes the practice for reviewers to run a baseline without AA and AF to remove some driver variables, as the two vendors took vastly different approaches to each. Though it often didn't really mean a whole lot.. lol.
To the people who say 8 GB of VRAM is fine in 2023: take a look at this. Low VRAM was always a problem.
It really depends on what settings and resolution you're going for.
Also what games you primarily play and when you buy every game that comes out.
4K is just pushing way too much VRAM on PC.
@@cyphaborg6598 It's not going to be an issue just at 4K. It's affecting 1440p performance already, and in some cases 1080p ray tracing is becoming a problem too.
The same thing happened with those 2 GB GPUs back in the early PS4-era days. People said 2 GB was fine for 1080p, but within a few years those cards couldn't even match the PS4's texture quality.
You say it depends on settings and resolution. Well, obviously. But what should our standard be? I believe the consoles' graphical settings and resolution should be our bare minimum, and these 8 GB GPUs won't be able to match the current-gen consoles' texture quality and resolution.
8 GB was simply inexcusable on these recent GPUs. There is no defending this planned obsolescence.
I had to unsubscribe from Adrian's Digital Basement Channel because he stopped doing shorter videos. I just prefer shorter videos because of time.
GF 6800: "Time for some Win98 gaming!?"
R X800: "Nope."
There are official, though beta, Win98 drivers in the respective Catalyst versions for that card generation.
I like this format.
:D
I used to have an X850 GT, I think. The card was alright, although the lack of SM3.0 kinda ruined it for me, since I couldn't play the latest games that came out later. Still, it played the X3 game very well. At least that's how I remember it; it's hard to tell, since "playing well" meant anywhere between 30 and 60 FPS.
Another one!!!! Woot!!!!
For my build, the reason I picked the 6800GT was Shader Model 3 support while simultaneously having excellent DOS support. Windows 98 leaves a bit to be desired, but I plug that hole with a Voodoo 2.
The ATI cards I know have less DOS compatibility, and the ones with 98 support lack the newer shader model. So for my as-many-eras-as-possible build, the 6800GT made it a DOS PC that can run Crysis.
"DOS" support with a 6800GT? Are you mental? :D
Day of the Tentacle on a Geforce 6800GT :DDDDD
@@TheVanillatech I am more mental than that. I played Sopwith on it too :P. But the theme of the build is to be slow enough to play 80s games and modern enough to play the 2005 era, up to Crysis 1. This GPU is perfect for that :D
ATI drivers from the era had notoriously poor driver support.
@@Henk717 Without software, you can't play a HUGE number of 80's and early 90's DOS games on anything except a 486 or lower. And that software isn't reliable. I always find DOSBOX beats any Pentium build, barring a handful of motherboards that have a ton of customizable DIP switches, when it comes to pure DOS builds.
I have a K6-III+ machine for DOS games; it runs Carmageddon, Quake, Tomb Raider, etc. perfectly, and also lets me play earlier games by disabling cache and running at 75MHz, giving me roughly 386DX speeds. Because I never managed to build a decent 486 machine, and I don't have space! XD
I only have 3 retro rigs, thanks to DOSBOX Staging.
@@harryshuman9637 Apparently, drivers have bad driver support....
The stupidity of Nvidia fans knows no bounds! XD
Oh wow you're alive
Looks like you have a pretty nice audio setup. What equipment do you use?
Yeah, I've been a Head-Fi member since '05. Right now I primarily use a Bifrost 2 & Lyr+ with my Arya Stealth/ZMF Auteur/HEDDphone, but I also have a couple of vintage pairs.
@@PixelPipes Thanks for your answer!
Interesting stuff, but if I _may_ chime in, since it's going to be a fair bit. For point of reference, my Win98 PC uses:
A64 3700+
512MB DDR400
MSI K8N Neo2 Platinum motherboard
SB Audigy 2 (platinum, according to drivers, lol)
1280x1024 monitor
Through earlier efforts, when they were more affordable, I ended up with an X800 XT PE (entirely because it was mislabeled in the listing), an X850 XT (which confirms they both perform identically), and a GeForce 6800GT, since the 6800 XT I had on hand refused to work with the final 98SE drivers from nvidia. Moving on...
In 3DMark 01's default benchmark runs, the 6800GT is, similarly, about 1000 points faster on the same build.
In Aquamark3, which _I think_ is a more OpenGL-suited benchmark program, the results were much more dramatic, in favor of nvidia. When I still had my 6600GT, I was getting higher scores from it than I did from the X850XT Platinum.
In Quake 3 Arena, with max in game settings and disabling the framerate cap (com_maxfps = 0), I'd see about 400fps with my remaining Radeon cards, but more like 600 with the geforce 6800GT. It's so obscene I'm literally having to enable the cap because the tearing becomes that distracting.
Though one intangible in favor of the Radeon is that the install size and the resources available on a Win9x environment is simply a bit leaner with the Radeon cards. The downer at present is that selecting _reboot to MS-DOS mode_ just returns a blank screen for me with the ATI cards; no such issues with the nvidia card(s). If I had the room to have a _second_ 98SE pc like this one, I'd drop in the geforce card in a heartbeat.
Interesting. I wonder why the X850XT is performing so low?
@@PixelPipes Wish I knew, but since I can't get MSI Afterburner to work on Windows 98, I really don't have a way to know. I was wondering how you handle benchmarking in that OS, to be able to offer more insight?
I can also say that on 98SE, the GeForce 6 cards all had support for temperature monitoring. Of my remaining Radeon cards, only the X800 XTP gets any temperature monitoring support at all in the drivers.
None of these cards are really ideally used for Windows 98. The nVIDIA stuff tends to fare better. These really do their best in Windows XP. @@ZeroHourProductions407
These were the kings when I got into gaming.... tried an X800 XL in my brother's computer but absolutely hated the driver and software experience. Ended up with an AGP 8x 6800 XL on my Sempron system, liked the experience much, much better, and I've been buying Nvidia cards since....
I have 3 beautiful cards: an ATI X800 XL and X850, and a GeForce 6800 GS called Nvidia Quadro. Is the ATI X850 the best card?
It depends on what variant of X850. The X800 XL is a 16-pipe full card. The X850 could be a variety of things: if it's a Pro, it's 12-pipe; if it's an XT or XT PE, it's 16 pipes. The XL and Pro will trade blows - the Pro has higher clocks and will almost always be ahead in older fixed-function games, but the XL has a more powerful GPU, so it will outperform the Pro in many shader-heavy games. The XT is faster than both of them.
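To put rough numbers on that trade-off, theoretical pixel fillrate is pipes times core clock (the clock figures below are approximate reference specs from memory, so treat them as illustrative):

```python
# Theoretical pixel fillrate = pipes x core clock (approximate reference clocks).
cards = {
    "X800 XL  (16 pipes @ ~400 MHz)": 16 * 400,
    "X850 Pro (12 pipes @ ~507 MHz)": 12 * 507,
    "X850 XT  (16 pipes @ ~520 MHz)": 16 * 520,
}
for name, mpix in cards.items():
    print(f"{name}: {mpix} Mpix/s")
# XL ~6400 vs Pro ~6084: only about 5% apart, hence the trading of blows.
# The XT's extra pipes AND clock put it clearly ahead at ~8320.
```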
@@classic_jam I have ATI X850 XT
It is definitely the fastest then. Only the X850XT Platinum Edition and X800XT Platinum Edition are faster and the difference isn't significant. Even the 6800Ultra Extreme doesn't stand a chance. @@ricardobarros1090
What are your favourite cards - be it for performance, looks, weirdness, overclocking/unlocking ability, or whatever you like - for Win9x, XP, and Win 7?
The GF6 cards rank highly. I do like the weirdness of NV41/42 specifically. But V5 5500, Voodoo 1, 9700 Pro, Kyro 2, and Rendition V1000(E/L-P) are also among my favorites. Picking one is impossible and my answer can change week to week lol
@@PixelPipes thanks for replying! And you know, that's the real joy of retro - not necessarily having the rarest tech but being able to explore the history of what top end gaming would have been like and how buying choices played out.
The late era of "Pixel Pipes" GPUs was really interesting.
Audio is a bit peaky at times, but otherwise great video =^-^=
Yeah the audio in this is garbage, sorry about that. It will be better next time
Hello Nathan! See you next live stream at Rik's Random Retro
Yessir! Good to see you Bruce!
I remember my 6600 GT being quicker than an overclocked 6800 and just as quick as a stock 6800 GT at resolutions up to 1280x1024. Didn't need higher on a 21in CRT. On a Sempron overclocked to 2.7GHz, it beat rigs costing twice as much... and they laughed at my £35 1.7GHz Sempron over their Prescotts and high-clocked Athlon 3000s. They weren't laughing when a stable 2.7GHz overclock left them in the dust, though. Best bang-for-buck rig I ever built.
Best thing about the 6800 generation was that you could unlock the NV40 chips on AGP cards via software. That way you could turn a 6800LE into a 6800GT and almost double the performance for free. Nvidia never made that mistake again ;)
My X800 Pro was 256MB?
Ruby on cooler, yeah!
I had a 256MB X850 GTO (AGP version).
I've never seen a 128MB X800. I had one when I was a kid, and it was a 256MB VIVO that I modded into an X850.
Sorry, but the background music is irritating and doesn't let me concentrate on the speech.
for me it is fine
Nvidia cards are terrible under Windows 98; that's why I chose ATi.
18:55: "Halo historically favours ATi architectures"?????
Excuse me, but that's... how can I put it... *technically impossible*!!!
The original Halo was produced as a *launch title* for showcasing/promoting the original Xbox over the PlayStation 2. *The original Xbox was based on a GeForce 3 architecture*, so really, what you say is simply technically impossible *by default*.
It can accidentally be the case.
Halo was originally designed for Mac, which came with ATI GPUs by default. Bungie didn't completely rebuild their engine from the ground up when it shifted to Xbox, so it naturally still favours ATI's architecture.
Yeah I don't think it was intentional, but the PS2.0 codepath that they wrote for the PC version just runs exceptionally well on ATI's R3XX-R5XX architectures
Not really technically impossible, depending on how well they optimized the game for ATI cards on PC, as those were very popular cards during that time period.
Bump
mapping
The 6800 was a much better card back then, all versions. Idk which drivers were used here.
Not really. If you go back and look at old reviews, as long as the number of pipes are the same, the X800 series generally won. X800 Pro with 12 pipes did lose to the 6800GT with its 16 pipes, though