Hello Phil, maybe you can use the 1T command rate timing for the AMD. I have the same mobo, and you can adjust this through the BIOS; 2T hampers AMD performance even into the Athlon XP era. The Medion cards use 500 MHz DDR memory, please check if you can get yours to 500 MHz. I can't get mine there, although the manufacturer rates it for 500 MHz. Is this strange or not? Any ideas? The card also runs very hot with the passive cooling.
I know, I had 1T initially, you can see it in the build video. But then I decided to just use BIOS defaults. Sometimes the machine resets CMOS, so I just load the defaults and off I go. I got to have it consistent so that's why :)
Let me know if the Ti 500 OC works for you. The core should be "easy" with some active cooling; the RAM was always the hard one. Mine does not work, but the card has some bad caps on it, maybe that is the reason. I will replace these and let you know. In general, this Medion Ti 200 is an interesting case. Keep up the good work with the videos :)
Oh fuck. Forget it. I have thought you are British. Dat accent. No. Just forget it. There is no way it would be reasonable to ship dat thing to Australia.
Most likely Germany or a nearby country, since Aldi is a mostly European chain, and when that PC was on the market (2001) it was most likely sold in Germany/Denmark/Austria.
Phil, go and learn about EEPROMs (the I2C and SPI ones), get an EEPROM programmer and read the EEPROM from the GPU. Maybe the PCI device ID is stored in the EEPROM and you can change it to try to turn the card into a Ti 500 :D
There is a chip on the card, and through resistors voltages can be changed, which will set the device ID. This is documented with some cards, but a lot of that information is gone. I was hoping someone would know and post some information, but it seems to be super rare knowledge.
etutorials.org/Misc/pc+hardware+tuning+acceleration/Chapter+14+Hardware+Acceleration+of+Video+Adapters/Modifying+and+Overclocking+GeForce3+Ti500+Ti200/ On the buck converter you can modify the feedback to increase the voltage. Holy shit, this is so simple: you modify the feedback divider (reference voltage 1.25 V), which makes the regulator run a slightly higher duty cycle and gives the card a higher voltage. But I don't have the card on hand to see how to mod the PCI ID.
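For anyone curious about the divider maths behind that volt-mod, here is a rough Python sketch. It assumes a standard regulator that servos its feedback pin to 1.25 V; the resistor values are made up for illustration and are not taken from the actual card.

```python
# Feedback-divider maths for the buck converter volt-mod described
# above. Assumes a regulator that holds its feedback pin at 1.25 V;
# the resistor values below are hypothetical, not from the card.

V_REF = 1.25  # feedback reference voltage in volts

def output_voltage(r_top: float, r_bottom: float) -> float:
    """Vout = Vref * (1 + Rtop / Rbottom) for a standard divider."""
    return V_REF * (1 + r_top / r_bottom)

# Stock (made-up) divider:
print(output_voltage(200, 1000))  # 1.5 V

# Shrinking the bottom leg (e.g. soldering a resistor in parallel
# across it) raises the output voltage:
print(output_voltage(200, 680))   # about 1.62 V
```

Paralleling a resistor across the bottom leg is the usual way to do this without desoldering anything, which is presumably why the article calls it simple.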
+BeastGamer It entirely depends on what you're trying to do.

An FX 5700 or FX 5900 would be significantly faster and have almost the same compatibility; you can crank up the anisotropic filtering and it will look a lot better. Unless you have a fast CPU like a Pentium 4, Athlon XP or Athlon 64, it's going to be CPU bound and you're not going to benefit much from its speed. Is that what you want?

A 3dfx Voodoo 1 will run ancient 3D accelerated DOS games that are very difficult to run on more recent hardware; running anything more modern than Half-Life will chug badly. It's "authentic", what these games would have really felt like back in the day, ca. 1996-1998. Is that what you want?

A 3dfx Voodoo 5 is the fastest Voodoo card you can relatively easily get your hands on, and it has a cult following like the Harley-Davidson motorcycle. When the card was released most people kind of thought it was a piece of shit, just like the Harley. Good Glide compatibility, about as fast as a GeForce 256, quite expensive.

The GeForce 3 is the first DirectX 8 card from nVidia. It's quite fast for Windows 98 gaming; AA is somewhat usable, whereas SSAA really wasn't, even in quite ancient games. Anisotropic filtering is done correctly without optimization (i.e. if you want 16X, you're going to get 16X even where you don't need it), but that makes it slow to the point of being unusable (maybe 2-4X is OK). Is this what you want?

The GeForce4 Ti 4200 is probably cheaper and a little bit faster, but otherwise the same thing as a GeForce 3. The GeForce4 MX440 or GeForce FX 5200 is cheaper than dirt and plenty powerful for anything made before 2000. Except for AA and AF there's no need for anything faster if you're into late-90s gaming. These are available fanless.
I have the original release from 2004, which should be running on DirectX 9? I only get 8500 points in 3DMark 2000 (no matter what resolution and level of detail I set). His GF2 GTS is faster than mine ^^
Nvidia was almost always second best in terms of raw performance, except for the 8000 series GPUs if I recall correctly. They just had better drivers and a better PR machine. Sucks that the world works like this.
I got a Pentium 3 866 MHz, an ASUS CUV4X-C, 256 MB RAM, and a GeForce3 Ti 200. The OS is Win 98 SE. I can not run it in AGP 4x; the driver says only 2x. Nothing helped, neither the latest chipset (VIA VT82C694X) drivers nor the latest Nvidia drivers. The latest Nvidia drivers only caused problems, so I installed 56.64. The BIOS is on the latest non-beta version. Does somebody know about this problem? I know there is no difference between 4x and 2x and that the CPU will hold my GPU back anyway, but I can not sleep anymore. Btw, nice video. Greetings from Germany :D
Yes, this is an issue with the VIA chipset. It isn't stable at 4x, so to protect the system, the NV driver will switch to 2x if it detects a VIA chipset. You can switch to a Radeon card and it will run at 4x, but you might see crashes. If your BIOS has an AGP drive strength option, you can play around with that and get the Radeon stable. But yeah, on a Pentium III system I recommend an Intel 815 chipset board.
Electromyne's prices are extortionate! €60 for a GeForce 9800 GT, which is £54 to me... £54 for a 9800 GT that's available on eBay for around £15-£20!
For mobile users:
00:00 Intro
00:29 Sponsor
00:39 Look at the card
01:10 Specifications of the GeForce3 Ti 200
01:52 Mafia (2002)
02:46 D3D 8 in 3DMark 2001
03:21 Half-Life 2 D3D 7 vs D3D 8
04:55 Shadow Buffers in Splinter Cell
06:12 Chameleon Tech Demo
06:59 Zoltar Tech Demo
07:29 Lightspeed Memory Architecture
08:22 Doom 3 (2004)
08:22 Multisample anti-aliasing
09:19 High Definition Video Processor
09:30 Drivers from Nvidia website
09:58 Look at the drivers
10:18 Overclocking
10:51 The test system
11:20 Benchmark results and graphs
14:38 Power draw
15:29 Checking eBay for prices and availability
16:48 GTA Vice City
16:48 Summary
Saxie81 thanks
My first "propper" GPU after 3DFX. Absolute killer card for the time.
I had one at the time. Shaders were amazing in those years, unlike RTX technologies today.
Always saw the GF3 as not "for the masses". Even in its cheaper Ti 200 variant, it always had that "premium" feel to it, and by the time it actually got cheaper, better cards like the GF4 Ti 4200 came out which were much more "people's cards". An interesting fact is that the original Xbox used a modified GF3 with some more advanced functions.
If you purchased an average PC around 2001 to 2002, chances were high that a Compaq or HP with a P4 2.0 GHz on an SDR chipset would have the Nvidia GeForce3 Ti 200. Which was a huge thing, and technically for the masses.
I remember that when this card was released, I read in a PC magazine about this revolutionary programmable chip, able to run your own coded "effects". They went very far with the programmability marketing. At the time I was just moving from an ATI Rage 128 to a GeForce 2 MX, so I was blown away by the GF3. I never knew if this programming engine was a unique GF3 environment or if it was the current PS/VS programming model, these days executed on unified shaders.
Yeah, I have exactly the same Medion card here in my 1 GHz Slot 1 Coppermine FSB100 SL4KL system. :)
Great review.
It actually amazes me how a mid-range GPU with passive cooling not only runs just fine after 16 years but is also quite overclockable.
IIRC, I had a DirectX 8 card, maybe a GeForce4 Ti, and later upgraded to a DX 9 card, possibly the 6800 or a higher-model FX 5000 series. One of the first things I noticed in Half-Life 2 was the magnifying glass on the doctor's desk near the beginning: in DX8 it just looked more like a piece of regular glass, but in DX9 it looked like a real magnifying glass. Might be something to check out in your tests.
Splinter Cell is one of the few games I have ever actually completed start to finish. It was way ahead of its time. The Nvidia tech demos really lend to the idea, which I have seen talked about before, that the GF3 was never used to its full potential in games. I certainly feel this way about many games of the "naughties" when you look at the quality 3DMark has in their demos compared to the games. For example, the leaf texture and detail in 2001: it took years after that demo for games to have that kind of detail for trees.
I also find it funny how the GF3 cards are all pretty much right in between the ATI 7500 and 8500. Nvidia was, and perhaps still is, known to "pad" their drivers for 3DMark performance in ways that didn't translate to games, but still, if that did in fact happen, they couldn't pad their drivers enough to keep up with ATI's 8500 :)
Last note: I personally would have thought this card came out of a Sony VAIO, since those were the most dominant computers at this point in time to find with front AV ports to plug in your camera.
ati fanboy detected, amd peasant confirmed.
Back in the day, I got the Ti 200. I may have overclocked it. Either way, it was my 2nd AGP card and it never disappointed until it slowly became irrelevant. I used it to play Doom 3 at 640x480 on a 72" projector screen. That combo provided a great experience at max detail... The CPU was a Thunderbird 1.0 GHz OCed to 1.4 GHz (using the pencil trick) on an ASUS A7V motherboard.
This was my first 'proper' gfx card. I paired it with an Athlon 1 GHz and IIRC 128 MB of RAM. It was a beast; it ran Medal of Honor: Allied Assault, GTA 3 etc. very well at high res for the time. Loved it. Just my power supply wasn't great and it could be unstable, and the aftermarket wasn't great for case upgrades back then, so I just dealt with the crashes.
Cool! Your channel delivers exactly the content I'm looking for. Lot of nostalgic value. Subscribed :)
I know your channel is primarily PC focused, but the Xbox truly was an efficient and cheap console for the PC hardware it was based on. A comparison of an equivalent PC versus the Xbox would be interesting. Looking back, the Xbox really was a great value: a GeForce 3 gaming PC for only $300.
I could build an XBox PC? With the same specifications?
Hi Phil,
The Xbox used a 733 MHz Pentium III, a GeForce 3 GPU, 64 MB RAM, an NT kernel from Windows 2000, an 8-10 GB hard drive, a DVD-ROM drive, native 5.1 Dolby Digital Live/surround encoding, and 480p, 720p and 1080i support over component video.
A PC with the same hardware would be no match in performance, but it would be interesting to see what hardware it would take to be on par with the OG Xbox.
Full technical specs here:
en.wikipedia.org/wiki/Xbox_technical_specifications
Nice. Just got to find a few games that got releases for both and this could be a fun project.
I'd say the closest match to the Xbox's GPU would actually be a GeForce4 Ti (any) underclocked to 233/200 (core/memory).
You will need at least a 1 GHz CPU to get to Xbox performance level, or preferably 1.5 GHz. The Windows kernel on the Xbox was designed with game performance in mind, and the DirectX implementation was fully integrated with the GPU.
On consoles in general it takes way less CPU cycles to process graphics data to GPU.
It's the case even with the upcoming Xbox One X. Microsoft did a full hardware implementation of DirectX 12, so it's possible that the Scorpio core could outperform the fastest Intel Core CPUs in terms of DirectX API performance.
Thanks a lot ,Prof
Very nice and detailed as always! Love the timecodes :)
Came here because I was wondering what that yellow connector was for, so thanks for explaining!
The card I got in order to play Max Payne. One of my favorite graphics cards ever, behind my Voodoo3 and my current GTX1080Ti
Nice video. Knew the original xbox had a Nvidia GPU but didn't know what series. Learned something new today. Thank you!
Quits playing to Watch Phil
I never played Splinter Cell on PC. I was blown away by the graphics of the game when I played it on the Xbox. I had never seen lighting and shadows like that before.
Oh and the story sucked me in too. Mostly because of what was happening in the news at that time
It is indeed a great game :)
Which news are you referring to that connect to the Splinter Cell story?
You just wrote a whole load of bullshit and I absolutely don't know why! :-(
armorgeddon sry took you as a troll. It's all love here in the computer lab. Could be armorgedin name though.?
Well, one thing still about the Radeon cards of the DX8/8.1 era is the fact that they cannot do trilinear texture filtering on top of anisotropic texture filtering. In that case the GF3 has an advantage, so its filtering is much better than the 8500/9000/92x0.
Did not know that, so thank you for pointing this out. Will come in handy when I review the 8500 :)
PhilsComputerLab Anytime, mate! I've read a lot of reviews of it, back then and now. The AF overall was comparable to the GF3's higher levels, but the mipmapping, when colourized (yes, a Yank like me sometimes prefers that spelling), shows the mipmap transitions are not smooth gradients on the Radeon.
AF was hardly usable before angle-based optimizations (you don't need 16X AF on triangles that are facing you and on triangles that are just a little angled you need less than the full 16X). The slight reduction in quality came from not correctly categorizing triangles that were off-axis (i.e. not a wall, floor or ceiling, but e.g. a 30 degree slope viewed from the side); in those circumstances you got less filtering than you asked for (that's what those "lobes" in the coloured mipmaps on the inside of that tube are about).
It took several generations after this until both nVidia and AMD did these optimizations correctly. For a while Nvidia got into hot water for "brilinear" filtering: that's when you use bilinear filtering near the sweet spot and compress the trilinear filtering into a small interval where you need to transition from one mipmap to another.
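That brilinear trick can be sketched in a few lines of Python. This is a toy model of the blend weight only, and the transition band width is an assumed tunable, not a real driver value:

```python
# Toy model of trilinear vs "brilinear" mipmap blending. 'frac' is
# the fractional LOD between two adjacent mip levels (0..1). The
# transition band width is an assumed parameter for illustration.

def trilinear_weight(frac: float) -> float:
    """Full trilinear: blend linearly across the whole mip interval."""
    return frac

def brilinear_weight(frac: float, band: float = 0.25) -> float:
    """Brilinear: plain bilinear (weight 0 or 1) near the mip centres,
    with the linear blend compressed into a narrow band around 0.5."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0  # only the finer mip is sampled (cheap)
    if frac >= hi:
        return 1.0  # only the coarser mip is sampled (cheap)
    return (frac - lo) / (hi - lo)  # short linear transition

print(brilinear_weight(0.2))  # 0.0: pure bilinear region
print(brilinear_weight(0.5))  # 0.5: middle of the short transition
print(brilinear_weight(0.9))  # 1.0: pure bilinear region again
```

Outside the narrow band only one mip level is fetched per sample, which is where the speed comes from; the visible cost is a harder transition between mip levels.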
I think I remember that from Maximum PC, Tom's Hardware, [H]ardOCP, and a few other sites back then.
I wonder if later drivers changed any of this? I couldn't find articles re-visiting these cards. Once the 9700 was out, the 8500 was forgotten.
Always wanted the Ti500. Programmable pixel and vertex shaders were a game changer. I remember seeing the 3dmark 2001 and GeForce 3 demos and being blown away.
It wasn't the slam dunk nVidia hoped for, as the Xbox variant of the chip had many production issues, which is why even to this day you don't see NV chips in Xbox consoles.
It was a ground breaking architecture, though, no doubt about it.
And wow, those Radeon results are insane now. The picture was so different back then.
Wow, I pulled exactly that Medion PC from the scrap metal box at work. And was just looking for a review of that card :D I hope everything is still working.
One of my favorite 3D cards. I played Freelancer to the point of obsession back in the day, and this card made it sing along with the P3 1.0 GHz I had it paired with. Still have it, although I think it's shot, as I get nothing but artifacts when I boot up with it. Excellent review!
Immediately or after a while? Thermal paste can be all dried up, and then you'll get artifacts after a little while. If it's noise and clutter from the start, probably one of the memory chips is damaged.
soylentgreenb I will have to install it and observe again, but I think it was right away. I also have a Voodoo3 2000 that gives me lines the instant I start it up. I even tried the oven method (nothing to lose, ha) and it's still bad. But that is a good suggestion on the thermal paste. I will have to check all my old graphics cards now just as a precaution! Thanks!
This was my first GeForce card after the Voodoo5. It had significant improvements in image and video output quality, a lot better than the GeForce 2 MX, along with excellent driver compatibility with games. I have kept using nVidia video cards up to now.
Hi brother, I love your vids, you're very much into the stuff I was into. I actually came onboard with graphics cards SLIGHTLY late. I just missed the GeForce 256 launch. (My first computer was a Pentium III 600 MHz with onboard graphics.) The first actual video card I bought was a Riva TNT2 32 MB card. My friend had bought the 16 MB SDRAM Voodoo3 2000. Even though on paper my TNT2 whooped its ass, the Voodoo blew it away. Remember trying to play Deus Ex without Glide from the Voodoo? I broke off a capacitor and pretended it didn't work, and as I knew the guy who owned the shop, he gave me a Voodoo 2 instead haha.
My next big purchase was an AMD Athlon 1300 (I think it was the 1300) with the "AXIA" stepping. I found the L1 bridges and actually joined them with a pencil lead line, allowing the multiplier to be unlocked! I was able to overclock that chip from 1.3 GHz up to 1.6 GHz on air, which was INSANE back then: a 300 MHz overclock. AXIA was FAMOUS, one of the first true overclocking chips that got everyone addicted.
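The arithmetic behind that pencil trick is simple: an Athlon's core clock is FSB times multiplier, and rejoining the bridges lets you pick a higher multiplier. A trivial sketch, with illustrative values (the actual Athlon 1300's stock FSB/multiplier split may differ):

```python
# Socket A overclocking arithmetic: core clock = FSB x multiplier.
# The pencil trick unlocks the multiplier so it can be raised.
# FSB and multiplier values below are illustrative assumptions.

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier

print(core_clock_mhz(100, 13))  # 1300 MHz: stock 1.3 GHz Athlon
print(core_clock_mhz(100, 16))  # 1600 MHz: the 300 MHz overclock
```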
My next purchase was, I think, the Gainward nVidia GeForce 2 GTS. And then I bought an ELSA Gladiac GeForce 2 ULTRA. (Man, I can't find any pictures of it anymore and the card was sold with my old machine years ago.) But the card was THE most beautiful thing I've ever seen: the Gladiac Ultra had a dark blue PCB and a GLOW IN THE DARK GPU cooler and heatsinks... I can't find a picture of it anymore... damn man, it was beautiful. Check it out: ELSA (a German company) Gladiac GeForce 2 Ultra 64 MB DDR.
Man, I so remember those tech demos. The sultan one always creeped me out. I was using a Riva TNT2 in those days.
That feel when you realise that games like splinter cell and half life are considered retro nowadays. I'm old...
Join the club...
This was the first card I purchased. Had to get the family PC to run that GTA 3 :D
The Radeon cards had a dedicated T&L unit plus a vertex shader, and they were able to use both in 3DMark 2001 for transform and lighting, while the GeForce3 had only a vertex shader and used vertex shader code to emulate the T&L unit.
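As a rough illustration of what that emulation involves, here is a minimal pure-Python sketch of the per-vertex work fixed-function T&L performs (clip-space transform plus a simple N-dot-L diffuse term), which is the same work a GeForce3-style vertex program would otherwise replicate in shader code. The matrix and light direction are placeholders:

```python
# Minimal sketch of per-vertex fixed-function T&L: transform the
# position by a model-view-projection matrix and compute a simple
# diffuse lighting term. All values here are illustrative.

def mat4_mul_vec4(m, v):
    """Multiply a 4x4 matrix (row-major list of rows) by a vec4."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def dot3(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def transform_and_light(position, normal, mvp, light_dir):
    """Transform a vertex to clip space and compute N-dot-L light."""
    clip_pos = mat4_mul_vec4(mvp, position + [1.0])
    intensity = max(0.0, dot3(normal, light_dir))  # clamp backfacing
    return clip_pos, intensity

# Identity "MVP" and a light shining straight along the normal:
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
clip, lit = transform_and_light([1.0, 2.0, 3.0], [0.0, 0.0, 1.0],
                                identity, [0.0, 0.0, 1.0])
print(clip)  # [1.0, 2.0, 3.0, 1.0]
print(lit)   # 1.0
```

A dedicated T&L unit does this in fixed hardware for free, while running it as a vertex program consumes shader throughput, which is presumably why the split mattered in 3DMark 2001's scores.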
Phil, I really hope that when it comes to reviewing the FX 5200 you get the 128-bit memory version, because nowadays I've noticed that every review or test out there uses the crippled 64-bit version, which is significantly slower than the basic stock one.
Good point.
Wow, another Medion GF 3 Ti 200.
Got one not long ago for my retro rig, they are surprisingly cheap even today.
Only thing I did to it was add a fan header to the two solder mounts on the PCB (halfway between the end of the AGP port and the PCI slot cover).
The fastest drivers for the GeForce 3 are actually from the 4x.xx series, especially 45.23.
Merry Xmas Phil. I tried the first Win98 driver for my GF3 Ti 200 (v8.05 I believe) and the lowest anti-aliasing available was 1x2. The games looked a little better without any noticeable difference in performance. I'm disappointed that this option was removed from the later drivers, especially for the GeForce4 and FX. 2x AA or higher always makes my games sluggish.
I wanted this so much in 2001, and I was so close to getting it...
I used to have one of the ti 200's. Handled unreal tournament 2004 well.
Lol, I remember when I was 8 and trying to get my PC to work with Minecraft. This is the first GPU I had in a PC, and it lasted me another 2-3 years before I got a newer PC. The card was brilliant; for the PC I had been using since 2007 it was great. I had the 128 MB Siluro variant, a very cool card with the black PCB.
I have a 128mb version.
I've just ordered this card from ebay! ^_^ Great video! :D I wonder if I would be able to play Oblivion with it :) :)
I actually have an old Gainward GeForce3 Ti 200 sat under my desk which came out of my dad's old PC. I've tried to get that thing to run dozens of times, but I think the card is either dead or bricked. I tried a number of different things to diagnose the fault/issue, but with no success.
Great card for Socket 370 Pentium III systems: a 1.0 GHz/133 FSB Coppermine or 1.4 GHz/133 FSB Tualatin.
I have three of these Medion cards here. Unfortunately, the electrolytic capacitors are slowly failing: some are bulging, others have already leaked. I will soon install one in the Athlon Thunderbird at 1.4 GHz, but first I have to replace the capacitors.
That's good, they are worth keeping and excellent for Windows 98, ME and early XP.
Last year I bought a Suma GeForce 3 Ti 200 for $3 USD, paired with a Duron + ECS K7VZA, but now the mobo has gone to PC heaven...
Hi Phil! Amazing video, as usual.
I also got my GeForce3 from Electromyne :-) It's the exact same card. Do you think I should put a fan on it?
My Pentium 4 PC is almost finished; I'm just waiting for the CPU fan and graphics card. It will be a natural for Win 98.
I think if you're overclocking a fan could help. But it's not needed for normal speed.
PhilsComputerLab okay thanks
Speaking of Medion, I sure hope you will have a chance to review the Radeon 9800 XXL card from Medion.
Those were the times when a simple small card with just a small passive heatsink could play any game easily
I had this one :) !
Hey @PhilsComputerLab what driver version do you recommend for this card under windows 98? thank you!
Is it true that the GeForce3's "pixel shaders" were pretty much register combiners?
Hey Phil, what OS do you recommend for a Pentium Pro 200 MHz? I'm planning to build a classic gaming rig with it and use a Sound Blaster Live! and an 8 MB ATI 3D card. I was thinking 98 or 2000.
I use 98 most of the time and it works great.
I have a rare Pentium II OverDrive 333 MHz, brand new and sealed in box. Don't know if I should open it for a build lol
Hmm I wouldn't :)
This was the first gpu I ever bought.
Wait, shaders weren't a thing back then?
In the original Splinter Cell you can use dgVoodoo2 to emulate GeForce 3 or GeForce 4 GPUs, and the game correctly displays all shadows.
Hello Phil, maybe you can use the 1T timing for the AMD; I have the same mobo and you can adjust this through the BIOS. 2T hampers AMD performance, even from the Athlon XP era. Medion cards use 500 MHz DDR memory, so please check if you can get these to 500 MHz. I can't get mine there although it is rated by the manufacturer at 500 MHz; is this strange or not? Any ideas? The card runs very hot with the passive cooling.
I know, I had 1T initially, you can see it in the build video. But then I decided to just use BIOS defaults. Sometimes the machine resets CMOS, so I just load the defaults and off I go. I got to have it consistent so that's why :)
I see, check the ram of the card if you can too. Thanks.
Man, you're right. 4 ns RAM. Bugger, that would have been something worth mentioning in the video, or attempting a Ti 500 OC.
Let me know if the Ti 500 OC works for you. The core should be "easy" with some active cooling; the RAM was always the hard one. Mine does not work, but the card has some bad caps on it, maybe that is the reason. I will replace these and let you know. In general the case of the Medion Ti 200 is interesting. Keep up the good work with the videos :)
great video
Where did you get HL2:CE? From XJR9000 channel?
Nah a friendly viewer wanted me to start testing HL2 and hooked me up.
Phil, do you have any interest in an AMD Sempron 2200+?
I'd give it away to you, you'll just have to pay for shipping
Oh fuck. Forget it. I thought you were British. Dat accent. No. Just forget it. There is no way it would be reasonable to ship dat thing to Australia.
All good, thanks for the offer. Electromyne have so many CPUs, I can ask them for a review unit if I need one.
Sondelschrottis, that accent sounds like a German one to me. And the ALDI screenshot was in German, too.
Phil, did you live in Germany?
austria
Most likely Germany or a nearby country, since Aldi is a mostly European shop, and when that PC was on the market (2001) it was most likely sold in Germany/Denmark/Austria.
OoO! I remember playing Return to Castle Wolfenstein at Christmas 2001 on a GF3 Ti 200! :)
That's strange, I used to get more FPS with my FX5200 in Half-Life 2, and the FX5200 is pretty much equal to the Ti 200.
Can you do a review on the 6800GS?
I don't think I have one.
How come Unreal Tournament 99 has framerate drops on a 933 MHz Pentium III with this graphics card?
This card can be flashed to work in a G3/G4 Mac.
Does anyone have this card? I have one that's missing capacitors, and I cant find any info online.
Phil, go and learn about EEPROMs (I2C and SPI ones), get an EEPROM programmer and read the EEPROM from the GPU. Maybe the PCI device ID is stored in the EEPROM and you can change it to try turning it into a Ti 500 :D
There is a chip on the card, and through resistors voltages can be changed, which will set the device ID. This is documented with some cards, but a lot of that information is gone. I was hoping someone would know and post some information, but it seems to be super rare knowledge.
So it's just a pull-up/pull-down configuration on GPIO. What is the chip? I can read the datasheet if it exists.
etutorials.org/Misc/pc+hardware+tuning+acceleration/Chapter+14+Hardware+Acceleration+of+Video+Adapters/Modifying+and+Overclocking+GeForce3+Ti500+Ti200/
On the buck converter you can modify the feedback to increase the voltage.
holy shit this is soo simple
You can modify the feedback divider (reference voltage 1.25 V) to push the duty cycle a bit higher and give it more voltage, but I don't have the card on hand to see how to mod the PCI ID.
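For anyone curious, the feedback-divider math behind this mod is simple; here is a small sketch. The 1.25 V reference matches the comment above, but the resistor values are purely hypothetical, not measured from this card:

```python
# Sketch of the buck converter voltage mod discussed above.
# The regulator servos its feedback pin to Vref (1.25 V here) through a
# resistor divider from the output, so Vout = Vref * (1 + R_top / R_bottom).
# Raising R_top, or lowering R_bottom (e.g. by paralleling an extra
# resistor across it), raises Vout. All resistor values are hypothetical.

VREF = 1.25  # volts, feedback reference of the regulator

def vout(r_top: float, r_bottom: float) -> float:
    """Output voltage set by a Vref feedback divider (ohms in, volts out)."""
    return VREF * (1 + r_top / r_bottom)

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

# Hypothetical stock divider: 1.0 kOhm over 2.0 kOhm -> 1.875 V core
stock_v = vout(1000, 2000)

# Soldering a 10 kOhm resistor across R_bottom lowers it to ~1.67 kOhm,
# bumping the core rail to 2.0 V
modded_v = vout(1000, parallel(2000, 10000))

print(f"stock: {stock_v:.3f} V, modded: {modded_v:.3f} V")
```

Small steps are the safe approach here: a big parallel resistor across the bottom leg gives a small, controlled voltage bump, which is why that is the usual way these volt mods are done.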
Sounds like this is more your thing. You should do it, seems this interests you.
Where can I find the card?
Is this recommended for Retro PC phil?
I got one for a 933 MHz Pentium III, it's quite nice. It even performs flawlessly in Windows 98 SE DOS mode.
+BeastGamer It entirely depends on what you're trying to do.
An FX 5700 or FX 5900 would be significantly faster with almost the same compatibility; you can crank up the anisotropic filtering and it will look a lot better. But unless you have a fast CPU, like a Pentium 4, Athlon XP or Athlon 64, it's going to be CPU bound and you're not going to benefit much from its speed. Is that what you want?
A 3dfx Voodoo 1 will run ancient 3D-accelerated DOS games that are very difficult to run on more recent hardware; anything more modern than Half-Life will chug badly. It's "authentic", what these games would have really felt like back in the day, ca. 1996-1998. Is that what you want?
A 3dfx Voodoo 5 is the fastest Voodoo card you can get your hands on relatively easily, and it has a cult following like the Harley-Davidson motorcycle. When the card was released most people kind of thought it was a piece of shit, just like the Harley. Good Glide compatibility, about as fast as a GeForce 256, quite expensive.
The GeForce3 is the first DirectX 8 card from Nvidia. It's quite fast for Windows 98 gaming; AA is somewhat usable, whereas SSAA really wasn't, even in quite ancient games. Anisotropic filtering is done correctly without optimization (i.e. if you ask for 16x, you get 16x even where you don't need it), but that makes it slow to the point of being unusable (maybe 2-4x is OK). Is this what you want?
The GeForce4 Ti 4200 is probably cheaper and a little bit faster, but otherwise the same thing as a GeForce3.
The GeForce4 MX 440 or GeForce FX 5200 are cheaper than dirt and plenty powerful for anything made before 2000. Except for AA and AF there's no need for anything faster if you're into late-90s gaming. These are available fanless.
Yea BeastGamer answered this well. I think for the price it's a great card to have and a ton of games work well.
It's getting very hard to find the GeForce3 Ti 200, guys.
Check out the part where I search on eBay. I found good stock at fair prices.
That is so frustrating. I have a Radeon X850 Pro running on an Athlon XP 2400+ and you still get more FPS in Half-Life 2 with your Ti 200...
Maybe he isn't running the Steam version, which works with the OpenGL API.
You should try the D3D API version.
But yours runs in DirectX 9, and likely the newer Steam version with upgrades.
I have the original release from 2004, which should be running on DirectX 9? I only get 8500 points in 3DMark 2000 (no matter what resolution and level of detail I use). His GF2 GTS is faster than mine ^^
3DMark2000 is not that reliable with newer cards. Try 2001. Also in HL2, can you switch to DX8 render mode like I did in the video?
was this the graphics for the original xbox
Yes. The Xbox had something similar to a GeForce 3 Ti.
The Xbox's GPU performance is comparable to a GeForce 3 Ti500.
Nvidia was almost always second best in terms of raw performance, except for the 8000 series GPUs if I recall correctly. They just had better drivers and a better PR machine. Sucks that the world works like this.
ATi Radeon 8x00 and 9x00 cards were amazing back then.
No Halo? :(
TheSynrgy1987 I've tried Halo on my TI200 before but forget how well it actually performed...kind of want to see what it does now.
I got a Pentium III 866 MHz, ASUS CUV4X-C, 256 MB RAM, and a GeForce3 Ti 200. The OS is Win 98 SE. I cannot run it in AGP 4x; the drivers say only 2x. Nothing helped, neither the latest chipset (VIA VT82C694X) drivers nor the latest Nvidia drivers. The latest Nvidia drivers only caused problems, so I installed 56.64. The BIOS is on the latest non-beta version. Does anybody know about this problem? I know there is barely a difference between 4x and 2x and that the CPU will hold my GPU back anyway, but I cannot sleep anymore. Btw nice video.
Greetings from Germany :D
Yes, this is an issue with the VIA chipset. It isn't stable at 4x, so to protect the system the NV driver will switch to 2x if it detects a VIA chipset. You can switch to a Radeon card, which will run at 4x, but you might see crashes. If your BIOS has an AGP drive strength option, you can play around with that and get the Radeon stable. But yea, on a Pentium III system I recommend an Intel 815 chipset board.
Okay thanks a lot for your fast reply. :)
interesting
can it run Resident Evil 1?
I'm not familiar with that game, sounds a bit old for this card?
PhilsComputerLab
Oh OK, because this game is from the Win 95 era.
Electromyne's prices are extortionate!
€60 for a GeForce 9800 GT = £54 British pounds to me... £54 for a 9800 GT! That's available on eBay for around £15-£20!
Gamerdude Tech Buyer beware, lots of stuff is cheap, other stuff isn't. You've got to consider all your options.
Too bad it's a Medion :P
But nice video.
Ez OC
Real PC gamers play Doom 3 at a 15 fps average.
This was one of the worst