Bad API support is often fixable if you look at the hardware side a bit differently than the original driver devs did. Instead of just mapping function to function, you look at clusters of operations in terms of what they're used for; often you can either cut out function calls or outright replace them once you understand what a specific cluster is trying to achieve. I don't remember any examples off the top of my head, but there are multiple documents, like whitepapers, that go into DX9 and DX10 function calls in clusters in the stack - sort of like MADD optimization, but for more complex calls and in more heterogeneous streams. For example, op1, op2, op3 repeating can be substituted for just op1 and op2, with op3 called only every five op1/op2 pairs, if that makes sense. It depends a bit on how deep the stack is and what is kept in cache, but there may be a lot of performance to be gained from this sort of analysis and from removing redundant calls - gains on the order of 20-50% for certain clusters of calls. Both Nvidia and AMD did similar things a few times, in particular going from the first driver releases for the HD 2900 and 8800 series to the drivers released after the next-gen chips had shipped. OpenGL is usually less 'open' to this sort of thing, but DirectX very much is.
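(To make the substitution idea above concrete, here's a toy peephole pass over a command stream - all op names are hypothetical, and this is only a sketch of the pattern-replacement concept, not how any real driver is implemented:)

```python
# Toy peephole optimizer: collapse a repeating cluster of ops into a
# cheaper equivalent, as a driver might after analyzing call patterns.
# All op names here are made up, purely for illustration.

def optimize(stream, pattern, replacement):
    """Replace every occurrence of `pattern` in `stream` with `replacement`."""
    out = []
    i = 0
    n = len(pattern)
    while i < len(stream):
        if stream[i:i + n] == pattern:
            out.extend(replacement)   # one fused call instead of the cluster
            i += n
        else:
            out.append(stream[i])
            i += 1
    return out

# A repeating op1/op2/op3 cluster gets fused into a single combined call.
calls = ["op1", "op2", "op3"] * 3 + ["present"]
optimized = optimize(calls, ["op1", "op2", "op3"], ["op123_fused"])
print(optimized)  # ['op123_fused', 'op123_fused', 'op123_fused', 'present']
```

The same mechanism covers the "op3 only every N pairs" case: the replacement list just omits the redundant op.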
Awesome video mate, nice to see someone finally did a review on it. I was approached by my supplier back in those days to try one, but I never did. I used to run a big internet cafe in Turkey, but I stuck with my ATI cards. Been using ATI and AMD for the last 35 years and never looked back. Keep up the great work, love your content
They also made the Chrome 400 and 500 series, going by information on Wikipedia. There are only a few low-quality videos about them on YouTube. They would be interesting to cover, as they are DX10-capable cards.
Longtime fan. Glad to see you're still making videos. I had an S3 Savage4 Pro for a while in 99/00 but replaced it with a Creative Labs Geforce2 GTS in summer 2000 and never looked back.
That must have been a huge upgrade, Geforce 2 GTS was a beast at the time. I had an S3 ProSavage integrated chip (integrated Savage4) for a while, and it was a really weak graphics card, even back in the day. It struggled to run basically anything 3D, and I had to mostly use a low resolution of 640x480, when most gamers were already using 1024x768 or higher. I remember being frustrated by that.
Most people don't remember this because it was a long time ago and the world was different back then: you couldn't just hop on Amazon and pick up whatever the best thing was, you basically went to the store and got what they had. So I picked up an S3 Savage 4, and thank goodness it was not great but not terrible, because that's what I got
I never thought S3 would go on after the DeltaChrome and IX - those were the last S3 chips I ever saw in the wild. The very first computer of my own had the ProChrome chipset built into the motherboard. It really sucked (it was outperformed by a wide margin by a GeForce2 MX400), but was pretty fun to experiment with (S3TC and so on). Suffice it to say, it was quickly upgraded - can't have something weaker than the one in daddy's PC 😀
This is excellent! I always wondered what happened with those cards. Even with the middling performance, it's a shame they weren't reviewed upon release. But if you factor in the lack of board partner support, having to purchase them directly and their incredibly tardy introduction, it does make much more sense!
VDesk is what was called Workspaces in the Linux world at the time. It seems like an interesting bit of code borrowing, since the limit is 4, which was also the standard for Linux back then.
Even though they weren't doing well, they experimented with bleeding-edge technology that eventually found its way into recent video processors from AMD and, finally, NVIDIA: S3 were apparently experimenting with superscalar out-of-order vector units to see how much of a performance boost they could get out of them (easy answer: a SIGNIFICANT one - recent GPUs sporting superscalar out-of-order shaders are far more powerful than their in-order counterparts).
Öhh what? Have I walked into another world by mistake? a PixelPipes video released today? must enjoy with some coffee and contemplate the Mandela effect I am experiencing now when watching a PixelPipes video :D
Also: That sound that sounds like a sick cat noise at 20:00 and again at 20:04 & 20:08 had me looking around to see if it was my cat making noises. Probably most people won't notice but I have sensitive ears and headphones and I can clearly hear it so much that I had to back-skip the video and re-listen again. It's definitely there in the audio.
I think Savage 4 was the last "card" of theirs I used. Then there was the "pro savage" igp. I know there's the S3 Chrome 430GT which has dx10.1 support with 32 shaders, 4 pipes, 4 rops. 256mb ddr2 memory. Oh and the S3 Chrome 540 GTX
I remember the Savage 3D in the first PC I built back in the day... but after things like the Voodoo3, GeForce 256 and co., I never heard anything about them
I've been looking for one of these graphics cards for years, just on the premise that it seems to exist only in wiki lists of S3/VIA cards that supposedly existed. That alone piques my interest. These are the exact tests I wanted to run, too. Thanks for giving closure on a card that probably isn't actually worth seeking out unless you're trying to start a museum of obsolescence. Which I totally would do, given the funding.
Thanks for the review! I've read somewhere that the GammaChrome had some hardware bugs in the memory interface, which they fixed in the later Chrome S25/S27. Sadly you could not buy any of the cards back then; I imported one Chrome S27 from the US to Germany in 2006 and it's still alive, along with some S25, 430GT and 440GTX (DX10.1) cards. 😅 I was an S3 fanboy back then. I hoped they would land one lucky punch to stay in the business, but VIA does not seem to have any interest in it
Actually, I loved my Savage 4 16MB VGA. Unreal looked awesome on it and the performance was really good for the price and features, in my humble opinion that I greatly respect. I still regret not getting a Savage 2000 about 20 years ago (when it was dirt cheap after its failure) for my collection. Somehow, at that time I was giving away every piece of last-gen hardware to my family and friends without ever looking back. The only thing I kept was a Pentium MMX 233 from 1997.
Thanks for that review! I honestly thought S3 3D cards (not integrated solutions) died with the Savage 2000, so this is an interesting learning! And I'm surprised that they could at least compete with some mid range solutions (and was baffled you could find drivers for it). I had a Trio64, a good 2D performer, and also a Savage4, which was probably in the same league - not too fast, but also not slow and capable enough for most games. It's a pity they rushed the S2000 (broken T&L) and ultimately failed to compete, despite having such a large market share in 2D and early 3D times...
I had a Manli S3 Savage 4 card with 32MB SDRAM; it was the first card I ever bought. It was much cheaper than the GeForce2 MX 32MB a schoolmate of mine had, and I thought I was more clever than him, getting a 3D card with as much VRAM as his for almost half the price! Little did I know back then... The card was just weak, the drivers were a pain and very simple, and when I paired it with a Celeron 800MHz CPU it needed 5 minutes to show the VBIOS on screen for some weird reason... It's the only card I ever threw away: after I bought my Kyro II 64MB, which was a dream compared to that crap and very good overall, I didn't even want to sell it or give it to a friend or relative, I just threw it in the trash. Happy to see you back! Very nice video for a mediocre card, but one I had never heard of before. Cheers from Greece!
It is quite possible that it was a 32-bit card (POST times come down to the motherboard - I have one that behaves the same with any card). I recently acquired a 32MB Savage4 with a 32-bit bus... and performance is awful. Even a Pentium II 350 isn't a bottleneck, it's that slow. My full 64-bit 16MB Savage4 (same manufacturer as the other, identical PCB), on the other hand, trades blows with a Riva TNT in some games and a TNT2 Vanta in others. It barely scales beyond the Celeron 500, however - swapping in a PIII 1000 saw only a couple of FPS improvement. The Vanta, by contrast, thrashes the Savage4 with the PIII 1000. I might at some point trial it with the Celeron 633 that came with one of my boards just to see how it performs - I've never put that in a working board, however, so no idea if it's functioning.
Man, even so, now I want to get one myself. As a collector of rare cards, this makes my hardware heart go wild. Thanks so much for doing this one; I really feel I need to get one. Good work mate, my passion is lit up ;)
Anisotropy in OpenGL is a floating-point number. So you found a weird gem that allows non-power-of-two levels to be forced in the driver, but you still haven't seen anything expose the -full- range of anisotropy :)
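(For context: the relevant extension, EXT_texture_filter_anisotropic, does take a float - any value from 1.0 up to the queried maximum is legal, not just powers of two. A rough Python sketch of the difference between what the API accepts and what a typical driver panel exposes; the 16.0 maximum and the power-of-two presets are assumptions based on common hardware, not something from the video:)

```python
# The API accepts any float in [1.0, max_aniso]; driver control panels
# typically only expose power-of-two presets (2x/4x/8x/16x).

MAX_ANISO = 16.0  # what glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT) commonly reports

def api_clamp(requested):
    """What the extension spec allows: any float, clamped to [1.0, MAX_ANISO]."""
    return min(max(requested, 1.0), MAX_ANISO)

def panel_clamp(requested):
    """What a typical driver panel exposes: the power-of-two preset at or below."""
    level = 1.0
    while level * 2 <= min(requested, MAX_ANISO):
        level *= 2
    return level

print(api_clamp(6.0))    # 6.0  -- perfectly legal per the extension
print(panel_clamp(6.0))  # 4.0  -- a power-of-two UI rounds this down
```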
I kind of remember the S3 web shop, as at that time I was in the market for a graphics card, and my previous card was a Savage4, so I thought: why not buy the next S3? But there were two big problems: the price was surprisingly high compared to other products at the time, and it was not available for shipping to Europe - and shipping from the US back then was more difficult than it is nowadays. I always liked quirky hardware, so I was a bit sad that there was no option for me to buy one.
The Trio 64v+ was legendary back in the day, so much that it was emulated in virtual machine software. Shame that their 3D offerings paled in comparison though they did have the last laugh with S3TC compression.
PixelPipes is here again. We missed you and your great videos. Recapping is mandatory with a lot of retro stuff. I recapped my Gigabyte GA-7N400 Pro2 Socket 462 mobo, with great effort, using quality Japanese solid caps, and it runs like a charm.
I was reading about older cards a while ago. When I was a kid, a friend of mine had a Savage4 paired with a K6-2, and it didn't perform as expected. Compared with a TNT2 M64 and a similar Celeron, it sometimes took a beating. From reading about these on Anandtech, it wasn't that the card was bad; it was a driver thing. It seems K6-2s were decent for gaming if you used Voodoos, as those had 3DNow! driver support. When you paired a K6-2 with Nvidia or S3, things were not good. Details, details... Unfortunately, I never saw Savage-class cards running on Intel, but they seem to have been good enough. After that, though, not really...
Really like the fan shroud design; it's a shame it doesn't perform as well as it looks. Maybe the reason you got two coolers was to swap the sticker over from the new one to the old one.
Should have brought in a 6 series nvidia card, since it was the generational competitor to the ati x series. I remember having a 6600gt at the time, was a great card.
I don't think the characterisation that the classic ViRGE is slower than software rendering is entirely fair to the hardware, though it is fair towards the product. The driver overhead was high and driver compatibility was bad, but you also end up comparing 640x480 hardware-accelerated rendering to 320x200 software rendering - trying to paint almost five times as many pixels! If it had been possible to use compromise resolutions such as 512x384 or 400x300 in S3D games, it would have made a much nicer impression. If there were more than 3 S3D games, that is. It's interesting to see this one. It's not too surprising to see drivers being slow and buggy for a card that never really released, but that may have been one of the things that doomed it, alongside the delayed release.
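(The pixel-count arithmetic behind that comparison, worked out:)

```python
# Fill workload at the resolutions mentioned: hardware-accelerated S3D
# rendering pushed nearly five times the pixels of typical software modes.
pixels = {
    "320x200 (software)":   320 * 200,   # 64,000 pixels per frame
    "512x384 (compromise)": 512 * 384,   # 196,608 pixels per frame
    "640x480 (S3D hw)":     640 * 480,   # 307,200 pixels per frame
}
ratio = pixels["640x480 (S3D hw)"] / pixels["320x200 (software)"]
print(round(ratio, 1))  # 4.8 -- roughly the "five times as many pixels"
```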
During that period, the CEO of VIA was touting how much cheaper their stuff was and how little die area it took up. I think this is why the chip got a massive cut in pipelines. The husband-and-wife team at the top of VIA (the wife is also CEO of HTC - yes, THAT HTC) likely were never in it for the business to make money, but to make money from the company stock. Long-term investors in their stock were usually screwed in the end.
I mean, it's better than a GeForce MX, but TNTs and Voodoo 3s were damn near free by then... having to mail-order it when a GeForce FX was available almost free makes it a hard sell... but for an S3 card it's not terrible
I can't help but wonder if they had yield issues with the 8 pipe chip design and decided to cut down to 4 in the final hour to salvage silicon. It would have been interesting to see how the cards would have performed with the full 8 pipes.
I am not sure what card it was I had, but I am sure I recall having an AMD card with an AGP / PCIe bridge chip, it might even have bridged a newer PCIe format GPU to the older AGP standard.
been watching for years, you're really good at this, the channel deserves more subs but I guess the niche is too small, perhaps branch out into other channels/types of reviews in the computing space?
The amount of disappointment and frustration VIA is known for makes me wonder _how_ a consistently bad company is still allowed to keep selling products to a market long since apathetic to their shenanigans. Side bar: so glad to see you are back, my dude! 😎
Some of their earlier products and late-2000s stuff were genuinely pretty good, but yeah... their quality was inconsistent, and mostly landed on the bad side.
This is all valid - but how does it run FEAR?
and can it run Crysis? xD
You know what? You have a damn good point, and I probably should have tested it in this video. But while I still have the card here, better late than never! So here's the results...
I used the same settings as my 5800 Ultra vs 9700 Pro video (th-cam.com/video/qN4O2IqlS84/w-d-xo.html), which is 1024x768, mostly max except volumetric lights turned off, and light detail is set to minimum.
(graph image: i.imgur.com/aeqoqnK.png)
▶FEAR Results:
*GammaChrome S18 Pro:* 26.8fps AVG (9fps 1% lows)
*GeForce PCX 5750:* 9.6fps AVG (4fps 1% lows)
*Radeon X600XT:* 28.4fps AVG (10fps 1% lows)
*Radeon X700 Pro:* 59.7fps AVG (23fps 1% lows)
Analysis -- The GammaChrome S18 Pro proves once again that its shader performance is head and shoulders above the FX architecture, performing right on par with the X600XT. With a few tweaks this game would have been very playable for gamers at the time, and is honestly one of the more impressive showings for the S3 card! Thank you @Halon1234 for prompting this comparison!
@@PixelPipes hey cool of you to do this
I really appreciate you going to the trouble. If it handled that without errors, I’ll give S3’s driver team kudos. The 5750’s numbers look about like I’d expect - as the former owner of a 5900XT, that’s almost nostalgic.
@@Halon1234 Still have my 5900XT resting in the original box, didn't even open the dongles that came with it. Otherwise it was quite an underperformer but a nostalgic piece of history.
Fascinating card. Not good, but fascinating. I learned something today, great video!
Wow! Amazing job with this review, Nathan. I honestly don’t think anyone could’ve done a better job. This card is really underwhelming and has no real significance, but I’m guessing that’s what you love most about it 😊
It’s kinda crazy this is the first review nearly 20 years after its launch.
Man, I miss making videos…
Thank you Mike for EVERYTHING. Even if the results aren't exciting, it feels like a stone left unturned that needed addressing. Too bad after 20 years I didn't have any good news to share, lol.
@@F2FTech so why not start again?
Think he can’t because he works for nvidia now if I remember correctly
Thank you for doing this! Just because some product fell short of expectations or promises and not a lot of it was sold doesn't mean it deserves to be forgotten. If we really care about the history of GPUs, it's important to cover all of them and tell the story of every obscure model.
I agree. There are also still plenty of video cards out there in the annals of weird and obscure computing history waiting to be researched, which PixelPipes videos have done a good job of.
Great video, and a great showing of the best games from back in the day...
I particularly like the product info and backstory you give in each of your videos, Nathan. Very enjoyable to watch and learn from.
It would have been really cool to see S3 Graphics on the market today. Unfortunately, like many smaller tech companies, they ended up being acquired by a Chinese company.
BTW, that's an awesome cover on this rare gem of a video card. I wonder how many gems like this Mike has? 🤭
Love to F2F and his family; can't begrudge him for dipping out given how it is these days. One hell of a loaner, given these were rare as hen's teeth back in the day.
Right back at you 🥰
We missed you and we miss F2F too!
Awesome collab from "beyond the grave" and great beauty shots of the card! You gave it more love and attention than S3 ever did!
Looking forward to whatever you're working on next, even if it's about something no one was interested in then, because your audience is DEFINITELY interested now!
I’m still alive…I promise 😅
Thanks for the 🥰
@@F2FTech LFG!!! ❤️💚💙
Well well...never thought id see another video from you...awesome!!!!! Thanks a ton and happy to see a new video
sometimes it really is a matter of relegating yourself to a project, letting people give it a go, eyeing the competition, and then completely dropping everything because you have a piece of crap that cannot and should not be widely available. also ALSO you have an extremely good channel and i like your homegrown ability to disseminate information, talk about stuff, and acknowledge the utter nothing burger phenomena that some pieces of stepladder tech are. hahaha what a colossal hunk of shit, excellent video
The cooler is beautiful. I would definitely hang this on my wall as decoration.
That S3 ever made anything at all with the PCIe interface is news to me! This might not have been the most interesting product for you to review but I sure found it interesting! Thanks!
@@greenmoose_ they even made a much better card after that, the Chrome S27. It had the highest chip clock at the time and competed relatively well in the midrange class. I'm lucky to own one of these - sadly not two, because you could run two together in SLI, aka MultiChrome.
@@WuDMatthauser oh wow! I had no idea, you've got me looking for reviews of that now!
@@WuDMatthauser I had an S27 and it was roughly equivalent to a Radeon 9600 or 9600 Pro. I played Far Cry on it and it ran fairly well at medium high settings. Very reasonable card for its time.
S3TC was the shit back in the day, and still supported in some form today!
@@TheWarmotor Yep, surprisingly enough, most recent video cards, like Radeon DNA as well as GeForce Ada Lovelace video processors, still support it.
The reason VIA Technologies bought out S3 was for their integrated line of chipsets. To get sales in the OEM/ODM market, onboard graphics was a must. Intel had onboard graphics for their Pentium II/III line of CPUs with the i810 chipset, while VIA really had nothing in terms of fully integrated chipsets; they would have had to rely on a motherboard manufacturer putting a discrete GPU directly on the motherboard, resulting in far higher board costs and in being snubbed by OEMs. With the purchase of S3, VIA was able to take the Savage4 GPU core and integrate it into the northbridge of certain VIA chipsets (the VIA Apollo MVP4, for instance). This also had a knock-on effect: VIA was the first company to introduce and ratify the Mini-ITX standard with their line of VIA EPIA motherboards. These boards included a CPU, chipset, and all the features (minus memory) needed to produce a very low-cost system, and they came with chipsets that included S3 Savage/Delta iGPUs.
Wait a second... Is this a card from the legendary S3 manufacturer?
Legends ❤
I love your retro power build, nice job getting that E8600 stable @ 4.5Ghz.
Wow, I had no idea about these. Great video as always!
Oh cool - a new PixelPipes episode. I always enjoy what you put out.
Always good to see an upload from you Nathan.
Myself and likely most others only have experience with S3 in the OEM space so this was quite the learning experience.
I had a lot of fun around 1999-2000ish with the S3 Diamond Viper 2. it was exceptional for Unreal Tournament and a good bang for the buck card. It also handled 32 bit color really well.
Thanks for finally showing the world what this card can do, even if nobody remembers it. Kudos to F2F for loaning it too, I miss his content so much! Ever since that Acquired Tech episode I was wondering when a review of this card would see the light of day, and what a better way than under the PixelPipes microscope!
100% Agree. Nathan excels in every video he creates. Thanks for the love and look forward to your upcoming videos.
PIXELPIPES!!! YEAHHH!!!! Thanks for another GPU deep dive! Even if it wasn't an interesting GPU, it's interesting because YOU covered it! I'm just amazed that S3 managed to make a D3D9-capable architecture in the first place 😆
Aw thank you! Yeah very true, the shader performance isn't nearly as bad as expected, and a huge step up from XGI cards.
The video deserves a comment and to be seen, even though the card doesn't.
I am so happy to see you back. I hope I am not alone with wanting to see you more often on youtube.
Pixel Pipes posting on TH-cam makes my day. No matter how often we get the privilege.
Very interesting; it could have been a different story if they'd kept the 8 pipes from the S8 and launched on time. Thanks for taking the time for this review!
Nice rundown on S3 history.
The first PC I built (1995) had a Diamond Stealth 64 VRAM 4MB PCI, which had the S3 Vision968. Toss up between that and the Matrox Millennium we had in our main PC for the best 2D only consumer video card ever made.
I put it in what was considered a SFF build at the time - used a late production socket 3 intel integrated I/O baby AT board in a rather short (for the time) mini-tower, at a time when flat cases were the norm for consumer PCs. It was a little bigger than some of the larger ITX builds you see today. I bought an intel 486DX2-66 for it, along with one of those Creative Labs bundles with a 4x CD-ROM and SB16.
A couple months later, a BIOS update added multiplier overclocking (I think FSB was already changeable in BIOS when I built it). I ended up running that CPU at 120MHz with a 40MHz FSB, effectively turning an intel DX2-66 into an Am486 DX4-120, lol. It performed like a P75 in integer math and like a P60 in FP. So Quake ran decently on it, something it would struggle with at its stock clocks.
I know I have an S3 Virge-based card around here somewhere, but my first 3D card was a RIVA TNT. That's the card that started it all for Nvidia. It was the first affordable 3D card that was also good. It was launched towards the end of March, 1998 for $199 and, by the summer, I was able to get one for $149, a massive difference from modern Nvidia. I can't imagine they netted a whole lot of profit on that GPU, but it was a good business decision in the long run. It put what is now the world's largest GPU maker on the map.
I love these reviews of obscure graphics cards, I find them very interesting and entertaining; they are interesting in and of themselves simply because I have never heard of this card.
Finally, someone who still remembers S3 Graphics!
What about the Chrome 540 GTX?
Interesting card. GPU hardware was developing at an astonishing rate back then. Nothing stayed top dog for long. Most low and mid range cards came and went without much fanfare. I'm guessing that given the sheer number of hardware combinations, games, DX versions, resolutions, dithering and filtering options available, there was some sweet spot where the S3 would take the top spot for 5 minutes. 😉
I had the Savage 3D purely because I played UT99 to death, and back in the day UT had an S3 MeTaL option; the textures were great compared to the OGL and DX renderers.
@@KirstenleeCinquetti exactly, the textures were crisp and looked gorgeous with S3 metal enabled.
Glad you’re back
Excellent and detailed review! Glad to see you making videos again. Hope to see your videos more regularly. 😁
BEAUTIFUL REVAMP😊
Awesome video, I sold my S3 collection to buy 2x CHA001 Centaur Technology motherboards. I had S25s, S27s, a 530 GT, 540 GTX and a whole slew of 5400EW cards. I could see the progression and then the stall in performance.
If you find a UH4/UH8/UD8/5400EW on the Internet, then it's probably the one I obtained by emailing a bunch of video wall makers, lol. They really like to hold their drivers close to their heart. The 5400EW's device ID slots nicely into the Chrome 400/500 drivers. I was able to use the 5400EW with Chrome drivers, but there is no advantage in doing so. The last 5400EW drivers are dated 2016, so it's weird to see them use the same product since 2009. The 5400EW is clocked a bit faster than the Chrome 645/640, and can slightly beat it in gaming.
I've been messing around with a Zhaoxin C960 and it still has the same problems as the 5400EW and Chrome 645/640: bad DirectX 10 support and very poor OpenGL performance.
Bad API support is often fixable if one looks at the hardware side a bit differently than the original driver devs did. Instead of just mapping function to function, so to speak, you look at clusters of operations in terms of what they are used for; often you can either cut out function calls or outright replace them once you understand what a specific cluster is trying to achieve. I don't remember any examples off the top of my head, but there are multiple documents, like whitepapers, that go into (for example) DX9 and DX10 function calls in clusters in the stack. I mean something like MADD optimization, but for more complex calls and in more heterogeneous streams: a repeating op1, op2, op3 pattern can be substituted with just op1 and op2, with op3 called only every 5 op1/op2 pairs, if that makes sense. It depends a bit on how deep the stack is and what is kept in cache, but there may be a lot of performance to be gained from this sort of analysis and from removing redundant calls - gains on the order of 20-50% for certain clusters of calls. Both Nvidia and AMD did similar things a few times, in particular going from the first driver releases for the HD 2900 and 8800 series to the drivers released after the following generation of chips had shipped.
OpenGL is usually less "open" to this sort of thing, but DirectX very much is.
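To make the idea above concrete, here's a toy sketch of one form of redundant-call elimination: dropping state-setting calls that re-apply a value already in effect. This is purely illustrative - the command stream, state names, and helper function are all hypothetical, not taken from any real driver.

```python
# Toy sketch of redundant-call elimination in a driver-style command stream.
# A "call" here is just a (state, value) pair; re-setting a state to the
# value it already holds is a no-op and can be dropped from the stream.

def collapse_redundant_sets(stream):
    """Drop set-calls that re-apply a state value already in effect."""
    current = {}   # state -> last value actually applied
    out = []
    for state, value in stream:
        if current.get(state) != value:
            out.append((state, value))
            current[state] = value
    return out

stream = [
    ("blend", "on"), ("tex", 1),
    ("blend", "on"),            # redundant: blend is already on
    ("tex", 1),                 # redundant: texture 1 is already bound
    ("blend", "off"), ("tex", 2),
]
print(collapse_redundant_sets(stream))
# -> [('blend', 'on'), ('tex', 1), ('blend', 'off'), ('tex', 2)]
```

Real drivers do far more sophisticated pattern matching than this, but the basic payoff is the same: fewer calls reaching the hardware for the same final state.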
Awesome video mate, nice to see someone finally did a review of it.
I was approached by my supplier back in those days to try one, but I never did. I used to run a big internet cafe in Turkey, but I stuck with my ATI cards. Been using ATI and AMD for the last 35 years and never looked back. Keep up the great work, love your content.
They also made the Chrome 400 and 500 series, going by information on Wikipedia. There are only a few low-quality videos about them on YouTube. They would be interesting to cover, as they are DX10-capable cards.
Longtime fan. Glad to see you're still making videos. I had an S3 Savage4 Pro for a while in 99/00 but replaced it with a Creative Labs Geforce2 GTS in summer 2000 and never looked back.
That must have been a huge upgrade, Geforce 2 GTS was a beast at the time. I had an S3 ProSavage integrated chip (integrated Savage4) for a while, and it was a really weak graphics card, even back in the day. It struggled to run basically anything 3D, and I had to mostly use a low resolution of 640x480, when most gamers were already using 1024x768 or higher. I remember being frustrated by that.
Well, the card is so obscure that you are basically unveiling its history. Thanks for writing a new old page of tech history.
Great video! It was very interesting!
Happy to see new videos again
I had the S3 Savage 4 pro. Ah nostalgia.
Most people don't remember this because it was a long time ago and the world was different back then: you couldn't just hop on Amazon and pick up whatever the best thing was, you basically went to the store and you got what they had. So I picked up an S3 Savage 4, and thank goodness it was not great but not terrible, because that's what I got.
I never thought S3 would go on after the DeltaChrome and IX - those were the latest S3 chips I ever saw in the wild.
My very first computer of my own had the ProChrome chipset built into the motherboard. It really sucked (outperformed a lot by a geforce 2 mx400), but was pretty fun to experiment with (S3TC and so on). Suffice to say, it was quickly upgraded - can't have something weaker than the one in daddy's PC 😀
Good to see you back man!
This is excellent! I always wondered what happened with those cards. Even with the middling performance, it's a shame they weren't reviewed upon release.
But if you factor in the lack of board partner support, having to purchase them directly and their incredibly tardy introduction, it does make much more sense!
our man is back 😎
VDesk is what was called Workspaces in the Linux world at the time. Seems like an interesting bit of code borrowing, since the limit is 4, which was also the standard on Linux back then.
Even though they weren't doing well, they experimented with bleeding-edge technology that eventually found its way into recent video processors from AMD and, finally, Nvidia. S3 apparently were experimenting with superscalar out-of-order vector units to see how much of a performance boost they could get out of them (easy answer: a significant one - recent GPUs sporting superscalar out-of-order shaders are far more powerful than in-order counterparts).
PixelPipes drops a video and its a good day indeed.
Öhh what? Have I walked into another world by mistake? a PixelPipes video released today? must enjoy with some coffee and contemplate the Mandela effect I am experiencing now when watching a PixelPipes video :D
Also: that sound like a sick cat at 20:00 and again at 20:04 & 20:08 had me looking around to see if it was my cat making noises. Most people probably won't notice, but I have sensitive ears and headphones, and I can hear it so clearly that I had to skip back and re-listen. It's definitely there in the audio.
Wow this was really cool! I'm a sucker for late 90s / early-to-mid 00s graphics cards.
Recently found your channel and subbed, very nice content!
I think Savage 4 was the last "card" of theirs I used. Then there was the "pro savage" igp.
I know there's the S3 Chrome 430GT, which has DX10.1 support with 32 shaders, 4 pipes, 4 ROPs, and 256MB of DDR2 memory. Oh, and the S3 Chrome 540 GTX.
I remember the Savage 3D in the first PC I built back in the day... but after things like the Voodoo3, GeForce 256 and co, I never heard anything about them.
What a blast from the past, I didn't realize S3 made GPUs in the age of FC-PGA.
I think I remember seeing a review of this on FiringSquad.
That was really a very interesting video and card, so thank you.
wow not seen one of these in ages, I probably still have the odd newer 400 or 500 series kicking about.
I played HL2 on Virge DX or Trio. It was a suffering exercise but it worked to a degree.
Thanks for the great review of this very interesting and rare card!
I've been looking for one of these graphics cards for years, just on the premise that it seems to exist only in wiki lists of S3/VIA cards that supposedly existed. That alone piques my interest. These are the exact tests I wanted to run too. Thanks for giving closure on a card that probably isn't actually worth seeking out unless you're trying to start a museum of obsolescence. Which I totally would do, given the funding.
lol you're welcome
Thanks for the review! I've read somewhere that the GammaChrome had some hardware bugs in the memory interface, which they fixed in the later Chrome S25/S27. Sadly you could not buy any of the cards back then; I imported one Chrome S27 from the US to Germany in 2006 and it's still alive, along with some S25, 430GT and 440GTX (DX10.1) cards. 😅 S3 fanboy back then. I hoped they would land one lucky punch to stay in the business, but VIA does not seem to have any interest in it.
Actually, I loved my Savage 4 16MB VGA. Unreal looked awesome on it and the performance was really good for the price and features, in my humble opinion that I greatly respect. I still regret not getting a Savage 2000 about 20 years ago (when it was dirt cheap after its failure) for my collection. Somehow, at that time I was giving away every piece of last-gen hardware to my family and friends without ever looking back. The only thing I kept was a Pentium MMX 233 from 1997.
Hardly anyone expected that we'd have nostalgia for hardware that was thought of as "tools", so you're not alone
Here's a comment for the algorithm.
Thanks for that review! I honestly thought S3 3D cards (not integrated solutions) died with the Savage 2000, so this is an interesting learning! And I'm surprised that they could at least compete with some mid range solutions (and was baffled you could find drivers for it).
I had a Trio64, a good 2D performer, and also a Savage4, which was probably in the same league - not too fast, but also not slow and capable enough for most games. It's a pity they rushed the S2000 (broken T&L) and ultimately failed to compete, despite having such a large market share in 2D and early 3D times...
A very forgotten card, I had troubles getting info about this when I wrote its chapter. Thanks for the video!
thats pretty cool I have the Chrome 430GT card, I have a few mobo's with the gammaChrome iGPU but I dont know if its the S18 one.
Finally found proper geek channel. Thank you.
Yet another interesting video!
Back then I put my bets on S3, mainly due to S3 MeTaL in Unreal Tournament, which, while a bit buggy, was not looking bad at all.
Very, very interesting video! I haven't watched anything about S18 before. Thank you very much!
Thanks for the content.
It'd be nice if PowerVR/Imagination Technologies, Matrox, S3, or SGI came out with a card that KOs Arc and competes with AMD and Nvidia.
I had a Manli S3 Savage4 card with 32MB SDRAM; it was the first card I ever bought. It was much cheaper than the GeForce 2 MX 32MB a schoolmate of mine had, and I thought I was more clever than him, buying a 3D card with 32MB VRAM like his for almost half the price! Little did I know back then... The card was just weak, the drivers a pain and very simple, and when I paired it with a Celeron 800MHz CPU it needed 5 minutes to show the VBIOS on screen for some weird reason... It is the only card I ever threw away, after I bought my Kyro II 64MB, which was a dream compared to that crap and very good overall.
I didn't even want to sell it or give it to a friend or relative; I just threw it in the trash.
Happy to see you back! Very nice video for a mediocre card but one I had never heard before, Cheers from Greece!
It is quite possible that it was a 32-bit card (POST times come down to the motherboard - I have one that behaves the same with any card). Recently acquired a 32MB Savage4 with a 32-bit bus... and performance is awful. Even a Pentium II 350 isn't a bottleneck; it is that slow.
My full 64-bit 16MB Savage4 (same manufacturer as the other, identical PCB), on the other hand, trades blows with a Riva TNT in some games, TNT2 Vanta in others. It barely scales beyond the Celeron 500, however - swapping it for a PIII 1000 saw only a couple of FPS improvement. The Vanta, however, thrashes the Savage4 with the PIII 1000. Might at some point trial it with the Celeron 633 that came with one of my boards just to see how it performs - I never put that in a working board, however, so no idea if it is functioning.
Man, even so, now I want to get one myself. As a collector of rare cards, this makes my hardware heart go wild. Thanks so much for doing this one; I really feel I need to get one. Good work mate, my passion is lit up ;)
Anisotropy in OpenGL is a floating-point number.
So you found a weird gem that allows non-power-of-two levels to be forced in the driver, but you still haven't seen anything expose the *full* range of anisotropy :)
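To illustrate the point above: the OpenGL anisotropy parameter (`GL_TEXTURE_MAX_ANISOTROPY_EXT` from the `EXT_texture_filter_anisotropic` extension) really is a float, so any value in [1.0, hardware max] is legal at the API level, even though driver control panels typically only expose power-of-two levels. Here's a small sketch contrasting the two behaviors; the helper names and the 16x hardware maximum are illustrative assumptions, not from any real driver.

```python
import math

# GL_TEXTURE_MAX_ANISOTROPY_EXT is a float: the extension allows any
# value clamped to [1.0, hardware max], but control panels usually only
# let you pick power-of-two levels (2x, 4x, 8x, 16x).

def clamp_anisotropy(requested, hw_max=16.0):
    """What the API itself allows: any float, clamped to [1.0, hw_max]."""
    return min(max(requested, 1.0), hw_max)

def pow2_anisotropy(requested, hw_max=16.0):
    """What a typical control panel exposes: nearest power-of-two level."""
    clamped = clamp_anisotropy(requested, hw_max)
    return float(2 ** round(math.log2(clamped)))

print(clamp_anisotropy(6.0))   # 6.0 - a non-power-of-two level is legal
print(pow2_anisotropy(6.0))    # 8.0 - rounded to the nearest power of two
```

So a driver that lets you force, say, 6x anisotropy isn't violating the spec at all - it's just exposing more of the legal range than the usual power-of-two presets do.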
Who remembers running DUKE3D on an S3 ViRGE or ET6000 with UniVBE enabled?
I kind of remember the S3 webshop, as at that time I was in the market for a graphics card, and my previous card was a Savage4, so I thought: why not buy the next S3?
but there were two big problems:
- the price was surprisingly high compared to other products at that time
- it was not available for shipping from Europe, and shipping from the US at that time was more difficult than nowadays.
I always liked quirky hardware, so I was a bit sad that there was no option for me to buy one.
I meant to mention somewhere that I don’t think the webstore sold to Europe, but I forgot. Thanks for confirming!
The Trio 64v+ was legendary back in the day, so much that it was emulated in virtual machine software. Shame that their 3D offerings paled in comparison though they did have the last laugh with S3TC compression.
I'd like to see how exactly it compares to the S9, given that the S9 specs on paper seem better in some ways.
It rides Chrome and Shiny in the roads of Valhalla :v
PixelPipes is here again. We missed you and your great videos. Recapping is mandatory with a lot of retro stuff. I recapped my Gigabyte GA-7N400 Pro2 s462 mobo with great effort, using quality Japanese solid caps, and it runs like a charm.
I was reading about older cards a while ago. When I was a kid, a friend of mine had a Savage4 paired with a K6-2 and it didn't perform as expected.
Compared with the TNT2 M64 and a similar Celeron, it sometimes took a beating.
From reading about these on Anandtech, it wasn't that the card was bad, it was a driver thing. It seems K6-2s were decent for gaming if you used Voodoos, as they had 3DNow driver support.
When you paired the K6-2s with Nvidia or S3, things were not good. Details, details...
Unfortunately, I never saw Savage class cards running on Intel but they seem to have been good enough. After that though, not really...
I was so broke once I had to buy an FX 5200. It was awful, but better than my onboard. I was told not to buy it, but you live and learn.
Me too... It is the only card I want to forget owning, and my last Nvidia product. No regrets.
3:25 - *_"Improved virgins"_* -- excuse me????
🤣 Improved versions? Improved virges? Idk words are hard
Really like the fan shroud design; it's a shame it doesn't perform as well as it looks. Maybe the reason you got 2x coolers was to swap the sticker over from the new one to the old one.
Looks like a driver war more than a hardware war going on, with those 1% lows
Should have brought in a 6 series nvidia card, since it was the generational competitor to the ati x series. I remember having a 6600gt at the time, was a great card.
This is a rare card!!!
I don't think characterisation that the classic Virge is slower than software rendering is entirely fair to the hardware, though it is fair towards the product. The driver overhead was high, driver compatibility was bad, but also... you end up comparing 640x480 hardware accelerated rendering to 320x200 software rendering, trying to paint 5 times as many pixels! If it was possible to use compromise resolutions such as 512x384 or 400x300 in S3D games, it would have made a much nicer impression. If there were more than 3 S3D games.
It's interesting to see this one. It's not too surprising to see drivers being slow and buggy for a card that didn't end up releasing for real, but it may have been one of the things that doomed it, alongside delayed release.
We sold lots of ViRGEs; they were great cards to run alongside the Voodoos of the time.
looky-looky, a new video after all..
During that period of time, the CEO of VIA was touting how much cheaper their stuff was and how little die area it took up.
I think this is why the chip got a massive cut in the pipelines.
The husband and wife team of VIA CEO (wife is also CEO of HTC, yeah, THAT HTC) likely were never in it for the business to make money, but to make money from the company stock. Long term investors in their stock were usually screwed in the end.
I mean, it's better than a GeForce MX, but like TNTs and Voodoo 3s, those were damn near free... Having to web-order it, up against a GeForce FX available almost for free, is a hard sell... but for an S3 card it's not terrible.
I can't help but wonder if they had yield issues with the 8 pipe chip design and decided to cut down to 4 in the final hour to salvage silicon. It would have been interesting to see how the cards would have performed with the full 8 pipes.
Maybe we'll get lucky and a former S3 engineer will see this video and illuminate this for us
I am not sure what card it was I had, but I am sure I recall having an AMD card with an AGP / PCIe bridge chip, it might even have bridged a newer PCIe format GPU to the older AGP standard.
Think we could get a scan of that artwork? its great!
been watching for years, you're really good at this, the channel deserves more subs but I guess the niche is too small, perhaps branch out into other channels/types of reviews in the computing space?
I appreciate it! I'm really not worried about the numbers anymore. That quickly sucks the fun out of it
This was the kind of graphics card you'd find in a Packard Bell or a low-end HP.
The amount of disappointment and frustration VIA is known for makes me wonder _how_ a consistently bad company is still allowed to keep selling products to a market long since apathetic to their shenanigans.
Side bar: so glad to see you are back, my dude! 😎
Some of their earlier products and late 2000's stuff were genuinely pretty good but yeah... they were mostly inconsistently on the bad side of products.
What expansion slot is on that Matrox card?? It almost looks like a SODIMM slot?