I did play with the green flickering for some time. I had a Radeon 240, I think. To run games like GTA 5 at 30 fps, I overclocked it to the point that it shouldn't have worked. To fix it, I had to tinker with BIOS settings. I have no clue what I did 7 years ago, but I broke it a year later.
My 10th-gen i5 is still more powerful than that 😀 Back then, AMD decided to put more, but not so powerful, cores into its processors. I guess they were like, ohhh mate, if we add more cores it will be more powerful.
@@doublevendetta Ohh, I don't really recognize Quad FX, nor have I heard of it, but I didn't think it was that new. I thought that multi-core FX thing was from 2011. So sorry if I was wrong.
Man, those were the days when overclocking WAS a necessity out of the box, when you'd instantly get a 25% increase... It would be interesting to try, but I don't know how safe it is to do given the hardware's age now.
The Radeon VII was not vaporware, it was just completely snapped up by miners. I assembled rigs and personally mounted hundreds of these. At the end of ETH mining's life you could push 100 MH/s EACH at less than 200 W of power draw. It was not until the 4090 arrived that it was beaten, with 110 MH/s but at OVER 300 W, so in the end the Radeon VII remained the king of GPU mining.
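For what it's worth, the efficiency claim above checks out on paper. Here's a minimal Python sketch using only the figures quoted in the comment (real hashrates and power draw vary with memory straps, drivers, and silicon, so treat these as ballpark numbers):

```python
# Hashrate-per-watt for the two cards, using only the numbers quoted above:
# Radeon VII at ~100 MH/s / ~200 W, RTX 4090 at ~110 MH/s / ~300 W.
cards = {
    "Radeon VII": (100, 200),  # (hashrate in MH/s, power draw in watts)
    "RTX 4090": (110, 300),
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.2f} MH/s per watt")

# Radeon VII: 0.50 MH/s per watt
# RTX 4090:   0.37 MH/s per watt  ->  the VII is ~36% more efficient
```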
9:18 I thought Linus was going to break into song. I was slightly disappointed. As for the actual topic of the video, I used to run SLI (dual GPU)... I can't imagine throwing two more GPUs and a CPU in. It is bound to fail.
I have to wonder about the GPU choice for the upgrade that was tried here. I think a more appropriate one would be a 1050 or RX 580. They both would beat the crap out of that 760 and have better support under W7. Consider giving it a whirl, even if just in the lab.
Pls don't say FX, pls don't say FX, pls don't say FX, pls don't say FX. PLS! 1:20 later. Ohhhh crap. Sorry honey, no one respects you. Only me! My first CPU, running for at least 10 years: the AMD FX 8150!
I just had this running in the background while assembling a bike. I wasn't paying attention, but loved the throwing of Gary under the bus over a past CPU choice!
Could always try a quad Opteron setup based on the Piledriver platform using socket G34. You can get Opteron 6380s relatively cheap compared to the next step up, the 6386.🤔
I had a pair of GTX 260s in SLI ages ago, and in games that supported it well it would render on both GPUs and scale well enough. But it turns out the bottom GPU didn't output video on any of its ports, and I have no idea if it had always been like that or, if not, how long it was like that lol
How many LTT vids have I not noticed are dubbed or subbed in other languages? Is it something they do retroactively on older vids? I haven't seen it on new releases.
Without the context of another platform, such as the Core 2 Duo, I can't tell if it is actually slower. While you guys told me it's slower, I wasn't shown that it is. I think a better video would be to run the original Quad FX, then compare it with this upgrade path and its competitor. Then again, it would mean buying a bunch of e-waste, so I'm not sure what I would have done.
YT playback is something I've started to notice problems with on my 4th-gen laptop i7. The fans need to kick in if I have the webpage visible, but embeds or minimizing the page are fine. The page itself is somehow the problem.
Shout out to the optical audio users out there at the time! (I was only one due to a Best Buy warranty gift where I got "upgraded" after buying a floor unit for $100 to a surround sound $1100 unit, but still. It was amazing.)
Getting Event ID 86 with random shutdowns. Tried reinstalling Win 11, clean installing Win 11, and even Win 10, but to no avail. Is anyone else having the same problem? We need a fix, because a lot of people with Ryzen CPUs are facing it.
I wonder if the 3060 didn't run due to missing SSE4/4.2 instructions, maybe? As you say though, it would be massively overpowered -- running Ubuntu (with wine/dxvk/vkd3d for gaming) on an Ivy Bridge (which has about 1.5-2x the performance of Quad FX per Google), GravityMark can max out the card, but most games (that are not limited to 60FPS or monitor refresh rate) will top out at about 40% GPU utilization. The framerates are nothing to complain about, but the CPU core that's driving the card will just peg out well before the GPU breaks a sweat. I did get CP2077 up to 80% utilization by turning some of the graphics settings that were set to low up to medium though -- nice to be able to turn up settings with 0 FPS drop, since it looked rather bad on low, to be honest.
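If anyone wants to test that SSE4 theory on their own box, here's a minimal sketch. It's Linux-only (it reads /proc/cpuinfo), and note the SSE4 requirement is the commenter's speculation, not something NVIDIA documents:

```python
# List whether the CPU advertises the SSE4.1/SSE4.2 instruction-set flags.
# Linux-only: parses the "flags" line of /proc/cpuinfo.
def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for wanted in ("sse4_1", "sse4_2"):
    print(f"{wanted}: {'present' if wanted in flags else 'MISSING'}")
```

On a K8-era chip like the Quad FX's FX-74, both should come back MISSING, since SSE4 only arrived in later cores.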
The thing with platforms like these is that they don't have a solid upgrade path, which brings things to today with the Threadripper platforms. Essentially every generation of Threadripper. Sure, the first generation got an upgrade from a 16-core to a 32-core max, but they were the same Zen/Zen+ architecture. Then there was the sTRX4 socket for the Threadripper 3000 series, which never saw a newer chip released for it: ditching the previous dead-end socket for another dead-end socket. Then there was Threadripper Pro, which added yet another socket to Threadripper history, but hooray, it did provide a generational upgrade, with some gotchas. The Threadripper Pro 3000s were mostly OEM-only, with off-the-shelf parts for DIY builders only appearing shortly before OEMs had access to the Threadripper Pro 5000s, making the Threadripper Pro 3000 kind of rare in DIY builds: an upgrade path did exist, for the dozen or so people who had one. While the consumer and server segments of AMD's lineup got some 3D V-Cache parts, Threadripper was again neglected. The saving grace for Threadripper Pro 3000 and 5000 is that many of the DIY motherboards can also work with Epyc chips. One downside of Threadripper Pro being mostly an Epyc chip is that they can be locked to an OEM platform, complicating the used market.
9:55 I think the delay there says less about the performance of the platform and more about how modern webpages are overloaded with scripts, tracking and other shizz...
I can't believe it took this much from AMD to match my i7-620M laptop CPU. 8 cores can't compete with 2-core/4-thread goodness. I guess the reason is that I'm not really multitasking, I'm running Linux Mint, and I have 1.33GHz to spare per core rather than 550MHz per core on their end. I can handle about 1440p30 or 1080p60 before my CPU chugs. Is it weird that 1080p60 is sometimes harder to run than 1440p30?
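Not weird at all, and quick arithmetic shows why. A minimal sketch comparing raw pixel throughput of the two modes (this ignores codec and bitrate, which also affect decode cost):

```python
# Pixels-per-second for the two playback modes mentioned above.
# Frame rate often costs more than resolution.
modes = {"1080p60": (1920, 1080, 60), "1440p30": (2560, 1440, 30)}

for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} Mpixels/s")

# 1080p60: 124.4 Mpixels/s
# 1440p30: 110.6 Mpixels/s  ->  1080p60 really is the heavier load
```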
As the viewer who lent you that board: it was an amazing video, but there are a couple of remarks I'd like to make about the setup:
1. Windows 10 works on the board. I was able to install it and use it with no problems, without changing anything in the BIOS.
2. The cards have dual GPU cores and an SLI chip on the board, so each behaves like any other GPU: the two cores are joined on the card itself and work on non-SLI setups. You only need SLI if you want to use 2 of those dual GTX 760 cards.
3. The green flickering happens because of the output GPU, as you said; just plug into another output and it will work fine. Same with the SLI performance and stability: it is not easy to make it work, but it does. I actually included some information about that with the setup.
But who is gonna take time to read your instructions, am I right? /s
@@MuitoDaora NOT Linus, of course
Brilliant, man... thanks for keeping, tinkering with, and sending this board in. Such interesting content.
Thanks for sharing this board with us!
I am glad you had one to loan them. I buried all my FX stuff in the back yard.
One of my old coworkers bought this platform (FX-72) in 2007 while I was piecing together my Q6600 system. I think he ended up spending almost double what I did, and with AMD being quite competitive in those times, we both had very high expectations and were looking forward to comparing performance. I ended up having good luck with silicon lottery and my Q6600 could run 3.9GHz stable, and I had it paired with some fast Micron D9 DDR2. I cannot overstate just how much faster my Q6600 build was versus his FX-72 build, I actually felt bad for him, because it was a slaughter.
My father was an electrical engineer for AMD at that time and he agrees 110% with you, and says it wasn't the product he's most proud of producing... yeah, he helped design and make that and thinks it's garbage...
Man, as an AMD fanboy I'd love to pick his brain about so much stuff.
my auntie is Lisa Su and said you are saying bullsh*t to farm likes
I'm guessing he was actually an electronics engineer? Electrical is the wires in your house scale stuff...
@@TheBloodypete and magnetism or three phase induction
@@TheBloodypete Wrong. An electrical engineer is exactly the type of person who would work on processors and design them. You are thinking of electricians.
16:05 The reason the already-installed Nvidia drivers for the GTX 760 didn't work with the RTX 3060 is because, at least for Win7, driver support is divided between the GTX 10 series (and older) and the RTX 20 series (and newer).
So it would be correct that the 3060 needs different drivers.
Thanks for explaining this! I had first thought Nvidia just let their drivers for Windows 7 take a shit and got super mad.
@@arnox4554 They actually did drop active support a while ago; the latest version is 474 for Win7, and they will only provide security updates going forward, no more regular driver updates.
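If you're unsure which branch you're on, nvidia-smi can report the installed driver version. A minimal sketch (assumes nvidia-smi is on PATH; the 474.xx cutoff for Win7 is taken from the comment above, not from NVIDIA documentation):

```python
# Print the installed NVIDIA driver version and whether it falls on the
# 474.xx-or-older branch (per the comment above, the last regular Win7 branch).
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    text=True,
)
version = out.strip().splitlines()[0]  # first GPU is enough here

major = int(version.split(".")[0])
branch = "Win7-era branch (474 or older)" if major <= 474 else "newer branch"
print(f"Driver {version}: {branch}")
```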
They also do similar stuff to this for Quadro and GTX. I tried plugging a 1080 into an M5000 laptop and the whole install got borked because of driver conflicts and neither card worked.
Similar story with a Pro W5700 and a Vega M GL laptop, the Pro drivers installed on top of the normal ones, and so the Vega didn't work until I DDU'd.
@@aijena Oh, that's fine. I meant outright shipping broken drivers for Windows 7.
@@arnox4554 Ah got it, yeah that's not what is happening there :P
Tho some versions of drivers may still run worse or less stable with some specific games, just like it is with all drivers on all platforms.
For those who are confused: this is not the FX line of CPUs that launched in 2011 on the Bulldozer architecture. Those were completely different and shared a floating-point unit between every two cores, which later led to lawsuits for false advertising, because they really weren't true eight-core chips, and their performance reflected that.
Yup, people often forget this one. In plain words, the issue was that you had essentially pairs of cores. Each pair shared some resources, and when you were utilizing the second core in a pair, single-core performance of that pair went down significantly. And it needs to be said that even the full single-core performance wasn't great compared to Intel, and at the time there were still a lot of games which actually only ran on a single core.
I also feel like people often overstate the "budgetness" of these CPUs. They were cheapER than Intel's but also generally had more expensive motherboards if you wanted feature parity. The i5 destroyed the 8-Core on single-core performance, while the i3, although only dual core, had hyperthreading, so it was often a better option than the quad cores from AMD which, thanks to their design, were acceptable dual cores or weak quad cores (depending on the load).
Yep this wasn't the 2010s FX
But Bulldozer was a true 8-core, just 8 weak cores. Look at the single-core and multi-core scores.
Also: not to be confused with AMD FX, and not to be confused with Intel Core 2 Quad.
Wasn't Intel's first attempt at a quad-core CPU also similar, basically two dual cores bolted together, until Nehalem brought native quads?
It would be interesting to see a comparison video showing both AMD's and Intel's failures over the years and how they were able to improve over time. Heck, go all the way back to the 90's when the AMD K6 was introduced as a starting point. Now that's nostalgia.
I would not call Super Socket 7 and the AMD K6/II/III lineup a failure, as it was cheaper for consumers and the IPC was just as good as, if not better than, Intel's. I had a K6-II 550MHz machine I built with my money from my first after-school job, and it played every game of the era I threw at it on Win 98SE and DOS.
@@CommodoreFan64 I think you misunderstood what I meant. Back in the K6 days, AMD was an up and comer that really pushed Intel hard. However, over the years since then, both companies started making mistakes and had to recover from them. That's the kind of video I'm looking for. Go back to the golden era where they were both squaring off, show their mistakes, and bring everything back to the present where they're now back in contention with each other.
@@CommodoreFan64 I had ALL those boards, DAMN near bulletproof! Those were fun boards to play with, the old DOS days too! DLL HELL!! The old times 😎
@@honorablejay Not an up-and-comer; AMD has been around since the 286 clone, I think the early '80s.
@@xtreamerpt7775 Correct. Even before the 286.
Speaking of that, if you look closely at an AMD 286, it's copyrighted by Intel at the bottom of the chip. Most of them came out in 1982. They also reverse-engineered the Intel 8080, without a licence, from a simple picture of the core die in 1975, and codenamed it the Am9080. And they were around even before that time (they were founded in 1969, actually). After that, Intel gave AMD a licence to produce CPUs for them as a second source, which lasted until 1984-1985-ish, when AMD started making their own CPUs, like the Am386DX and DXL-40 models, for example.
But ALL Am286 CPUs were produced under licence and copyrighted by Intel.
The power consumption of Quad FX is brutal
We threw out about 7000 FX chips when they came out. That's how trash they were.
Imagine having Quad FX with a Radeon R9 295X2... CrossFire... back in the day?
@@kimnice I reacted to this bit myself - why did they just keep mentioning nVidia cards on an AMD system? The obvious thing - I would have thought - would be to test it with Crossfire.
Using a couple of 4870x2's would've made more sense.
This entire video was a waste on what is a piece of rare computing history.
@@Q-nt-Tf You threw away seven-thousand CPUs?
@@I_enjoy_some_things Yeah that don't sound sus 😂
I super appreciate those seizure warnings. Luckily I never got one before, but I do get some pretty bad headaches from rapid flashing lights etc. Thanks, editing team!
Agreed! More accessible videos are always better.
@@collinblack9605 that's probably the skype call sound you're thinking of
NVlink replaced SLI and is only in servers
I remember "upgrading" from a Core 2 Duo E6600 system to an FX 8350 system, and if it weren't for the actual upgrade from an 8800 GTS 320MB to a GTX 760, I honestly felt like my PC barely got any better, certainly not to an extent that I was expecting going from 2 cores to 8 cores. I recall CS:GO struggling hard, meanwhile watching older Intel i5s that ran it better than my system with the same GPU... Then I switched to an i7-4790k with the same GPU and RAM (I think? pretty sure they were both DDR3) and man, CS:GO FPS tripled and other games easily saw a 25 - 50% boost. That CPU was SO BAD. I gave it to my step dad, he had a "better" use-case for it, with video rendering/editing, rather than gaming, but he still complained of system instability and it feeling slow, even when we got him an SSD to install Windows and programs onto.
It is hard to find game benchmarks comparing an e6600 to an fx 8350, but one I found did Valorant on a 750 Ti for both... the 8350 got 35 FPS @ 1080p.. the e6600 for six years prior with 6 fewer cores? It got 23 FPS. Obviously both are brutal, the 750 Ti isn't even all that good in comparison to the 760 I wound up with, but still.. I expected so much more after waiting so long to upgrade, and its performance in Windows and programs was just soooo bad and sluggish, especially compared with the 4790k. Night and day differences.
I am back to AMD now though. Went from a 4790k to a R5 3600X, then I went to a 5600X and now a 5800X3D all on the same X570 motherboard. That i7 was great though. I had it paired with a 760, a 980, a 1080, and a 1080 Ti. Now running 3070 Ti with my 5800X3D, never been happier with my PC, I run 1080p high refresh rate, and it feels like this PC could last me another 3 - 4 years at least honestly. Not worried about the 8GB VRAM, it hasn't stopped me from running any games at 1080p with RT yet, and DLSS means no worries honestly as I've only had to use that in a couple games at Quality setting so far, so going to Balanced/Performance is still an option down the line if needed, or turn off RT in some titles, or turn down a couple settings I don't value much. The thing with my current set up that I like so much though is how flippin stable it is. I *never* blue screen, games never crash, everything just works perfectly. Haven't had this level of stability from a system probably ever. It has been 1.5 years since I reformatted/reinstalled windows even and it is still just smoooooth.
I did the same, had a FX-8120, 8350 and then was gifted a 9590. Got sick of trying to keep the 9590 cool and got a 4790k that lasted me until 2021. Still is a decent gaming cpu even by today's standards but I moved on to a 10900 and then 11900k that I got for a great deal. (11900k + good z590 mobo for barely over $300 NEW)
I went from an (iirc) fx4100 which was dogshit, not even actually 4core, and couldn't run jack shit to an fx 8350 that was still dogshit. It overheated constantly even after thermal paste replacement and a box fan blowing at it. Absolutely awful CPUs and put me off AMD CPUs for life
@@nater420 You tried two cpus from the same generation and had the same bad experience and now are put off from amd? Come on now. Ever since ryzen amd has been very good.
8GB of VRAM is probably still plenty for 1080p for several more years. Even at 2K res I don't have to turn textures down too often on my 3070.
@@Jtwizzle It's already not for quite a few titles unless you are ok with dropping texture quality down a bit (which is one of the biggest visual impacts by far).
You could run those on non-SLI boards. That was the main gimmick of the dual-GPU cards... at least as far as I know? My newest one from nVIDIA is the GTX 690 so maybe it changed later.
2006 was a time when it was better to have a fast dual core for gaming than a quad core. The problem at that time was that many games could only use a single thread. A frequently used gaming CPU at this time was a core2duo E6700, which was much faster for gaming than, for example, a core2quad Q6600. The AMD fanbase at the time was using the Athlon X2 6000+, but this CPU was a bit slower than the E6700.
But there is a world beyond gaming.
I've still got my C2D E8500 with its 9800 GTX+ and 8GB of RAM; it's surprisingly fast even today, you can still work on it. It was the second-fastest gaming CPU of its time; only the E8600 was faster, and you had to pay 30% more for 2x160MHz extra. 3.16GHz and 6MB of cache.
@@A-BYTE94 That does not change the fact that in 2006 there were only a few applications with multicore CPU support.
Ah, the Q6600, I had mine for years. Overclocked like a beast! I remember selling it for £25 after uni... went to a 4770K, which again lasted me forever, but CPUs had stagnated for a long time around then. Just got a 5800X3D and I'm hoping to have struck another long-laster!
@@joshyc2006 Well, I pray that your long-lasting CPU journey continues; the 5800X3D is also a gaming beast of this era.
As someone who owned one of these: the biggest issue with the benchmarks of the day is that they were done on Windows XP, since Windows Vista hadn't yet been released when this came out. I ran Vista and 7 on the board and actually got much better numbers than reviewers did. This setup wasn't really consumer-ready, but it was neat.
The biggest problem with these setups was... not benchmarks. It was the insane inefficiency of 2+ CPU AMD setups.
Quad Opteron servers didn't even have the performance of 2 CPUs. On Windows Server, it was a total mess. On Linux, it was still awful.
My first home built desktop was with the FX6300... While it wasn't great at productivity stuff, it held up to High/Ultra 1080p gaming until about 2015/2016. That little chip means something to me. Surely not because it was great, but it was definitely serviceable and was my first time seating a CPU.
I had that, too! And two HD 6970s in Crossfire lol
Same here, FX 6300 Black Edition. Started out with a GT 610 and later got a GTX 1050. It wasn’t able to run GTA 5 though, would always crash on that first mission when you play as Franklin with Lamar
I shot myself in the foot by getting a FX 4170. It worked fairly ok with the games I played but took forever to encode my videos. I also didn't need a heater in the winter lol
Those Bulldozer and Piledriver CPUs are a completely different lineup than the ones in the video though, by about five years. I still have the FX-8350 and I believe Steve from GamersNexus still daily drives his FX-8370
Same! I loved that CPU! It was 2013 and I was running it with a GT 630 and then an R9 270X; that system put me through a lot! If it weren't for Covid, I would never have had to upgrade that poor computer. It was really keeping up with anything I wanted to play and it aged well... it's now in the family PC, mainly for printing stuff :/, but man, it still works great! I would love to see an FX-6300 video, something with overclocking it to 5GHz and playing modern games.
11:25 Perhaps "because it is there" is not sufficient reason for -climbing a mountain- buying that product.
Ah yes, the Quad-Father. I would have killed for this back in the mid 2k's. It was pretty much a hamfisted attempt at catching up to intel's core2 quads.
AMD developed the platform within the same time as Core 2 Quad's development, and they released 30 days apart from one another. AMD didn't have much chance to tweak anything to stay ahead, and K10 was still a few months away. They banked on the "FASN8" program where K10 would be a drop-in upgrade and allow for 8 cores, and once 65nm yields for K10 proved to be a nightmare and the whole architecture had to be cut down to fit (axing most of the L3 cache) they cancelled that plan.
Anyone remember the ASUS MARS GTX 295? 2x GTX 285 spec chips with 4 GB VRAM versus 2 x GTX 280 spec chips with 1792 MB. That thing was a beast back then! A very expensive one I could never afford though.
As I recall, it may have been called Mars, but it ran as hot as Venus!
The 760 was the card I bought because I ditched SLI, finally recognizing that the fps counter was meaningless if it played like crap.
Never thought someone would be making a dual-GPU card in that era... let alone buying one, let alone midrange cards :).
I went with a 2x HD 3870 (dual GPUs in one card) at the start, then a single HD 4xxx series, an HD 5850, dual HD 5850s, and dual GTX 560 Tis. I finally learned my lesson by then to just get a single card, no matter what.
I think you should have gone with the 7950GX2. I always dreamed of that card back in the day.
Yeah that or 9800GX2 would have made sense for the era. A single GTX 760 is overkill let alone quad GTX 760’s 🫠😂
@@rare6499 100% the reason they didn't is because both the 9800GX2 and 7950GX2 are "extremely" unreliable and hard to find working nowadays.
The Radeon VII wasn't a paper launch. It's still a beast for crypto mining, comparable to a more efficient 3080. They were mostly bought out before reaching consumers. I've seen my fair share of those fragile beasts.
I'm amazed a 4-year-old GPU is still so competitive at mining.
I'd love to see a video where the challenge is to get 60fps in a handful of modern games using the oldest hardware possible, including OC and used parts! Good reference to see what still holds up
The 1080ti would be the undefeated goat
Back in the late 90's, I had an Abit BP6 dual Socket 370 mainboard with two Celeron 466 CPUs overclocked to 525 MHz. The only problem was that at the time, the only SMP OS available to me was Windows NT4 Workstation, which barely supported any games. But it was a very cool mobo at the time 🙂
When my current PC was first built it came with an FX 8350. Compared to the previous PC's Phenom II 1055t, only BeamNG and Space Engineers saw any improvement at all and some applications even got worse outright with stability issues. I didn't hesitate to upgrade it to an R5 2600x the moment affordable Ryzen MBs started being sold in my area. The improvement was night and day. My GPU wasn't getting bottlenecked anymore to the point I was even able to upgrade my monitor setup. (There was a whole other can of worms after that, but it was more due to ASUS related shenanigans than a fault of the platform itself.)
I remember that era well. I was an AMD fanboy up to that point. Went from Athlons to an Opteron 170 as my first dual-core CPU. But yeah, did AMD ever drop off a cliff fast. When it was time for my next upgrade, I really had no choice but to switch to team blue and the stupidly good Q6600; that thing was an overclocking beast!
I had a dual Opteron 270 system for my gaming rig around that time, and it did rip through copying data on SATA drives with its two nForce4 chipsets. For some reason it made a difference against DDR2 dual cores at the time, until quad cores from AMD came out.
That double-cut shot under the table about his masochism was great. Keep improving your skit-making, LTT crew, it shows in the product!
I have a still-functional system using the non-ECC version of the board with the FX-74s (its PCIe slots are blue vs red), and I actually liked it overall. Yes, it ran hot, but I had the Thermaltake vortex coolers to handle that (which look pretty killer too, I might add), and I later used liquid cooling specifically for the GPUs (Nvidia Quadro 4000s, later upgraded to K4000s). It can actually play Doom 2016 at 720p on moderate-ish settings (deffs not 60fps, but playable). You can install Windows 10, and even Windows 11 if you remove the TPM requirement. Windows 11 is pretty awful performance-wise, though. I think the biggest killer was that the RAM slots and onboard devices were known for going bad, or being bad from the factory.
I remember on cold winter nights I would literally drape a blanket over me and it (at the foot of my chair) and just use the machine to keep me warm. lol.
I had an Athlon X2 Black Edition in an AM2/AM2+ ASUS board. Later upgraded to a Phenom X4 955 Black Edition. Used that for many years with a GTS 250. My first GPU was an 8500 GT.
You should do a video building with all the oldest computer components you have in stock and compare it to a new system.
I have an MSI K8N Master2-FAR with 2 Opteron 280s, and it is similar to the Quad FX platform. Really good on paper (SLI, 12GB of DDR400 ECC, 2x dual-core CPUs, 2x gigabit LAN, an E-ATX form factor...) but released in 2005... a year too late for it to be interesting.
That screen tearing edit was glorious! That editor needs a raise :D
The edits seem old; I had to check when it was released. Like, the music in the background is so loud.
Based on the performance charts... I wonder how they even marketed this Quad FX then, since it was still not good. Did any applications use more threads back then? Some type of engineering program, for example?
Small office file and service server market, not for home. These systems in tower cases would be sitting in closets in offices.
Next up: one of those old (circa 2006) Tyan dual-core Opteron motherboards with 4 CPU sockets plus the Tyan M4881 daughterboard that adds an additional 4 sockets for a total of 8 CPU sockets and 16 CPU cores in one machine
I think it would perform a lot better with the 3GHz dual cores (4 cores effectively), definitely dual-channel RAM, and a 1060 at most.
Hey Linus, I still use my ASUS KCMA-D8, which uses a dual Opteron 4130 setup with DDR3 RAM, as a home server. A setup from 2009. I'm so glad you are enlightening the world about the world's best platform.
You're forgetting about the Bulldozers. I took the risk of buying the "FIRST EVER 8-CORE DESKTOP CPU!" instead of that lame quad-core 2600K from Intel, and boy did I pay for that. The FX-8120 I got overclocked all the way to 4.9GHz on air but was still 30-40% slower than the Phenom II X4 955BE I upgraded from.
I remember there being some really cool builds back in the early 2000s that used multi-GPU and multi-CPU setups. I would really love a video about what people were doing back then with those.
I remember the Pentium 4 days. One of my friends had the 3.0 GHz one and it was a pain to cool. I only had the 2.4 GHz.
13:58 props to the editor who put in the effect, subtle but nice.
Linus needs to try using a single-core Celeron lol... Then you'll feel the power of wasted silicon.
It could be 100x more painful for him than his recent "cheapest laptop ever" video
@@sihamhamda47 I never knew a browser window lagging out was possible on modern CPUs till I experienced the abomination they try to sell.
9:57 This video was a month+ in the works... damn... [this was recorded when the "Custom 'YTX' gaming build" was first uploaded]
9:52 bottom, right image 💀
I had an old Crossfire setup with a pair of 290X 4GB cards on water that I used until 2018. Once I took it down, one of the GPUs simply would not come back to life on its own; it would be recognized if I had it in a rig with the other one, but it was totally DOA, no picture, nothing, when it was in a rig by itself. When I had it as the primary GPU it would do that same thing with the flickering textures and other nasty problems. Clearly it had some bad VRAM, but I also think the output engine was broken, because one of the outputs didn't work either. I had no idea anything was broken until after I took it apart lol
My old PC had an AMD FX-8350 Eight-Core Processor from 2012, and it honestly wasn’t all that bad. Yes, it ran very hot and really raised the temperature of my room, but it was snappy, great for gaming, and handled anything I threw at it. It still performs well today.
If you have a GTX 1060 or lower then yeah, it's fine.
@@C0mmanderX The old PC had a GTX 960
That could be the perfect winter setup for those who live in a very cold climate.
I used to have an FX-6300 and the computer sat near my leg. Could be shirtless with the AC on and still be sweating.
same, to this day i am still running an i7 3rd gen, works fine
Still have an FX-8350 around somewhere; it was great fun to overclock (5.1GHz all-core on air, iirc), but while tinkering was fun, getting things to run without frame-time issues was near impossible, at least on my chip.
It was so damn hot tho...
My first computer was a Dell OptiPlex 780 SFF with a Core 2 Duo (unsure on the model), then I upgraded it to a Core 2 Quad Q6600, and damn it went for what it was. Great little computer, and I got it for $30 AUD way back lol
I've got one of these lying around; it comes with an E7500 stock.
@@lands1459 Interesting, the guy I bought it off must've put in a different CPU at some point as an upgrade, because after a double-check it had an E8400 duo @ 3GHz.
My 15-year-old AMD PC, the HP dc5850, was great. It had an AM2+ socket with a Phenom X3 and later a Phenom II X4 B97 (4x 3.2GHz), a max of 8GB of DDR2 (800MHz), and 2nd-gen PCIe with a 128GB SSD on SATA-2. I used it till April 2019, when I replaced it with the 2nd-slowest Ryzen ever (Ryzen 3 2200G); still a huge improvement, 2x the speed of the Phenom II X4. That 2008 HP dc5850 is still in use with a friend.
I love the AM2+ and AM4 motherboards, they both supported at least 2 generations of CPUs.
Please look into CPU latency! From both team red and team blue.
Don't think there's too much difference, as the Intel quad cores at that time were simply two dual core chips on one package with no direct connection. Much like this dual CPU setup from AMD.
Anyone else noticed a weird pink line flicker at 16:42 right above the FX-74 frequency?
Even though it sucks, it sure is awesome. This is one of those things that you build today just out of pure fun, because you can. Multi GPU and CPU setups were absolutely a blast to play with back in the day. It usually wasn't practical, but it was a blast to tweak.
I've never heard of Quad FX before. Apparently they could do well for productivity, but that motherboard is a weird one. Why does it look like there are 2 different types of RAM clips installed (6:48)? This looks so unusual, especially if there's just 1 type of RAM supported.
A question for you: is this motherboard able to give the CPUs a nice overclock? How far could you go? The clock speed of the newer chips was much lower, probably to stay within the given power limits.
The RTX 3060 has drivers for Win7, but you will need to uninstall the older drivers first with something like DDU.
The editing in this video should be the gold standard for LTT; they even impressed me with the quality of the seizure warning. It's appreciated when they warn people about loud sounds or strobing lights.
@ 4:59: I presume the article was written around the time the product was reviewed, so ~2006? How come there's an ad for a Core i9-11900? The 11900 wasn't released until about 14 years later.
I remember the Mars 760s. I honestly almost bought one. Kinda wish I had now, because I collect dual-GPU cards and finding them is incredibly difficult. I would have hated myself at the time, but I'd have one for my collection now.
I didn't recognize Gary till you mentioned ASUS; I met him at HardOCP's gaming convention thingy (met Kyle Bennett too). Both awesome guys!
Also, I still have an AMD FX-8350 + GTX 980 as a spare gaming rig, plus a C2Q 9650 that runs a GTX 960 (before that it was a 560 Ti)
I remember upgrading from an Athlon XP to an Athlon 64 FX-74 and it was such an incredible upgrade. I also remember going into Fry's and the sales rep trying to sell me on Quad FX (or more like trying to sell my dad on buying me Quad FX). Glad that didn't happen lol
How did you upgrade to an FX-74 if you didn't buy the Quad FX system? The FX-74 only works on Quad FX, it was made specifically for that board and was only sold in pairs (2 CPUs per box).
@@FixedFunction lmao you made me go dig out that old PC (in the garage), and to my surprise it's an FX-62; guess I got it mixed up (I was like 12 at the time /shrug)
I have a retrocomputer with one of those! I purchased it in 2017 for a celebratory bonus build in parallel to my TR 1950X build. I have a number of retrocomputers with... intentionally weird and questionable parts, including the infamous nvidia leafblower (FX 5800 ultra) and a 3dfx Voodoo 5500. The Voodoo 6000 and XGI Volari Duo both evade me sadly.
...I also have some systems with modern weird and questionable parts too, like a working system with Via/Centaur's last x86 cpu (the CHA/CNS chip) before Intel acquihired Centaur.
I'm honestly surprised and impressed AMD managed to crawl its way back as a competitor
For SLI to work properly you'd need an equal-bandwidth PCIe bridge in addition to the motherboard one (a 16x bridge for a 16x PCIe connector).
I love dual-socket motherboards, they just look cooler
Probably hotter, actually
New Chinese dual-Xeon ATX motherboards are 100 or so, and the 4-18 core CPUs pulled from big server farms are very cheap. X79 uses quad-channel DDR3; X99 uses dual-channel DDR4.
You can buy 19" rack dual-CPU server slabs for 150 upwards from A******n; they generally have two PCIe slots at the rear, plus a power connector for a GPU lead. Fit a USB3 card in the second slot. The biggest caveat is they have lots of small fans and are jet-engine noisy at start-up.
10:47 (Seizure warning)
Most of my computers since 2010 have been 2-4 way SLI machines. In my experience, that exact flickering is caused by a bad connection at the SLI bridge, due to oxidation or whatever. Usually just wiggling the SLI bridge around fixes the issue.
FX is a really cursed name for tech...
9:20 WTF, Linus has such a good voice. We need him to sing karaoke on @channelsuperfun!
Time for a best CPU EVER video
Amazing to see Opterons on the channel. I remember I was this close to buying a G34 socket server motherboard to get some of those cheap CPUs... but you can imagine why I didn't go for it. It could've been incredible for server things, but Ryzen getting more affordable and more reasonable (even Threadripper and Epyc) basically negates all purpose, unless I go "just for fun".
I love these legacy hardware reviews!
I did play with that green flickering for some time. I had a Radeon 240, I think. To run games like GTA 5 at 30 fps, I overclocked it to the point that it shouldn't have worked. To fix the flickering, I had to tinker with BIOS settings. I have no clue what I did 7 years ago, but I broke it a year later.
The Quad FX looks like something out of a *fever nightmare*
Your profile picture looks like something out of a fever nightmare
Now I'm thinking, could you slap together four of those Mars GPUs to get octuple SLI?
No.
@@MR_FIAT I guessed as much, but it was worth asking. Care to elaborate on the technicalities?
Like for the warning!
Retrospectives of components from that era always make me want to look at old archived Maximum PC magazines.
That's still powerful compared to my 10th gen i5 😀 Back then, AMD decided to put more, but not so powerful, cores into processors. I guess they were like, "ohhh mate, if we add more cores it will be more powerful"
how
Swing and a miss, my guy; those weaker-single-core but better-multi-core CPUs were from when 8th and 9th gen were a thing
@@doublevendetta Ohh, I don't really know the Quad FX thing, I hadn't even heard of it, but I didn't think it was that new; I thought the multi-core thing was from 2011. So sorry if I was wrong
Man, those were the days when overclocking WAS a necessity out of the box, where you'd get an instant 25% increase... it would be interesting to try, but I don't know how safe it is given the hardware's age now.
Will you be auctioning this off at some point?! 😂
The Radeon VII was not vaporware; it just got completely grabbed by miners. I assembled rigs and personally mounted hundreds of these. At the end of ETH's life you could push 100 MH/s EACH at less than 200W power draw. It was not until the 4090 arrived that it was beaten, with 110 MH/s but at OVER 300W, so in the end the Radeon VII remained king of GPU mining.
11:51 Gary looks like Linus’s disappointed father right here 😂
9:18 I thought Linus was going to break into song. I was slightly disappointed.
As for the actual topic of the video: I used to run SLI (dual GPU)... I can't imagine throwing in two more GPUs and a CPU. It's bound to fail.
I don’t think anyone would invest that much in old parts. That’s why LTT did it for us.
I would suspect the NUMA boundary on the RAM between the CPUs would also cause a ton of issues/slowdowns.
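For anyone curious how big that NUMA penalty can be, it's easy to demonstrate on any dual-socket Linux box with libnuma: pin a thread to one node, then time the same memory walk over a buffer allocated locally versus on the other node. A minimal sketch, assuming Linux with libnuma installed and at least two NUMA nodes (the 256 MiB buffer size is arbitrary; build with: gcc -O2 numa_walk.c -lnuma):

/* numa_walk.c - rough sketch, assumes Linux + libnuma and >= 2 NUMA nodes */
#include <numa.h>
#include <stdio.h>
#include <string.h>
#include <time.h>

#define BUF_SIZE (256UL * 1024 * 1024)  /* arbitrary 256 MiB working set */

/* Walk the buffer one cache line at a time and return elapsed seconds. */
static double walk(char *buf) {
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < BUF_SIZE; i += 64)
        buf[i]++;
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "no NUMA support on this system\n");
        return 1;
    }
    numa_run_on_node(0);  /* pin this thread to the CPUs of node 0 */

    /* One buffer on the local node, one across the socket boundary. */
    char *local  = numa_alloc_onnode(BUF_SIZE, 0);
    char *remote = numa_alloc_onnode(BUF_SIZE, 1);
    if (!local || !remote) { fprintf(stderr, "allocation failed\n"); return 1; }
    memset(local, 1, BUF_SIZE);   /* fault the pages in before timing */
    memset(remote, 1, BUF_SIZE);

    printf("local  walk: %.3f s\n", walk(local));
    printf("remote walk: %.3f s\n", walk(remote));

    numa_free(local, BUF_SIZE);
    numa_free(remote, BUF_SIZE);
    return 0;
}

The gap between the two timings is exactly the cross-socket tax; whether the OS scheduler keeps threads next to their memory is the kind of thing early dual-socket desktop platforms like this one struggled with.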
People spent nearly 3 grand on 390s; you can't believe people bought these things?
I have to wonder about the GPU chosen for the upgrade attempted here. I think a more appropriate one would be a 1050 or an RX 580. Both would beat the crap out of that 760 and have better support under W7. Consider giving it a whirl, even if just in the lab.
10:25 But you're using an old version of Chrome on Windows 7, so how much of that is the computer and how much is Chrome not liking Windows 7?
Pls don't say FX, pls don't say FX, pls don't say FX, pls don't say FX. PLS!
1:20 later. Ohhhh crap. Sorry honey, no one respects you. Only me!
My first CPU ran for at least 10 years: the AMD FX-8150!
I just had this running in the background while assembling a bike. I wasn't paying attention, but loved the throwing of Gary under the bus over a past CPU choice!
You could always try a quad-Opteron setup based on the Piledriver platform using socket G34. You can get Opteron 6380s relatively cheap compared to the next step up, the 6386. 🤔
15:09 Hahaha, oh shut up, you never believed that 😂😂 Loved the way Linus contaminated everyone ❤😂😂
I had a pair of GTX 260s in SLI ages ago. In games that supported it well, it would render on both GPUs and scale well enough, but it turns out the bottom GPU didn't output video on any of its ports, and I have no idea if it had always been like that or, if not, how long it had been like that lol
11:43 heck yeah, love a good Linus Drop Tips montage.
How many LTT vids have I not noticed are dubbed or subbed in other languages? Is it something they do retroactively on older vids? I haven't seen it on new releases.
Only n00bs watch dubbed.
I like RAW English.
I wonder if it's viable to find a cheap broken graphics card and use it for SLI
Without the context of another platform, such as the Core 2 Duo, I can't tell if it is actually slower. You told me it's slower, but I wasn't shown that it is. I think a better video would run the original Quad FX, then compare it with this upgrade path and its competitor. Then again, that would mean buying a bunch of e-waste, so I'm not sure what I would have done.
YouTube playback is something I've started to notice problems with on my 4th gen laptop i7. The fans need to kick in if I have the webpage visible, but embeds or minimizing the page are fine.
The page itself is somehow the problem
11:15 The way Gary said, "yes it is!"
My word, he should make audiobooks
Shout out to the optical audio users out there at the time! (I was only one due to a Best Buy warranty gift where I got "upgraded" after buying a floor unit for $100 to a surround sound $1100 unit, but still. It was amazing.)
Getting Event ID 86 with random shutdowns. Tried reinstalling Win 11, clean installing Win 11, and even Win 10, but to no avail. Is anyone else having the same problem? We need a fix, because many people with Ryzen CPUs are facing it.
I wonder if the 3060 didn't run due to missing SSE4/4.2 instructions, maybe? (A quick way to check is sketched a bit further down.)
As you say, though, it would be massively overpowered: running Ubuntu (with Wine/DXVK/VKD3D for gaming) on an Ivy Bridge (which has about 1.5-2x the performance of Quad FX per Google), GravityMark can max out the card, but most games (that are not limited to 60 FPS or the monitor refresh rate) will top out at about 40% GPU utilization. The framerates are nothing to complain about, but the CPU core that's driving the card will peg out well before the GPU breaks a sweat.
I did get CP2077 up to 80% utilization by turning some of the graphics settings that were set to low up to medium, though. Nice to be able to turn up settings with zero FPS drop, since it looked rather bad on low, to be honest.
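On the SSE4 question above: that's easy to rule in or out on a given machine, since GCC and Clang expose x86 feature detection as builtins. A minimal sketch, assuming an x86 target with GCC or Clang (the K8-era FX-74 supports SSE3 but nothing newer, so the SSE4 lines should print "no" on this system):

/* cpu_features.c - assumes GCC/Clang on x86; build with: gcc cpu_features.c */
#include <stdio.h>

int main(void) {
    __builtin_cpu_init();  /* populate the compiler's CPU feature model */
    printf("sse3   : %s\n", __builtin_cpu_supports("sse3")   ? "yes" : "no");
    printf("ssse3  : %s\n", __builtin_cpu_supports("ssse3")  ? "yes" : "no");
    printf("sse4.1 : %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "no");
    printf("sse4.2 : %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
    return 0;
}

Whether NVIDIA's modern driver stack actually requires SSE4.x isn't confirmed anywhere in the video, so missing instructions remain a plausible but unverified explanation.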
The thing with platforms like these is that they don't have a solid upgrade path, which brings us to today with the Threadripper platforms: essentially every generation of Threadripper. Sure, the first generation got an upgrade from a 16-core to a 32-core max, but those were the same Zen 1/Zen+ architecture. Then there was the sTRX4 socket for the Threadripper 3000 series, which never saw a new chip released for it, ditching the previous dead-end socket for another dead-end socket.
Then there was Threadripper Pro, which added yet another socket to the Threadripper history, but hooray, it did provide a generational upgrade, with some gotchas. The Threadripper Pro 3000s were mostly OEM-only, with off-the-shelf parts for DIY builders only appearing shortly before OEMs had access to the Threadripper Pro 5000s, making Threadripper Pro 3000 kind of rare in the DIY space: an upgrade path did exist, for the dozen or so people who had one. While the consumer and server segments of AMD's lineup got some 3D V-Cache parts, Threadripper was again neglected. The saving grace for Threadripper Pro 3000 and 5000 is that many DIY motherboards can also work with Epyc chips. One downside of Threadripper Pro being mostly an Epyc chip is that it can be locked to an OEM platform, complicating the used market.
9:55 I think the delay there says less about the performance of the platform and more about how modern webpages are overloaded with scripts, tracking, and other shizz...
I can't believe it took this much from AMD to match my i7-620M laptop CPU. 8 cores can't compete with 2-core/4-thread goodness. I guess the reason is that I'm not really multitasking, I'm running Linux Mint, and I have 1.33GHz to spare per core rather than 550MHz per core on their end. I can handle about 1440p30 or 1080p60 before my CPU chugs. Is it weird that 1080p60 is sometimes harder to run than 1440p30?
If there is one thing that being a long-time AMD user has taught me, it's that if it has FX in its name, avoid it like the plague
I'm old, so I really love seeing you guys look at older "high end" stuff I could only dream about back when I was a kid 😂