I thoroughly enjoyed those systems back then: SLI/Crossfire, loads of RAM. The games I played were more GPU-bound, and while there were faster options back then for my enterprise workloads, they weren't cheap. Funny that the Phenom-based 12-cores were faster than the Piledriver-based 3rd gen parts, so I put them back in my home build. I had 4-socket setups as well, and plenty of Xeon-based systems for work, but at home I only used these. There's a Supermicro board with a modded BIOS you can use to overclock those chips. I've been an AMD fanboy for the most part since Socket 7 (K5), but ever since I built a dual-CPU Slot 1 (Pentium III) desktop, I never set up a single-CPU system until Threadripper. Typing this from an EPYC-based desktop.
Another old Xeon mention. I have a ThinkStation S20 with a Xeon X5675 and an R9 290X. The Xeons were very cheap, same with the GPU, and it was definitely a great workhorse. It's not my main system, but whenever I use it, it's a pretty nice machine. It's running Windows 7, but no doubt it will take Windows 10 just fine.
I ended up with 3 separate Opteron systems at one point. They weren't bad for the cost, but modifying a case to fit one of the quad-socket mobos was insane. That one's dead now. The single-socket one got deployed as a NAS, and the dual one has 6 GPUs in it running Ollama. It's slow to load the models, but once everything is in VRAM it's very usable.
My old FX-8350 system, which I only retired in 2024, ran great for its last 10 years (nearly 92,000 hours of continuous use). It was also overclocked by 800 MHz, running at a constant 4.8 GHz @ 50°C for 10 years.
I have 2 Xeon systems that are pretty decent. The CPUs cost me 20 bucks, and I installed them on regular ASUS mainboards after modding the BIOS: an Intel Xeon X3470 on an ASUS P7P55D with 32GB DDR3-1600 and a GTX 1080 Ti, and an Intel Xeon X4570 on an ASUS P5Q Pro with 16GB DDR2-1600 and a GTX 1060.
I have been on team AMD for ages. I was intentionally rocking 6- and 8-core processors when 2 and 4 cores were standard, and those CPUs aged like fine wine as Windows scheduling got better. Single-core hasn't been my primary concern since 2006. My latest build is an AMD 5800X with an RX 7600; the cost/value ratio is unbeatable.
What a trip down memory lane. I remember being a fairly junior technician in the early 2010s and having to work on Dell chassis that had these exact CPUs. They were complete garbo: CPUs were a frequent failure point on any of the servers in for repair, and there were constant issues with memory training. All told, they were heat-buckets and a sorrowful CPU to deal with. 10/10 do not recommend lol.
Oh man... I bought two Opteron 6386 SEs and a Supermicro H8DGU-F because it was the only affordable motherboard with 6 PCIe slots I could cram into a giant case and shove into our storage room. Loaded that bad boy up with six GTX 1080 Tis back in 2017. You're damn right I was crypto mining. It was absolutely not worth the headache and the heat that thing spilled out; I'm surprised it didn't burn down or blow up the two PSUs it required. But honestly, the Opterons cranked out some CPU-optimized coins quite well for how old they were by the time I'd bought them. Yes, I was definitely not paying for heat or power at the time!
The issue with the Opteron server machines was the inertia of folks being used to Intel in multi-core servers, and many companies taking a while to even find out Opterons existed.
I had an Opteron at one time, and the worst thing about this CPU is single-core perf. In CPU-Z it scores around 150, where a Core 2 Quad scores 200. The CPU is also built from two slices of silicon, so it's actually two CPUs welded together with a HyperTransport link, and the latency between cores is terrible. You can turn on "core interleaving" and "memory interleaving" in the BIOS so all CPU nodes act like one node; it changes how the PC behaves a little bit.
I somehow managed to end up with a dual-socket Opteron board, one of the older ones from before the package got its large footprint. Ran several Minecraft servers on that system; had to play with core affinity to get the performance required. Sold it. The buyer let me know it died about 6 months later.
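Playing with core affinity like this can be scripted instead of done by hand. Here is a minimal, Linux-only sketch; the node-0 core IDs are an assumption for illustration (check `numactl --hardware` or `lscpu` for the real layout on your board):

```python
import os

# Hypothetical core list for NUMA node 0; on a real dual-Opteron box,
# verify which CPUs belong to which node before pinning anything.
NODE0_CPUS = {0, 1, 2, 3}

def pin_to(cpus):
    """Pin the current process to the given CPU set (Linux only)."""
    # Intersect with what's actually available so the sketch also works
    # inside containers or VMs that expose fewer CPUs; fall back to the
    # full available set if the intersection is empty.
    available = os.sched_getaffinity(0)
    target = set(cpus) & available or available
    os.sched_setaffinity(0, target)
    return os.sched_getaffinity(0)

print(pin_to(NODE0_CPUS))
```

A server process (a Minecraft JVM, say) launched from a pinned parent inherits the affinity mask, which keeps its memory traffic on one node.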
I have a machine that I paid for with my own cash in 2012. It has 4x 16-core 6272 CPUs in a TYAN B8812F48W8HR motherboard. It drew a ton of power and was super loud. I still have it, but haven't powered it on in a while.
Good video, thanks for being so honest. I looked up the specs of this chip on PassMark and the single-core score was pretty low, only 1334. That's kind of low for gaming these days. I can see now why these aren't preferred over, or equal to, some of the Xeons out there. Take care.
The 6600XT is a weird choice, not because of its performance but because it has a PCIe 4.0 x8 interface and you are plugging it into a PCIe 2.0 board. It will be bandwidth-starved; PCIe 2.0 x8 is, well, slow. Any 3.0 card with a full x16 interface would have been a better choice: Turing and before on the Nvidia side, Vega and before on the AMD side. The Radeon VII "Lisa Su" 16GB edition would also have been an awesome choice: it's PCIe 3.0 and has 16GB of HBM2. Best bang for the buck? RX 480/580 or Vega 56. But if you wanted to run 4K to relieve the bottleneck, a 4090 would have been fine as well.
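To put numbers on that bandwidth gap, here's a back-of-envelope sketch (a hypothetical helper; the line rates and encoding overheads are the standard published PCIe figures):

```python
def pcie_gbps(gen, lanes):
    """Approximate usable one-way bandwidth of a PCIe link in GB/s."""
    # (line rate in GT/s, encoding efficiency) per generation:
    # Gen2 uses 8b/10b encoding, Gen3/Gen4 use 128b/130b.
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
    rate, eff = specs[gen]
    return rate * eff * lanes / 8  # bits -> bytes

print(pcie_gbps(2, 8))   # 6600XT (x8 link) on this PCIe 2.0 board: 4.0 GB/s
print(pcie_gbps(2, 16))  # a full x16 card on the same board: 8.0 GB/s
print(pcie_gbps(4, 8))   # what the 6600XT expects natively: ~15.75 GB/s
```

So an x16 Gen3 card on this Gen2 board gets double the bandwidth the 6600XT's x8 link does, which is the whole point of the comment above.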
4 NUMA nodes because it is literally 4 CPUs: 2 per package. You need to make sure you spread the DIMMs across all the RAM channels or some cores will be without local RAM.
4 NUMA nodes; they didn't have an I/O die then, that didn't show up until Zen 2. Even Zen 1 had this pain: if you had a memory-sensitive app, you'd better hope it didn't need to leave a CCX, because otherwise over the Infinity Fabric it went. I have an EPYC 7551P and VM scheduling is a pain. Zen 2 was truly amazing.
What exactly did you expect from something with single-core performance at best comparable to an i5-750? :P I also don't think games appreciate non-uniform memory access. For those who don't know what that means in this case: each CPU has its own RAM allocated to it but also allows the other CPU to access it, and accessing the other CPU's memory takes longer than using your own.
Really old stuff has a really bad single-thread bottleneck. Upgrading from an i7 920 to a 3900X simply tripled my per-core performance for code compilation. And yes, single-threaded performance is important for coding: if I change a single source file and then test the changes, the build runs a single copy of a single-threaded compiler and reuses the results of previous compilations of everything else. And that single file took plenty of time, because it used header files that were slow to process.
These processors were not great for gaming; they are, however, pretty good for server workloads, even if you're using them as extra horsepower on a server rendering CAD workloads. But for ease of use, and dollar for dollar, you might be better off with a classic Xeon in this instance. I had a similar setup in my machine shop that we used for CAD processing and it worked pretty darn well, but a few years later Ryzen launched, and many of these old server cabinets became obsolete.
Even a "first gen" (Ivy Bridge) LGA2011 build with something like two 10c 20t E5-2690 V2 for 15 bucks each would beat the crap out of this faildozer-based abomination for that purpose. Or just... get an AM4 board and a 3900X/3950X/5900X/5950X if you need a bunch of cores for cheap without the 300+W power draw of an old dual CPU rig. Depending on the board you get ECC memory support too
1:50 You got the problem right there. The Opterons were just as shit as AMD's desktop chips at the time: they're losing to Xeons with LESS than HALF the core count and three quarters of the thread count.
Edit: And those Xeons in the benchmark were a generation old. Intel released Xeons with 10 cores / 20 threads on the LGA 1567 socket the same year AMD released these Opterons. I have an old Dell server with 4 of the top-spec Xeon E7-4870s. It's quite a cool piece of computing history, mind-boggling for a time when 4-core CPUs were becoming the norm. And with the practically identical E7-88xx series, you could have up to 8 CPUs working together for 80 cores, 160 threads, and up to 4TB of memory. Quite insane for 2011.
Edit 2: 8:42 Those CPUs have 2 NUMA nodes each.
I'm pretty sure that for correct performance you need 8 RAM sticks for 2 of these chips, not six. Mind you, it's still going to be slow. Just not quite as slow.
Might even need the full 16: each CPU is 2 octa-cores bolted together, and each octa-core module has its own IMC, so 8 sticks for dual channel, 16 for quad.
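For what it's worth, the one-DIMM-per-channel minimum can be counted out like this (a sketch assuming the usual G34 layout of two dies per package and two memory channels per die; the board manual is the authority on actual slot population):

```python
def min_dimms(sockets, dies_per_socket, channels_per_die):
    """One DIMM per memory channel: the minimum for full bandwidth."""
    return sockets * dies_per_socket * channels_per_die

# Dual G34 Opteron: quad channel per package, split 2+2 across the dies.
print(min_dimms(sockets=2, dies_per_socket=2, channels_per_die=2))  # -> 8
```

That matches the "8 sticks, not six" figure above; going beyond one DIMM per channel adds capacity, not peak bandwidth.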
You say that the Opteron CPUs don't support AVX2, but the LGA 2011 v1 and v2 CPUs don't support AVX2 either; it's only from the LGA 2011 v3 CPUs that we get AVX2 support. I won't deny that LGA 2011 CPUs are much better though, because I'm running E5-2697 v2 CPUs in my server. I would have preferred that you do better research.
I worked in a datacenter when these came out; just hearing "Opteron" causes me pain. We had so many customers running these that had all kinds of problems. They were decent for running VMs on, but for the price/performance now, not really worth it. I see other people have explained why you see 4 NUMA nodes, so yeah. Of course AMD turned that all around with EPYC (and Ryzen), but that's a different story.
FYI, you might want to check whether that core count is real. I'm too lazy to verify, but it might be shared components between pairs of cores, just like the Bulldozer garbage.
Athlons and Opterons were phenomenal chips when they came out. Intel was a huge mess with their NetBurst architecture, and AMD ran circles around them technologically. It was only illegal shenanigans from Intel, who refused to sell chips to vendors that sold AMD chips, that stopped the freight train dead in its tracks; it has run again with the Ryzen processors. Athlon/Opteron and Ryzen were both revolutionary designs from chip god Jim Keller. While Ryzen has absolutely dominated Intel, massive power draw and loads of low-power cores have kept Intel vaguely competitive. The same can't be said of the NetBurst era: at that time, Athlons and Opterons were double the speed of Intel's equivalent, and for the first time CPUs were absolutely rock solid in reliability. They also brought in multicore and 64-bit processing. If it weren't for illegal tactics, Intel wouldn't have been able to sit on their arses for 15 years spending nothing on R&D, and would likely have been dead and buried; AMD wouldn't have had to sell off their fab plants to survive. It would have been a different world.
You totally buried the lede. The Libreboot thing that you skipped over like it was nothing is the main attraction of using these old Opteron systems. The KGPE-D16 is the most powerful X86 system made that can run a totally open source BIOS, and most crucially _the most powerful X86 system that doesn't come with a spy processor built in._
5000 PassMark for 115W of power draw? I mean, sure, if you're stuck in 2010, but these days just buy a crappy $100 mini PC with an N100, which sips 6W and has video hardware acceleration, no noise, and uses as much electricity as an LED light bulb. Cons: won't heat your room up in winter. Old hardware is no longer viable; it's not the upfront costs that make it so, but the running costs, performance, and noise, which is why they're trying to give this stuff away for free. If you want good value, efficiency, and performance, just get a Ryzen processor, unrivalled. If you want to do something really cheap, then buy a broken Ryzen laptop and build a desktop/server out of it.
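The running-cost argument is easy to sanity-check. A quick sketch, using the wattage figures from the comment above; the electricity price is an assumption, so adjust for your own tariff:

```python
def annual_cost(watts, price_per_kwh=0.30):
    """Cost of running a device 24/7 for a year, in currency units."""
    return watts / 1000 * 24 * 365 * price_per_kwh

old_server, n100_mini = annual_cost(115), annual_cost(6)
print(f"115W server: ${old_server:.2f}/yr, 6W N100: ${n100_mini:.2f}/yr, "
      f"difference: ${old_server - n100_mini:.2f}/yr")
```

At these assumed numbers, the difference alone covers a $100 mini PC well within a year of 24/7 operation.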
These old processors are pretty inefficient. Low cost, yes, but you'll need a huge power supply, and GFLOPS per watt are no comparison to modern processors.
Don't miss out on designing your dream set up with FlexiSpot.🌟 Use my exclusive code '24AUG30' for extra $30 off on E7, E7 Pro, E7L. and you can get discount on their ergonomic chair with the desk purchase
FlexiSpot E7L standing desk:
bit.ly/3yo2sun -US
bit.ly/3LQvQwq -Canada
(also join the discord. Find the link. I dare you. Bet you can't.)
AMD should cancel Zen and its ++++ and return back to Faildozer.
Have you ever done a bp6 dual celly overclock build? 😎
I used to run a dual Opteron back then lmao, it was like a $15,000 computer.
@@98f5 ssshhh people like to think computers are more expensive now 😎😁😁😁
@@annebokma4637 sheeshes lol, tho I happen to know a company building these 8x H100 servers with two 96-core Zen 5 EPYCs. Those things cost more than my house.
This immediately made me want to make a video on an Opteron system. After 10 minutes on eBay, I regret it.
You should do it! In fact...I know someone who happens to be getting rid of an Opteron motherboard and CPUs! What a crazy coincidence
you should try xeons from that time
@@aChairLeg 😂 Is that someone you, by chance?
Guess who pinged everyone in the discord server?? this guy did!
This is true news for once
now ping everyone again but this time for no particular reason
@@amadii8768 done
For the new video or begging people to buy the Opterons off him?
@@aChairLegProper evil 😊
Old Xeons were mentioned. I am running a dual X99 rig with two E5-2699 v3 CPUs from 2014, each 18 cores / 36 threads, for a combined 36 cores / 72 threads. Each CPU was only $36. It's got 2x 4TB M.2 drives, 128GB RAM, and an RTX 3060 12GB GPU, running Win11 Pro Lite. It honestly handles absolutely everything I want or need without a moment of hesitation or delay. Even games run at very playable FPS.
The problem with older server CPUs is that their single-thread performance is horrendous, and it bottlenecks your GPU and your 1% low FPS.
This is a great system for multitasking (encoding, rendering, tons of Chrome tabs, containers), but I can't recommend it for anything requiring fast single-thread performance.
@@alpuhagame It's not my daily computer; I have a 14700K for that. I built it solely for the sake of building a dual-CPU box. I'd been wanting to do so since 1999, when I started a dual Intel Socket 370 build but couldn't justify the cost of the RAMBUS DRAM compared to my paychecks at the time (lol). When I saw AliExpress had $36 Xeons with 18 cores and dual-CPU X99 mobos for $89, I couldn't pass it up. Even with gaming I see 100 to 300 FPS @ 1080p, so it's more than capable, though I could use the box to heat my bedroom in winter. With that in mind, it was a fun project to try, and I'm just happy knowing I built a dual-CPU rig after all those years. The box sits powered off most of the time.
I have a dual 16-core X99; even with a hacked BIOS to unlock the multiplier, I'm only between a 3950X and a 5950X on the CPU-Z benchmark. And you can snap up a 14700 non-K for well under $300 that blows it out of the water; pair that with the $30-80 Z690 boards I keep finding, and the X99 is only good for the LRDIMMs approaching $0.50/GB in a 32GB stick. I'll put it this way: at 240 watts I can do the Blender BMW render in ~24 seconds using the Thermalright Phantom Spirit, and for single-threaded I get more than twice the processing power of E5 v3 out of the 8 P-cores.
And yes, for gaming just get a 12400 for $100 or a 12600K for $150 or so; realistically, a brand new 12100F is still better value than these Xeons/Opterons 😂. But I just built two systems, each for under $900. One has a 3080, one a 6800 XT; both have a 14700, an 850W PSU, a Z690, ARGB fans, etc.
Get a cheap modern CPU and you'll see how far we've come since those old server CPUs. You think everything is great now, but drop in a new i3 or Ryzen 5 and you'll be like, wow, I didn't know there was so much performance left on the table. Granted, you can't beat $20 for a CPU. But it's also cheap because it's not worth anything.
A few issues here: low-clocked RAM, coupled with the fact that you're not running all the RAM channels. Also, the NUMA nodes split into 4x 8 cores. There's also a setting on the board that will let them turbo as high as possible on a few cores while dropping the unused cores to lower clocks :) I used to work on these, and once properly set up with DDR3 faster than 800 or 1066 MHz, they perform better.
Crazy that a current-gen 4-core Ryzen 3/Pentium beats the 16-core in single and multithread... times have changed, hey!
Their single-core performance is really bad, and they're not exactly 16 cores either! They're the old shared-resources design that killed Bulldozer.
Nah, it's more like a current-gen 4-core vs a 2011 8-core/16-thread, since it's faking having 16 cores per CPU.
My understanding of the 4 NUMA nodes is that each CPU is basically 2 CPUs on a single package. It's pretty jank, so that's how she goes.
You are not wrong at all. IIRC that is why the chips were so darn big for the time.
I was about to say: given the era these came out, the size of the package, and the amount of power they drew, I would not be surprised if AMD just strapped two smaller Opterons together and called it a bigger Opteron.
When I was selling several hundred CPUs and RAM modules a week, the Opterons stacked up and I couldn't even give them away... waste of time to even list them.
Yep, looks like they sat quite a bit. Is that what made you move on from selling RAM/CPUs?
@@aChairLeg Naw, I had a supplier in a recycling center that was buying them from his employer for cheap and offloading them to me in bulk. He mentioned one day that he didn't have as many as usual because he didn't want to "get in trouble", so I had to sever the relationship. Real shame...
I recall these being really good for compute density for VMs that didn't need to be super fast. When I worked for a large banking company, we ran a few full racks of Supermicro 2U 4-socket boxes with 6376s and as much RAM as we could squeeze into them. These were used primarily for XP virtual machines running some very old software the company didn't want to spend a few hundred million to get remade (some of that being hardware costs tied to replacing a mainframe, if memory serves). And honestly, for that, they did really well. I don't work there anymore, but last I heard they did wind up remaking that software to run on modern OSes and have since consolidated it onto a pair of 2U 4-node EPYC boxes haha
I use them for VMware ESXi.
I've seen them outperform Intel at the time for media encoding... by a long shot as well. Talking 800 kbps vs 200-300 kbps of encoding throughput, which was a lot back then.
@@Denbot.Gaming yeah they were doing pretty decent for that way back when
I did 3D modeling on a "workstation" motherboard with 4 CPUs and 128 GB RAM. I think the bus speeds are what really held it back from being faster than my new laptop 10 years later.
Old Opterons are a special layer of Hell.
They are beautiful in their own hot power hungry way
@@aChairLeg As an owner of not one, but TWO 140W Opteron 2360 SE on a Anus KFN32-D SLI which hints at another set of hotplates that is the nvidia nForce Pro 3600 chipset plus the I/O companion chip, I wholeheartedly agree. Oh and did I mention I paired it with not one but TWO Geforce 8800 GTX in SLI that add ONEHUNDREDANDFIFTY watts ... EACH?
PS: please send help 🥲
@@Knaeckebrotsaege 580 W power consumption and the CPU takes more than the GPU? You mad, buddy.
@@aChairLeg They are not beautiful in any way shape or form. I had 8 Servers having dual opterons running Debian 4 back in the day. Calling these systems unstable is an understatement.
I assume it shows 4 NUMA nodes because each chip actually has 2 separate dies on it (so 2+2).
And indeed, after the lawsuit, CPUs of this architecture can pretty much all be considered to have only half the advertised cores; each "core" simply takes so many resources from the other that you can't really call them separate.
I continue to regret every time I do a dual CPU build.
Why? Genuinely curious
cuz it is never good for gaming @@Antassium
I have a nice dual-CPU build, and I know people happy with 4 CPUs!
I built Opteron servers about 9 years ago and they were amazing. Most of them were 32 cores x 2 CPUs and used them for SANs and ESX hosts.
As a gaming machine, yeah, it's not gonna do much, but for a home server/docker setup, it'd likely get the job done relatively well depending on what programs or containers you were running.
I have a dual Opteron 6380 system with 256GB of DDR3. It still has more memory bandwidth than my 13th Gen i9 which is actually impressive for its age.
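That bandwidth claim is plausible on paper. Here's a peak-bandwidth sketch; the channel counts and DDR speeds are assumptions matching a dual G34 system populated across all channels and a typical dual-channel DDR5 desktop:

```python
def peak_mem_bw(mt_per_s, channels, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s for a DDR memory setup
    (transfer rate x 64-bit bus width x channel count)."""
    return mt_per_s * bus_bytes * channels / 1000

print(peak_mem_bw(1600, channels=8))  # dual Opteron 6380, 8x DDR3-1600: 102.4
print(peak_mem_bw(5600, channels=2))  # desktop i9, dual-channel DDR5-5600: 89.6
```

So eight channels of old DDR3 really can out-spec two channels of DDR5 on aggregate, even though the DDR5 system wins hugely on latency and per-channel speed.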
I do have a dual socket Opteron 6380 server that has been my main hypervisor for the *longest* time. I paid $10 for each of the CPUs, and I got the rest for less than $150.
To be fair, it does everything I need; but man, it SUCKS power like it's going out of time.
EDIT: You said you got the same for $10/piece and I feel validated lol
EDIT2: You also said the bits about power; validated again lol
It just depends on the power management on the board and the voltage regulators. I have a Supermicro single-CPU Opteron system, and the whole server uses 80 watts. But my dual boards will never let the power usage go that low; the lowest is always above 220 watts. Now I just use the HE or P versions of the CPUs, and I keep core boost off, since it overvolts the CPUs.
The big thing at the time for Opteron was the ungodly amount of memory bandwidth available in comparison to other CPUs. It made for a really solid system where latency, as measured from network input through CPU, disk, CPU, and back out to network, was not just lower but more consistent, even under high connection counts. This was right around the c10k-challenge era of the tech boom, IIRC.
Oh... I should probably also mention that (with caveat that I didn't work directly with opteron, but had friends who did), the openly available documentation for the opteron at the time was rather complete. It is one of the reasons the opteron was just a workhorse that you didn't hear much about... shit just worked, even on Linux, bsd, etc. day in day out.
Messing with Opteron was always an interesting idea to me, but after seeing these results (and the price), I'm glad I just bought and upgraded an R720 for like 250 bucks lol
OzTalksHW did this, so did a few other Low Spec tech channels. The consensus was that their single core was way too low for gaming, although they could game a bit, and even the sheer volume of cores was slower than modern 8 core CPUs. It was a fun experiment to watch. And it's also a fun experiment to try out. That said, those prices reflect availability.
Where's your Ebay page?
I might take them.
I'm not actually selling them right now haha. I may eventually though
@@aChairLeg Ah, that's a shame. Thanks for letting me know.
@@aChairLeg But put a link to your eBay page anyway. Are you selling anything else?
Totally something I'm not interested in, but what an intriguing 12 minutes! Thanks for these videos.
I'm glad multi cpu workstations are mostly a thing of the past now.
My workstation PC used to be dual CPU but since 2021 i only have a single CPU setup for engineering.
I daily drove a dual Xeon setup for a long time in an old Dell Precision, I want to say a T5400; can't remember the exact series. It outperformed my year-old, $1000 processor and motherboard by more than double. It got through a week-plus Handbrake workload in less than a day, and it only cost me $200 shipped. When my processor died a year later I engineered some cables for the video card, and it's been a workhorse ever since.
I think the reason it annihilated the workload was that it was a twin-processor setup of 4 single-threaded cores each, with 1MB of cache per core. At the time most processors didn't even have that much cache total on the die; the thing was a monster. It's still running right next to me with a Debian install, and when I had to ship out my new PC it still played most stuff great with Proton. I'm going to set it up as a render computer at some point and get it back to work.
I want to add that I'm not shitting on AMD's line; it's just that people shouldn't look down on the old workstation stuff. If you know what you are doing, it can really work well for anything.
Nice coverage.
Old hardware uses a ton of power
LMAO... 'Please buy these from me.' 😂
Yes good man 💪🏼
"Not, no. Not you" lmao
Thanks pal
Started putting together my Dell 7820 with similar specs to your video a few months ago. Hopefully it goes better than this and I can learn a bit.
I thoroughly enjoyed those systems back then. SLI/Crossfire, loads of RAM. The games I played were more GPU-bound, and while there was faster hardware back then for my enterprise workloads, it wasn't cheap. Funny that the Phenom-based 12-cores were faster than the 3rd gen based on Piledriver, so I put them back in my home build. I had 4-socket setups as well and plenty of Xeon-based systems for work, but at home I only used these. There's a Supermicro board with a modded BIOS you can use to overclock those chips.
Been an AMD fanboy for the most part since Socket 7 (K5), but ever since I built a dual-CPU Slot 1 (Pentium III) desktop, I never set up a single-CPU system again until Threadripper. Typing this from an EPYC-based desktop.
Speaking of Opteron, I had a Socket 939 one at one time. It was a single-core, single-thread chip.
Another old Xeon mention. I have a ThinkStation S20 with a Xeon X5675 and an R9 290X. The Xeon was very cheap, same with the GPU, and it's definitely been a great workhorse. It's not my main system, but whenever I use it, it's pretty nice. It's running Windows 7, but no doubt it would take Windows 10 just fine.
I ended up with 3 separate Opteron systems at one point.
They weren't bad for the cost. But modifying a case to fit one of the quad-socket mobos was insane. That one's dead now.
The single-socket one got deployed as a NAS
And the dual one has 6 GPUs in it running Ollama. It's slow to load the models, but once everything is in VRAM it's very usable.
My old FX-8350 system, which I only retired in 2024, ran great for the last 10 years (nearly 92,000 hours of continuous use)
Also it was overclocked by 800MHz; it ran at a constant 4.8GHz @ 50°C for 10 years.
How did you manage to get 11.5 years of use out of it in only 10 years? Do the fans spin so fast they dilate time or something?
@@talibong9518 Huh? I said 82,000 hours? :S Sorry!
Or I should say, I *meant* to say 82,000 hours... stupid numpad keyboard extension.
I was excited to see the numerous SATA ports, because I have been looking for a way to connect a ton of spinning hard drives for storage
I have 2 Xeon systems that are pretty decent. The CPUs cost me 20 bucks each, and I installed them on regular ASUS mainboards after modding the BIOS.
Intel Xeon X3470 on ASUS P7P55D /w 32GB DDR3-1600, GTX1080Ti
Intel Xeon X5470 on ASUS P5Q Pro /w 16GB DDR2-1600, GTX1060
I have been Team AMD for ages. I was intentionally rocking 6- and 8-core processors when 2 and 4 cores were standard. Those CPUs aged like fine wine as Windows scheduling got better. Single-core hasn't been my primary concern since 2006. My latest build is an AMD 5800X with an RX 7600. The cost/value ratio is unbeatable.
I remember that DreamWorks used opterons during one of their Shrek movie productions
What a trip down memory lane. I remember being a fairly junior technician in the early 2010s and having to work on Dell chassis that had these exact CPUs. They were complete garbo. CPU failures were a frequent issue on any of the servers in for repair, along with frequent memory-training problems. All said, they were heat-buckets and a sorrowful CPU to deal with. 10/10 do not recommend lol.
AMD K6 and K7 chips ROCKED compared to Intel's... cost-effectiveness, performance, and that OC!!! COMPLETE BANG FOR YOUR BUCK
Oh man.. I bought two Opteron 6386 SEs and a Supermicro H8DGU-F because it was the only affordable motherboard with 6 PCIe slots. I crammed it into a giant case and shoved it into our storage room, then loaded that bad boy up with six GTX 1080 Tis back in 2017. You're damn right I was crypto mining. It was absolutely not worth the headache and the heat that thing spilled out; I'm surprised it didn't burn down or blow up the two PSUs it required. But honestly, the Opterons cranked out some CPU-optimized coins quite well for how old they were by the time I bought them. Yes, I was definitely not paying for heat or power at the time!
Those aren't passive coolers. They're supposed to get airflow from the server chassis fans. Do not run them without fans...
The issue with the Opteron server machines was the "inertia" of folks being used to Intel in multi-core servers, and many companies taking a while to even find out Opterons existed.
I had an Opteron at one time, and the worst thing about this CPU is single-core perf. I mean, in CPU-Z it scores 150 when a Core 2 Quad scores 200. Also, the CPU is built from 2 slices of silicon, so it's actually 2 CPUs welded together with a HyperTransport link, and the latency between cores sucks. You can turn on "node interleaving" and "memory interleaving" in the BIOS so all the CPU nodes act like one node; it changes how the PC behaves a little bit.
I somehow managed to end up with a dual-socket Opteron board. The older ones had a large package footprint.
Ran several Minecraft servers on that system. Had to play with core affinity to get the performance required.
Sold it. Buyer let me know it died about 6 months later.
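(For anyone curious what the affinity trick looks like: on Linux, Python's stdlib exposes the scheduler calls directly, no third-party tools needed. A minimal sketch; the core selection here is a placeholder, not the layout I actually used, and on a NUMA box you'd pick core IDs that share a node.)

```python
import os

# Pin this process (pid 0 = ourselves) to the first two CPUs the
# scheduler currently allows. On a dual-socket Opteron you'd choose
# core IDs that all sit on one NUMA node to keep memory local.
cores = sorted(os.sched_getaffinity(0))[:2]
os.sched_setaffinity(0, set(cores))

print(sorted(os.sched_getaffinity(0)) == cores)  # True
```

The same idea works from the shell with `taskset` for a JVM you don't control from inside.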
I have a machine that I paid for with my own cash in 2012. It has 4x 16-core 6272 CPUs in a TYAN B8812F48W8HR motherboard. Power consumption was super high and it was super loud. I still have it, but I haven't powered it on in a while.
I got a kinda broken ASUS server board with 4 Opterons on it from my internship
one hard drive slot was broken
I'd be curious to see how it does with compiling C++ code.
Well, that was an interesting way to lose lots of money 🤑
Amd FX 8320e/Asus 970 Pro Gaming Aura still running as my test bench.
10:50 well to be fair there are not a lot of CPUs that support avx12
Good video, thanks for being so honest. I looked up the specs of this chip on PassMark and the single-core score was pretty low, only 1334. That's kind of low for gaming these days. I can see now why these aren't preferred over some of the Xeons out there. Take care.
I main an X79 setup for my workstation with a 10-core, and with a tiny little boost overclock it does 1440p editing very well
1:17
Lots more Intels were made in that timeframe. So it's all about supply vs demand.
The 6600XT is a weird choice, not because of its performance class but because it has a PCIe 4.0 x8 interface and you are plugging it into a PCIe 2.0 board. It will be bandwidth-starved; PCIe 2.0 x8 is, well, slow. Any 3.0 card with a full x16 interface would have been a better choice. That would be Turing and before on the Nvidia side, and Vega and before on the AMD side. Although the Radeon VII Lisa Su 16GB edition would also have been an awesome choice: it's PCIe 3.0 and it has 16GB of HBM2. Best bang for the buck? RX 480/580 and/or Vega 56. But if you wanted to run 4K to relieve the bottleneck, a 4090 would have been fine as well.
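(Rough numbers behind that comment, using the usual per-lane throughput figures after line-encoding overhead; treat them as back-of-envelope approximations, not measurements of this board.)

```python
# Approximate usable throughput per lane in GB/s, after 8b/10b (gen 2)
# and 128b/130b (gen 3/4) line encoding. Rules of thumb only.
GBS_PER_LANE = {2.0: 0.5, 3.0: 0.985, 4.0: 1.969}

def link_bandwidth(gen, lanes):
    """Rough one-direction PCIe link bandwidth in GB/s."""
    return GBS_PER_LANE[gen] * lanes

print(link_bandwidth(2.0, 8))   # 4.0 GB/s: the x8 6600 XT on this 2.0 board
print(link_bandwidth(2.0, 16))  # 8.0 GB/s: what a full x16 card would get
print(link_bandwidth(4.0, 8))   # ~15.8 GB/s: what the 6600 XT was designed for
```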
4 NUMA nodes because it is literally 4 CPUs, 2 per package. You need to make sure you spread the DIMMs across all RAM channels, or some cores will be left without local RAM.
"pseudo owner of my discord" he says, as if the person in question wasn't given ownership as an April Fools prank
I know right!?
and here i heard Sudo owner ....
Each of these is two bulldozer dies on a single substrate. Since there are four on dual socket board - there are four nodes.
Had one of these a long time ago. Sexy, but single-thread was slow, and I only used it to run databases, for fun.
this video made my coffee taste better. it usually tastes like brownies
Glad to hear that I think
@@cocoabuttervaseline that sounds like some good coffee. What's your drip?
4 NUMA nodes; they didn't have an I/O die then, that didn't show up till Zen 2. Even Zen 1 had this pain: if you had a memory-sensitive app, you'd better hope it didn't need to leave a CCX, because otherwise over the Infinity Fabric it went. I have an EPYC 7551P and VM scheduling is a pain. Zen 2 was truly amazing.
It's all about IPC. Yes, old servers have a lot of cores, but compare their IPC: modern processors do more at the same frequency, and of course clock much higher too.
What exactly did you expect from something with single-core performance at best comparable to an i5 750? :P
I also don't think games appreciate non-uniform memory access.
For those who don't know what that means in this case: each CPU has its own RAM allocated to it but also lets the other CPU access it, and accessing the other CPU's memory takes longer than using your own.
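(If you want to see this on your own Linux box, the kernel exposes the topology under sysfs; a toy sketch, no third-party tools needed. A typical desktop reports just one node, while a dual-socket board like this one, with two dies per package, would report four.)

```python
import os

NODE_DIR = "/sys/devices/system/node"  # Linux sysfs NUMA topology

# Every NUMA node the kernel knows about shows up as a nodeN directory.
nodes = sorted(
    d for d in os.listdir(NODE_DIR)
    if d.startswith("node") and d[4:].isdigit()
)
print(nodes)  # a desktop usually shows ['node0']; this board would show 4 nodes
```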
Let me be optimistic! I really thought this would do something for multi-threaded workloads, but that didn't really end up being the case
Really OLD stuff has a really bad single-thread bottleneck. Those were the times. Upgrading from an i7 920 to a 3900X multiplied my per-core performance by a factor of 3 for code compilation.
And yes, single-threaded performance is important for coding. If I change a single source file and then test the changes, the build runs a single copy of a single-threaded compiler and reuses the results of previous compilations of everything else. And that single file took plenty of time, because it pulled in header files that took time to process.
I'm still rocking an i7-5960X on an ASUS Rampage V Extreme X99 for SolidWorks.
Shoulda bought from plusboards on eBay: 1-year warranty on all their boards!
I work in a data centre supporting a wide range of systems, and I can say one thing: the AMD systems crushed Intel for stability and speed.
most unique way to heat your home though!
We use a good amount of old Opteron servers as NVR machines. They're worse in every way than you could ever imagine.
These processors were not great for gaming; they are, however, pretty good for server workloads, even as extra horsepower for a server rendering CAD workloads. But for ease of use, and dollar for dollar, you might be better off with a classic Xeon here. I had a similar setup in my machine shop that we used for CAD processing and it worked pretty darn well, but after a few years Ryzen launched, and many of these old server cabinets became obsolete.
If you want to use Windows 10 with 2 CPUs, you need the Workstation edition or Windows Server, because normal Windows doesn't support it
Windows 10 Pro will support two sockets.
It would've been nice to see this system running a server OS, or maybe TrueNAS
I wonder if this would make a great virtualization server. I would love a setup like this just to run VMs in my home-lab environment
It would probably be better to go with X79, or X99, or EPYC, or Xeon Scalable, or AM4, or...
Even a "first gen" (Ivy Bridge) LGA2011 build with something like two 10c 20t E5-2690 V2 for 15 bucks each would beat the crap out of this faildozer-based abomination for that purpose. Or just... get an AM4 board and a 3900X/3950X/5900X/5950X if you need a bunch of cores for cheap without the 300+W power draw of an old dual CPU rig. Depending on the board you get ECC memory support too
Unfortunately, the bottleneck in this build is going to be the speed of those SATA ports 😊
1:50 You got the problem right there. The Opterons were just as shit as their desktop chips at the time. The Opterons are losing to Xeons with LESS than HALF the core count and three quarters of the thread count.
Edit: And those Xeons in the benchmark were a generation old. Intel released Xeons with 10 cores, 20 threads on the lga 1567 socket the same year AMD released those Opterons.
I have an old Dell server with 4 of the top-spec Xeon E7-4870s. It is quite a cool piece of computing history; very mind-boggling for a time when 4-core CPUs were becoming the norm. And with the practically identical E7-88xx series, you were allowed up to 8 CPUs working together: 80 cores, 160 threads, and up to 4TB of memory. Quite insane for 2011.
Edit 2: 8:42 Those CPUs have 2 NUMA nodes each.
I daily X58 lol. I'm happy.
I'm pretty sure that for correct performance you need 8 RAM sticks for 2 of these chips, not six. Mind you, it's still going to be slow; just not quite as slow.
Might even need the full 16: each CPU is 2 octa-cores bolted together, and each octa-core module has its own IMC, so 8 sticks for dual channel, 16 for quad
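(Back-of-envelope for why the stick count matters, assuming DDR3-1600, which is typical for these boards; these are theoretical peaks, not what an Opteron actually sustains.)

```python
def channel_bw_gbs(mts, bus_bytes=8):
    """Peak bandwidth of one 64-bit DDR channel in GB/s: MT/s x 8 bytes."""
    return mts * bus_bytes / 1000

one = channel_bw_gbs(1600)  # DDR3-1600: one populated channel
print(one)                  # 12.8 GB/s
print(one * 4)              # 51.2 GB/s with all four channels on a socket fed
```

Leave a channel empty and you lose a quarter of that peak, plus the die behind it ends up reaching over HyperTransport for its memory.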
so.. even running two equals the power draw of one 13900K?
At least they still work after 10 years...
I like your positivity! After three dead motherboards they do still work!
@@aChairLeg I was actually hinting towards intel 😅 since their 13 and 14th gen CPUs are... well
Let's say... deteriorating
sunk cost fallacy: the video
You say that Opteron CPUs do not support AVX2, but the LGA 2011 v1 and v2 CPUs don't support AVX2 either; it's only from LGA 2011-v3 CPUs onward that we get AVX2 support. I won't deny that LGA 2011 CPUs are much better, though, because I am running E5-2697 v2 CPUs in my server. Would have preferred that you did better research.
I worked in a datacenter when these came out; just hearing "Opteron" causes me pain. We had so many customers running these that had all kinds of problems. They were decent for running VMs on, but for the price/performance now, not really worth it. I see other people explained why you see 4 NUMA nodes, so yeah. Of course AMD turned that all around with EPYC (and Ryzen), but that's a different story.
What driving game is that with the realistic background?
at 2:17 ? BeamNG Drive
In times where I can get an E5-2680 V4 (V4!) for 20 bucks or a little more, why even bother with five years older Opterons for the same price?
Well they make a great heater.
Don't need no girl, my Opterons keep me warm in the winter
FYI, you might want to check whether that core count is real. I'm too lazy to verify, but it might be shared components between pairs of cores, just like the Bulldozer garbage.
guess what? my R5 4600G gets 10k points in Cinebench, as does any similar Ryzen CPU
Lee-Burr-Boot... 😎😎
The only confusing part to me is why you're talking about a sponsored desk while showing computer b-roll lol...
Make it into a cool server!!
Athlons and Opterons were phenomenal chips when they came out. Intel was a huge mess with their NetBurst architecture, and AMD ran circles around them technologically. It was only illegal shenanigans from Intel, who refused to sell chips to vendors that sold AMD chips, that stopped the freight train dead in its tracks; a train that has gotten running again with Ryzen processors. Athlon/Opteron and Ryzen were both revolutionary designs from chip god Jim Keller. While Ryzen has absolutely dominated Intel, massive power draw and loads of low-power cores have kept Intel vaguely competitive. The same can't be said of the NetBurst era: at that time Athlons and Opterons were double the speed of Intel's equivalents, and for the first time CPUs were absolutely rock solid in reliability. They also brought in multicore and 64-bit processing. If it weren't for illegal tactics, Intel wouldn't have been able to sit on their arses for 15 years spending nothing on R&D, and would likely be dead and buried, and AMD wouldn't have had to sell off their fab plants to survive. It would have been a different world.
In the world of Super duper overrated fancy builds, this is what we actually need!
X79 was worth it in 2018-2019; now in 2024 I am retiring my Chinese X79 mobo for an X570 and a 5800X
You totally buried the lede. The Libreboot thing that you skipped over like it was nothing is the main attraction of using these old Opteron systems. The KGPE-D16 is the most powerful X86 system made that can run a totally open source BIOS, and most crucially _the most powerful X86 system that doesn't come with a spy processor built in._
5000 passmark for 115w power draw
i mean sure, if you're stuck in 2010, but these days just buy a crappy $100 mini PC with an N100, which sips 6W, has hardware video acceleration, makes no noise, and uses as much electricity as an LED light bulb
cons: wont heat your room up in winter
old hardware is no longer viable; it's not the upfront cost that makes it so, but the running costs, performance, and noise - which is why they're trying to give this stuff away for free
if you want good value, efficiency, and performance, just get a Ryzen processor, unrivalled
if you want to do something really cheap, then buy a broken Ryzen laptop and build a desktop/server out of it
What's avx12?
nah that aint ugly that looks industrial
I don't follow traditional RGB PC beauty standards
These old processors are pretty inefficient. Low cost, yes, but you'll need a huge power supply, and GFLOPS per watt are no comparison to modern processors.
opteron? more like... plopteron
flopteron
@@toseltreps1101 I think flopteron lands a bit better
NOPEteron
Nice vid mate