Side note: I re-played Far Cry a few months ago... it's even better than I remembered. The AI is tougher and smarter than in most modern games.
Hmm, I don't know what to say about the AI being good :)) Although to be honest, I did not pay a lot of attention to the enemies.
Far Cry is in a strange place now; in the "older games" community there are conflicting opinions about it.
@@MidnightGeek99 I find the AI is always trying to flank (could just be a byproduct of their tendency to scatter and roam when attacked, not necessarily strong programming) and has good accuracy reminiscent of Rainbow Six. I found Far Cry 2 repetitive and boring though, with dumbed-down AI. Didn't even bother finishing it.
@@Netbug Try it with the K9 Vision mod, the AI is even better. I remember the first time I played it, it blew me away. I'm walking along a path and all of a sudden I'm getting shot at. I look around and can't see anyone... another bullet whizzes by me, and I bust out the binoculars... all the way up a friggin mountain, there's an enemy hiding and taking sniper shots at me. I can't remember any other single player game where I was attacked from such a ridiculous distance.
Thanks for the video.
Core 2 Duo E6400 - my favourite :)
2012-2019: a huge number of games were completed on this processor.
Now I have a Core i7 3770, 4/8, 4.1 GHz on all cores (stock 3.7 GHz, a little OC with the Z77 chipset and a partially unlocked multiplier). And 16 GB of DDR3-2400 RAM, a GTX 960 4 GB video card, and SB Audigy 2 ZS sound.
That's my ultimate WinXP configuration.
I hope you'll understand my message, because I learned English in school and university so long ago. :)
Great video. Core2 was revolutionary performance at the time. I splurged on an E6700 as an upgrade from an Athlon 64 3400+ and it was night-and-day faster, felt like double the performance. Hugely helped by being super overclockable, pretty sure I got it to 4 GHz on a tower-style air cooler! Mind-blowingly fast at the time. Weird to think that was actually only 2 PCs back for me. 😮
Core2 Duo and 4 GHz is the best duo in computer history!
I remember trying to play Dungeon Siege III from 2011 on a Pentium D and it was a stuttery mess. Even retesting on a bottom-end 1.6 GHz E2140 dual core was a massive improvement in the stutters, and when I tried a Q6600 it really smoothed out and hit around 60 fps.
I've never played Dungeon Siege III, although I'm a big fan of 1 and 2. I'm going to do a Pentium D vs E2140 duel for sure.
Great video as always! ^_^
I think the Core 2 Quad series was released soon after those first Core 2 Duos. If memory serves me well, the "Kentsfield" C2Qs, like the Q6600, were essentially two Conroe chips slapped together. Could be wrong of course, the memory of those blessed years is getting dimmer with every passing day :)
Yes, they were released soon after, and yes, the C2Qs were two Core 2 Duos "slapped" together...but the performance was awesome!
@@MidnightGeek99 Indeed! I still got my old Q6600 boxed and stored somewhere... I guess it'll make a solid base for a nostalgic WinXP rig.
@@Hilislaw Yes, it's great for an XP build...you can easily OC it to 3 GHz also, without voltage changes.
Excellent video. I had a Core 2 Duo E6600, it was 100% stable at 3300 MHz at default voltage. With 1.5 V it was stable at over 3600 MHz. It was an excellent overclocker.
Thanks! Core 2s were legendary for overclocking. I haven't tried it just yet, but I must, especially since I have some good boards for it.
@@MidnightGeek99 I am actually looking to build a similar machine myself. what boards would you recommend on the budget side for an overclocking c2d setup? any input is appreciated. thanks!
@@tylerstarkey9141 Gigabyte P35 DS3 or P35 DS4 are very good motherboards, and you can usually find them for low prices.
MSI P35 Neo or Platinum are also good, but I don't know about their prices.
@@MidnightGeek99
The later E8000 Core 2 Duos are brilliant overclockers too.
With a pair of 1066 MHz 2x2 GB DDR2 sticks I can push the Core 2 Duo E8400 stable to 4 GHz with minimal voltage increases on the chipset & northbridge. I can look up the settings I'm using tomorrow, as it's still set up, just without a monitor & mouse.
You can also try BSEL tape mod overclocking to push the E6600, and likewise the Core 2 Quad Q6600, to 3 GHz without touching BIOS settings, if your motherboard can handle 1333 FSB processors fine.
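For reference, a minimal sketch of the clock math behind the BSEL mod described above: the mod re-straps the bus from 266 MHz (quad-pumped: 1066 FSB) to 333 MHz (1333 FSB) while the CPU multiplier stays locked, so the core clock rises proportionally. The function name is just for illustration:

```python
# Core clock = FSB (MHz) * multiplier. The BSEL tape mod changes only the
# FSB strap (266 -> 333 MHz); the locked multiplier stays the same.
def core_clock_ghz(fsb_mhz: int, multiplier: int) -> float:
    return fsb_mhz * multiplier / 1000.0

for name, mult in [("E6600", 9), ("Q6600", 9)]:
    print(f"{name}: {core_clock_ghz(266, mult):.1f} GHz -> "
          f"{core_clock_ghz(333, mult):.1f} GHz")
# E6600: 2.4 GHz -> 3.0 GHz
# Q6600: 2.4 GHz -> 3.0 GHz
```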
@@Mini-z1994 Thanks! I have some Corsair Dominators 1066, so OC is on the table with some of my Core 2s.
First I need to get an E8400, but well, they're pretty common :)
Again a good video, keep it up bro 👍
Thank you :)
I had a 6700 and then my mate got a Q6600 what seemed like five minutes later.... but it was a full year later lol. He had that CPU into 2011/12, when I was begging him to upgrade as he was always so slow in online games like Battlefield. Then he got his 2500K and he still used that CPU until I last spoke to him in 2018 lol.
Hey there, another entertaining video that has sparked some memories for me. I believe some of the results that you are seeing at the higher resolutions demonstrate the power of the video card. When the load increases on the GPU it effectively removes load from the CPU and you see those kinds of non-changes.
It's worth remembering that the Pentium D was inferior to the Athlon 64 X2. Although the Core 2 Duo was a dramatic difference compared to the Pentium D, it was not quite as dramatic vs the Athlon - the latter generally needed an extra 400 MHz to hold its own at the beginning.
It would be interesting to see as good a comparison as possible vs the Athlon 64 X2 if you could. The most common Core 2 Duo that people had at the time ran at 1.83 GHz, so I'd be very interested to see how that compares.
Thank you! I too think that the video card is way more powerful than the CPUs, and that's why we see some strange results. Tests with a weaker card are coming, maybe a 7900 GTX or something.
You are right, a comparison with the Athlon 64 X2 is a must, it will follow for sure :)
I had the E6400, at 2.13 GHz, but yes, the E6300 was very common, it was the favorite CPU of the system integrators: cheap, low TDP, fast.
@@MidnightGeek99 Try it with a GeForce 9500 GT 512 MB card with games from that era. Especially Crysis.
Great video! This really shows the innovation and quick changes in CPUs at the time. During this period I went from a 2.4 GHz Pentium 4, to an Athlon 64 3400+ on Socket 939 to take advantage of dual-channel DDR, then got an Intel E6400, then much later a Q6700 (which I still have with an ATI 5770). I never changed motherboards and CPUs so much within 3 years.
Also, SuperPi is still used, usually along with y-cruncher, for CPU testing during overclocking and burn-in. HandBrake is good for a workload test, and the "Corona" benchmark is great for 3D render testing if Cinebench is annoying, since Cinebench versions can't really be compared with each other and some are architecture specific.
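A minimal sketch of a HandBrake-style workload test along these lines, assuming HandBrakeCLI is installed and on the PATH; the file names and the preset are placeholders, not taken from the comment:

```python
# Hypothetical timing harness: measure how long a HandBrakeCLI encode takes.
import subprocess
import time

start = time.perf_counter()
subprocess.run(
    ["HandBrakeCLI", "-i", "input.mp4", "-o", "output.mp4",
     "--preset", "Fast 1080p30"],
    check=True,  # raise if the encode fails
)
print(f"Encode finished in {time.perf_counter() - start:.1f} s")
```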
Thank you. You had a nice upgrade path, I went from Athlon XP 1700+ to Core 2 Duo E6400, to i3 2100...
I will add Corona to my benchmarks, thanks! Actually, any suggestions for games and apps that I could test, would be great.
@@MidnightGeek99 a friend had an Athlon 2100+ on a KT333 Soyo Dragon Platinum motherboard (a beautiful motherboard if you ever see one) and an Nvidia 5600 Ultra - I think he got a Barton 3200+ and then an Athlon X2 at some point.
maybe these are useful ideas:
1. The Total War series (maybe Shogun?), because the later versions have PhysX enabled and many, many large-scale battles with detailed characters to stress the CPU and GPU.
2. Late-game saves in Civilization can really stress the CPU, because the complex multi-AI takes a long time to compute turns, and it is multi-core and multi-thread aware in Civilization 5 and 6, so it is a good stress test for dual-core or hyperthreading performance/efficiency.
3. Cloth simulation and PhysX in Mirror's Edge and Mafia 2 might be interesting as well.
4. The STALKER CoP or CS benchmarks are very useful DirectX 10/10.1 benchmarks, interesting to watch, and have many advanced lighting effects and other environmental details that stress GPU memory and bandwidth - similar to Far Cry in a way, but a newer API.
5. There are also the later versions of 3DMark for synthetics, plus the Unigine benchmarks and ones like Catzilla or GravityMark (which is interesting because it directly tests GPU fill rate and draw calls without stressing the CPU much).
Cooler reviews weren't too common back in the day - often people used the stock cooler or bought a "good aftermarket" cooler, but these varied in design and quality a lot, and older CPUs really put out the heat. It might be interesting to see cooler performance on hot CPUs like the Athlon XP and Pentium 4, and how efficient the stock cooling solutions are, especially since I know a lot of people that overclocked with the stock AMD 754 cooler. I felt they were pushing the limits and could have gone further with a nice Zalman or something, but $40 (at the time) for a CPU cooler was too much for them to buy.
@@chazbotic Thanks for the game suggestions, I'll check them out and see how I can fit them in.
You are right, cooler reviews should prove interesting, but right now I don't have that many coolers :D
My PC in the living room runs a Core 2 Duo E7500. Actually waiting on a delivery of a Q9550 to upgrade it, even though it’s fast as hell under Windows XP as it is.
I don't think you're going to see huge differences though :)
The first PC I ever built was a Core 2 Duo paired with an ATI Radeon 2600 XT, but I made the worst mistake while building: I grabbed a REALLY cheap ECS motherboard, and when the Windows 7 beta and full release came around, my USB 2.0 ports were ALL running at 1.0 speeds under Windows 7. A problem that was never solved by the mobo manufacturer or the chipset drivers released within the 5 years I owned it. While the hardware tech is so much better now, the AAA games are not. 😁
USB 1.0? I'm sorry for you :) AAA games are good for making content about them!
Man I laughed so hard around 1:40
It's funny because it's true :D
@@MidnightGeek99 The build up was perfect
I had bought a used Athlon X2 4200+ machine just before the Core 2 Duos came out. My X2 couldn't overclock very far. Also had SLI 7800 GTs with that CPU. Eventually replaced the two 7800 GTs with a GTX 260 and that was a nice upgrade. The GTX 260 got even better when swapping the CPU/mobo to an AMD Phenom II 940 BE 4-core CPU.
The X2 4200+ was holding back both the two 7800 GTs and the GTX 260.
Yes, upgrading to a Phenom II should have resulted in a big difference.
I had the 4200 as well as 4600 and 5000. The 940 BE was a huge leap from them and my first serious overclock. I then got the FX 8350. I stuck with AMD right through their inefficient stage... they kept my house warm in the winter.
@@Netbug The FX 8350 was not a bad CPU, not at all, even at that time. After a few years, once games and apps started to benefit from multiple cores, the FX became even better.
The FX series (8xxx of course) aged like fine wine.
@@MidnightGeek99 Honestly, the only reason I upgraded from FX 8350 to Ryzen is because my dad's mobo crapped out and I always give him my older systems. It just made more sense (I told my wife) for me to upgrade. I had to play catch-up and learn the new ecosystem and it took a few days for me to settle on a build. Of course this one is much better but mostly just for rendering and other tasks I usually just walk away from. As far as gaming goes, there isn't too much difference in most titles. Only CPU hogs like DCS benefitted greatly. FX 8350 with a good GPU is still more than capable for the vast majority of tasks for sure. I overclocked it to 4GHz and fed it 32 gigs of RAM... it was a great system. Still going strong in his house too.
Nice work! :)
Thanks :)
Finally, one of your comments got through the TH-cam filter gods :D
@@MidnightGeek99 Why do they do that? Such a horrible organisation, I bet they are hell to work for!
@@ted-b It's because of those cursed URLs :)
@@MidnightGeek99 Linking to a video on their own stupid platform! 🤣
@@ted-b Yeah, this is strange indeed.
I've added you to approved users, that should fix it.
Results are as expected, even without considering the temperature and power consumption.
NetBurst was a dead-end failure, but Intel doubled back and hammered AMD hard with the C2D and C2Q.
In hindsight it's amazing AMD survived - well, they survived the Bulldozer series too...
Yes, Intel had a very nice comeback. Also, the Athlon 64 X2s were...OK, even after Core 2.
@@MidnightGeek99
Thanks for the pin.
Yeah, the Athlon X2 was just OK, but trouble started when C2Q prices went down.
Comparing the Athlon X2 to a similar C2D would be fun, but a demanding job.
@@RaPtOr9600 Intel didn't ask for much at all compared to the Athlon X2 series. Before the C2D launch, the AMD FX-62 retailed at 999 USD. Someone could have a way more powerful E6600 at 1/3 of the price...
@RaPtOr9600 You bet I will compare the X2 and Core 2, I can't have it any other way.
@zhongyangli Yes, the Athlon X2s were expensive, maybe because AMD had the upper hand before Core 2, but the victory was short-lived :)
I'm currently using a Core 2 Quad Q6600 overclocked to 3.6 GHz with the stock cooler, and to think that the Q6600 is basically just two Core 2 Duo E6600s is amazing :>
The E6600 and Q6600 are OC champions!
I bet the Core 2 Quad scores slower though, as that is what I used to see in many games vs my mate. Games wouldn't use four cores until the mid 2010s, so they often favored clock speed, and mine was clocked faster..... apart from the Core i7, which owned both our CPUs.
The Wolfdale C2Ds, when I built a system with them, proved to be quite a bit better than the Pentium D desktops I had been forced to use at school. I was able to run my E8500 at 4.2 GHz with the stock cooler from a P4 630. And in the same PC I found that Windows ran far better on the Xeon X5450 than on the C2Q QX6850.
Yeah, they were quite the overclockers :) I have an E6750 and a Scythe Ninja 2, and I really want to know how far I can get, I want to build a system around it.
@@MidnightGeek99 Considering people had the E6600 doing 4GHz, and peaking at around 4.2 - should go pretty high being a higher clocked model from the get-go. Would need a fairly beefy cooler though.
@@dabombinablemi6188 I don't want very high frequencies, maybe around 3.2-3.4 GHz.
I have a Scythe Ninja 2, which cooled the Pentium D 940 without breaking a sweat :)
I can't believe I missed this video! It's right up my alley. Although not a true Pentium D, I had a really bad CPU back in the day: a 1.8 GHz Pentium Dual-Core E2160. I even bought it a year after it released. In 2009 I upgraded to the C2D E8600 and let's just say I was an idiot for not upgrading sooner. I went from 80 fps in CS 1.6 to being able to play other games entirely!!
Upgrading from E2160 to E8600...that's a nice upgrade.
Actually, the E2160 was based on the Conroe architecture, so it had nothing in common with the Pentium 4 or Pentium D :)
It definitely shows the beginning of the end of Moore's law. Just think: going from a Pentium D 945 in 2005 to a Core 2 Extreme QX9650 in 2007 - simply the absolute largest jump in computer history. I think we can thank Crysis for the improvements back then lol
I think that we may also thank AMD for kicking Intel's ass back then.
@@MidnightGeek99 that is very true. I remember going from a 3.06 GHz Celeron D 345 to a 2 GHz Sempron and was like wow, this Sempron will actually play my games without bottlenecking my GeForce 6200 lol
0:46 Yep, this was the exact moment I went from not knowing much about AMD to switching to them overnight.
We must give credit where credit is due :) The problem with the Athlon 64 X2 was the price...at least in my country, the CPUs were way more expensive than the Pentium D 8xx.
@@MidnightGeek99 I'm much more frugal now... back then I was at the very least upgrading every 6 months if not building a whole new system annually. Now I try to get at least 4 years out of one. As I got older and parts started piling up I realized what a waste of money it was, all just to satisfy some weird obsession with eliminating every possible stutter from games. Now I just play games a few years after they've released and it's a much better experience; bugs are ironed out, DLC is all released, prices are low, and my hardware always exceeds the requirements. Good times.
@@Netbug I feel you, I do the same...although I don't really play modern games, with very few exceptions.
I haven't tried with the dual core models, but my 2 GHz Celeron 440 completely destroys my 3 GHz Pentium 4 HT 631.
Oh, but the Pentium can be overclocked!
Sure, it can do 4 GHz. But the Celeron can do 3.5 GHz, and it would only be fair to test both with OC. And at that point the clock benefit of the Pentium 4 only gets smaller (500 MHz or 14% instead of 1000 MHz or 50%). There was exactly one test where the Pentium 4 was faster: 3DMark Vantage. In everything else the Celeron completely destroyed it. My guess is it's from the extra thread. And your 3DMark03 results might be similar.
The scaling with dual cores should be very similar: a fast-clocking Pentium 4 with HT against a much slower-clocking Conroe without HT. But the Conroe can be clocked to almost the same speed while keeping all the IPC improvements.
And that IPC improvement indeed shows across the board, and Penryn added another ~6% on top of Conroe. The benchmarks demonstrate it pretty well - sometimes less, sometimes more, but in that ballpark.
Is there a Conroe Celeron? lol...I knew only about the Pentiums.
3:00 "this means that core2 is 50% faster?"
It was actually over 50% faster, not 35% faster.
1/5 = 0.2
1/7.8 = 0.128
0.2 - 0.128 = 0.072
100 (0.072/0.128) = 56.25% faster.
If CPU [a] completed the same work in half the time as CPU [b], [a] would be 100% faster than [b] because it's work-rate (speed) is double that of [b].
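A minimal sketch of that calculation, using the 5-minute and 7.8-minute completion times from the example above (the function name is just for illustration):

```python
def percent_faster(time_a: float, time_b: float) -> float:
    """How much faster is A than B, given times to finish the same work."""
    speed_a = 1.0 / time_a  # work per minute
    speed_b = 1.0 / time_b
    return 100.0 * (speed_a - speed_b) / speed_b

# Core 2 finishes in 5 minutes, Pentium D in 7.8 minutes:
print(f"{percent_faster(5.0, 7.8):.1f}% faster")  # ~56.0% faster
```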
Believe it or not, I've spent one hour today on Stack Overflow, reading about this subject...I've decided that for future videos I won't be using percentages in such cases, it becomes confusing :)
@@MidnightGeek99 So long as you're aware of which side of the equation you're measuring, percentages are fine.
Time = work/speed (multiply both sides by speed)
Time × speed = work (divide both sides by time)
Speed = work/time
Assuming that [work] is a constant (as is the case in render benchmarks) you can just substitute [work] with 1.
When you're measuring framerate you're measuring [speed] directly. When you're measuring render times you're measuring [work] divided by [speed].
@@phillycheesetake Yes, and because using percentages when talking about speed, as in time / duration is confusing, not in small part due to how we use language, I won't be using % in the future.
Thanks for the feedback, I appreciate it!
@@MidnightGeek99 I wouldn't mind if you keep trying to add percentages, since they are useful. I'm sure you can get it right. Something that might make it easier to think about: take the 22 seconds for the C2D and 33 seconds for the Pentium D result. You wrote "C2D is 33% faster", but it should be "C2D takes 33% less time than the Pentium D". This drops the "faster/slower" wording and makes it simpler.
Another way of thinking about this is if I take 1 minute to do something, but it takes you 2 minutes to do the same thing, I am 100% faster, which means it takes me 50% less time to do it.
However, if I take 0.4 minutes, it takes me 80% less time to do it (1-(0.4/2)=0.80),
and I am 400% faster (2/0.4=5 and 5-1=4); we subtract one because we want the "how much faster" value.
Now, comparing your speed to mine, you would be "taking 5 times as long" (2/0.4) or "4 times longer" (2/0.4=5, 5-1=4) compared to me. Both of these are valid and distinct ways of describing it, but the latter will probably confuse people as they might just interpret it as 4 times "as" long, so it's probably better to show it as +400% longer or 5x as long.
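A minimal sketch that prints the two phrasings side by side for the examples above (the function name is illustrative):

```python
def compare(time_fast: float, time_slow: float) -> None:
    """Print the two equivalent but distinct phrasings discussed above."""
    less_time = 100.0 * (1.0 - time_fast / time_slow)  # "% less time"
    faster = 100.0 * (time_slow / time_fast - 1.0)     # "% faster"
    print(f"{less_time:.1f}% less time == {faster:.1f}% faster")

compare(22.0, 33.0)  # 33.3% less time == 50.0% faster
compare(0.4, 2.0)    # 80.0% less time == 400.0% faster
```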
@@kunka592 That's right, the wording makes all the difference, and "C2D takes 33% less time" is way better than what I've used, and maybe the wording I should use in the future.
Thanks :)
A very interesting comparison. What strikes me most is that the Pentium D is far from 100% loaded in Far Cry. So the bottleneck can't be the CPU speed itself, and it can't be anything else on the board either, as you've used the same HW with the Core 2. It has to be some bottleneck in the Pentium D architecture that prevents the cores from getting fully utilized in this game. Perhaps it is indeed the FSB bandwidth limitation.
Hmm, that's a very interesting observation, it could be that, but it could also be the fact that Far Cry is not dual core optimized.
Have a look here, and see that the performance difference between single core and dual core is not that big: th-cam.com/video/NFAF_f9HQXM/w-d-xo.html
Also, I'm not so confident that Core 1 and Core 2 loads are shown properly in MSI Afterburner :)
@@MidnightGeek99 Yes, as you say, little difference is actually expected in poorly multicore-optimized games with more cores. With such a game, I would expect the CPU load of a single core to reach near 100%, no matter how many cores are available. And strangely enough, this doesn't seem to happen in Far Cry with the Pentium D. Not even close. So either there is a HW bottleneck we don't know about yet (maybe inside the CPU) or the software is not executing code and is waiting for something (entirely possible, but I would find that strange). I really would like to know if Far Cry can use a single core near 100% with a better architecture like a Core 2 or Athlon X2.
@@Shmbler You are right, it will be interesting to see if it maxes out with just one core, and I can test this very easily, by removing one of the cores from the Pentium D.
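Besides disabling a core in the BIOS or with msconfig, a hypothetical sketch of the same single-core experiment using CPU affinity; the process name "FarCry.exe" is an assumption for illustration, not taken from the video:

```python
# Hypothetical sketch: pin the running game to a single core with psutil
# instead of disabling the second core at boot time.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "FarCry.exe":  # assumed process name
        proc.cpu_affinity([0])  # restrict the process to core 0 only
        print(f"Pinned PID {proc.pid} to core 0")
```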
Maximum Romanian accent! :))))))))))))))))))))))))))
Yes, I appreciate that :))
According to Intel's plans, the Pentium D was to be replaced by the Pentium Dual-Core. Maybe you could add it to your list.
Great comparison.
Thank you! Yes, I will surely do a comparison between the Pentium D and the Pentium Dual Core, I have an E2140, half the frequency of the Pentium D 940 :D
Funny how Intel chose the "Pentium" name for their lower-end SKUs when the Core architecture was released, even though the Pentiums had been high-end CPUs for a long time. Anyway, I think the Pentium Dual-Cores were just smaller-cache, lower-cost variants of the Core 2 Duos, so clock for clock similar performance to Conroe, which still means near double the IPC of NetBurst/Pentium D.
@@kunka592 A Pentium Dual-Core should beat a Pentium D without problems; I don't know about one with double the frequency, but we will see.
It is more appropriate to compare the Pentium D to the Athlon 64 X2. They are somewhat comparable in performance. The Core 2 Duo family is better in every way...
Of course, Athlon 64 X2 and Core 2 were competitors for a long time.
@@MidnightGeek99 I have assembled 3 retro computers. The first is based on the AMD Athlon 64 X2, from the 4000+ all the way up to the 6000+; the performance is very solid.
The second is a Pentium D machine with a Pentium D 960 running at 3.6 GHz with 2 MB of L2 cache per core. The performance is higher than the Athlon 64 X2 4000+, but the difference is quite small, considering the horrifying extra power draw.
The last one is based on a Core 2 Duo E6600. That one is a monster. I put a GeForce 8800 GTX in it, which is another legend of its age.
I have a Pentium D Dell E510 machine...it's not great.
For games up to 2004-2005 it should be OK :)
Obsolete 👍